AI Bot Access
Checks robots.txt rules for AI crawlers (GPTBot, ClaudeBot, PerplexityBot, Google-Extended). Blocked bots can't index or cite your content.
How to Improve
1. **Audit your robots.txt for AI-specific blocks.** Open your robots.txt file (yourdomain.com/robots.txt) and search for User-agent entries targeting GPTBot, ChatGPT-User, Google-Extended, ClaudeBot, PerplexityBot, anthropic-ai, or Bytespider. Remove any Disallow rules under these agents. A single `Disallow: /` line under GPTBot costs you 2 points instantly.
2. **Remove wildcard blocks that catch AI bots.** A `User-agent: *` group with `Disallow: /` blocks every bot, including all nine AI crawlers checked by the analyzer. If you use wildcard blocks, add explicit Allow rules for each AI bot, or replace the wildcard with targeted blocks only for bots you genuinely want to restrict.
3. **Allow critical bots first if you must be selective.** If you are uncomfortable allowing all AI crawlers, prioritize the five critical bots: GPTBot, ChatGPT-User, Google-Extended, Googlebot, and Bingbot. Allowing these five keeps your score at 8 or higher even if you block all non-critical bots.
4. **Test your robots.txt after changes.** Use Google's robots.txt tester or simply fetch yourdomain.com/robots.txt in a browser to verify your changes are live. Robots.txt is cached by crawlers, so changes may take hours to propagate. Confirm that each AI bot's User-agent section either does not exist or has `Allow: /`.
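The audit steps above can be sketched with Python's standard-library `urllib.robotparser`. The nine bot names mirror the crawlers this guide checks; the sample rules and the `check_ai_access` helper are illustrative, not the analyzer's actual code.

```python
# Sketch: check which AI crawlers a robots.txt allows, using the
# standard-library parser. Parses a string locally rather than fetching,
# so you can test rules before deploying them.
from urllib.robotparser import RobotFileParser

AI_BOTS = [
    "GPTBot", "ChatGPT-User", "Google-Extended", "Googlebot", "Bingbot",
    "anthropic-ai", "ClaudeBot", "PerplexityBot", "Bytespider",
]

# Illustrative rules: GPTBot blocked, everything else allowed.
SAMPLE_ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

def check_ai_access(robots_txt: str, url: str = "https://example.com/") -> dict:
    """Return {bot_name: allowed?} for each AI bot against the given rules."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, url) for bot in AI_BOTS}

if __name__ == "__main__":
    for bot, allowed in check_ai_access(SAMPLE_ROBOTS_TXT).items():
        print(f"{bot}: {'allowed' if allowed else 'BLOCKED'}")
```

Swap `SAMPLE_ROBOTS_TXT` for the live contents of yourdomain.com/robots.txt to verify a deployed file the same way.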
Before & After
Before (major AI bots blocked):

```
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: ClaudeBot
Disallow: /
```
After (all AI bots allowed):

```
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: ClaudeBot
Allow: /
```
Code Examples
Robots.txt that allows all AI bots (recommended)
```
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Bytespider
Allow: /
```

Selective access (allow critical bots, block others)
```
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: Bytespider
Disallow: /
```

Frequently Asked Questions
What happens if I don't have a robots.txt file at all?
You get a perfect 10. No robots.txt means all bots are allowed by default. That said, consider adding one with explicit Allow rules for documentation purposes and to prevent future CMS updates or plugins from adding unintended blocks.
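Python's standard-library `urllib.robotparser` illustrates this default behavior: when there are no rules to parse, every crawler is allowed.

```python
# With an empty robots.txt (or none at all), crawlers default to full access.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.parse([])  # empty file: no User-agent groups, no rules
print(parser.can_fetch("GPTBot", "https://example.com/any-page"))  # prints True
```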
Which AI bots are most important to allow?
GPTBot, ChatGPT-User, Google-Extended, Googlebot, and Bingbot are the five critical bots. Each one costs 2 points if blocked, compared to 1 point for non-critical bots. These five cover ChatGPT, Google AI Overviews, and Bing Copilot, the three largest AI answer engines.
Can I block some bots and still get a good score?
Yes. Blocking one or two non-critical bots (like Bytespider) only costs 1 point each, keeping you at 8 or 9. But blocking even one critical bot like GPTBot drops you to an 8, and blocking two critical bots puts you at 6, which is a partial score.
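The point values quoted in these answers imply a simple scoring function. The sketch below is a hypothetical reconstruction from the numbers stated here (a 10-point scale, minus 2 per blocked critical bot, minus 1 per blocked non-critical bot), not the analyzer's actual implementation.

```python
# Hypothetical scoring sketch based on the point values described above.
CRITICAL_BOTS = {"GPTBot", "ChatGPT-User", "Google-Extended", "Googlebot", "Bingbot"}
NON_CRITICAL_BOTS = {"anthropic-ai", "ClaudeBot", "PerplexityBot", "Bytespider"}

def geo_bot_score(blocked: set) -> int:
    """Start at 10; subtract 2 per blocked critical bot, 1 per non-critical."""
    score = 10
    score -= 2 * len(blocked & CRITICAL_BOTS)
    score -= 1 * len(blocked & NON_CRITICAL_BOTS)
    return max(score, 0)

print(geo_bot_score(set()))                        # prints 10
print(geo_bot_score({"Bytespider"}))               # prints 9
print(geo_bot_score({"GPTBot", "ChatGPT-User"}))   # prints 6
```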