
AI Bot Access

Checks robots.txt rules for AI crawlers (GPTBot, ClaudeBot, PerplexityBot, Google-Extended). Blocked bots can't index or cite your content.

Why It Matters for AI Visibility

AI-powered search engines can only cite your content if their crawlers can reach it. ChatGPT relies on GPTBot for crawling and on ChatGPT-User to fetch pages on demand before generating answers. Perplexity sends PerplexityBot to index the sources it references in responses. Google AI Overviews rely on Googlebot and Google-Extended to gather content for synthesized answers.

When your robots.txt blocks these bots, you become invisible to them, and competitors who allow access get cited instead, even if your content is more accurate and comprehensive. This is not a theoretical risk -- many sites adopted blanket AI bot blocks in 2023-2024 without realizing they were opting out of the fastest-growing traffic channel in search.

The impact is binary and immediate. A blocked bot cannot crawl your page, cannot index your content, and cannot cite you in AI-generated responses. Unlike traditional SEO, where blocked pages may still surface through backlinks and cached copies, AI engines respect robots.txt strictly: if you block GPTBot, ChatGPT will never reference your site in its answers.

How We Score It

Your score starts at 10 and drops based on which bots you block. We check 9 specific bots split into two tiers. Five are critical -- GPTBot, ChatGPT-User, Google-Extended, Googlebot, and Bingbot -- and blocking any of these costs 2 points each. Four are non-critical -- anthropic-ai, ClaudeBot, PerplexityBot, and Bytespider -- costing 1 point each if blocked. If your site has no robots.txt file at all, you score a perfect 10, since all bots are allowed by default. A score of 7 or higher passes. Scores of 4 to 6 indicate partial access with some important bots blocked. Scores of 0 to 3 mean most major AI crawlers are shut out.
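The scoring rules above can be sketched as a small function. This is an illustrative implementation of the deductions as described, not the analyzer's actual code; the bot names and the two-tier split come directly from the text.

```python
# Tiers as described: critical bots cost 2 points when blocked, non-critical cost 1.
CRITICAL = {"GPTBot", "ChatGPT-User", "Google-Extended", "Googlebot", "Bingbot"}
NON_CRITICAL = {"anthropic-ai", "ClaudeBot", "PerplexityBot", "Bytespider"}

def ai_access_score(blocked_bots):
    """Start at 10 and deduct per blocked bot. A missing robots.txt
    blocks nothing, so an empty set scores a perfect 10."""
    score = 10
    for bot in blocked_bots:
        if bot in CRITICAL:
            score -= 2
        elif bot in NON_CRITICAL:
            score -= 1
    return max(score, 0)  # floor at 0 when most crawlers are shut out
```

For example, `ai_access_score({"GPTBot"})` returns 8, and blocking all nine bots returns 0.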

How to Improve

  • 1

    Audit your robots.txt for AI-specific blocks

    Open your robots.txt file (yourdomain.com/robots.txt) and search for User-agent entries targeting GPTBot, ChatGPT-User, Google-Extended, ClaudeBot, PerplexityBot, anthropic-ai, or Bytespider. Remove any Disallow rules under these agents. A single `Disallow: /` line under GPTBot costs you 2 points instantly.

  • 2

    Remove wildcard blocks that catch AI bots

    A `User-agent: *` with `Disallow: /` blocks every bot, including all nine AI crawlers checked by the analyzer. If you use wildcard blocks, add explicit Allow rules for each AI bot, or replace the wildcard with targeted blocks only for bots you genuinely want to restrict.

  • 3

    Allow critical bots first if you must be selective

    If you are uncomfortable allowing all AI crawlers, prioritize the five critical bots: GPTBot, ChatGPT-User, Google-Extended, Googlebot, and Bingbot. With these five allowed, each blocked non-critical bot costs only 1 point, so blocking one or two still leaves you at a passing 8 or 9 (blocking all four drops you to 6, below the passing threshold of 7).

  • 4

    Test your robots.txt after changes

    Use Google's robots.txt tester or simply fetch yourdomain.com/robots.txt in a browser to verify your changes are live. Robots.txt is cached by crawlers, so changes may take hours to propagate. Confirm that each AI bot's User-agent section either does not exist or has `Allow: /`.
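The audit and verification steps above can be automated with Python's standard-library `urllib.robotparser`. The sketch below parses a robots.txt body and reports access for the nine crawlers checked by the analyzer; `example.com` is a placeholder, not a required value.

```python
from urllib import robotparser

# The nine AI crawlers checked by the analyzer.
AI_BOTS = [
    "GPTBot", "ChatGPT-User", "Google-Extended", "Googlebot", "Bingbot",
    "anthropic-ai", "ClaudeBot", "PerplexityBot", "Bytespider",
]

def check_bot_access(robots_txt, site_url="https://example.com/"):
    """Parse a robots.txt body and report whether each AI bot may
    fetch the site root (True = allowed)."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, site_url) for bot in AI_BOTS}
```

To audit a live site, call `parser.set_url("https://yourdomain.com/robots.txt")` followed by `parser.read()` instead of `parse()`. Keep in mind that crawlers cache robots.txt, so allow time for changes to take effect.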

Before & After

Before
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: ClaudeBot
Disallow: /

After
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: ClaudeBot
Allow: /

Code Examples

Robots.txt that allows all AI bots (recommended)

User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Bytespider
Allow: /

Selective access (allow all critical bots, block only Bytespider)

User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: Bytespider
Disallow: /

Frequently Asked Questions

What happens if I don't have a robots.txt file at all?

You get a perfect 10. No robots.txt means all bots are allowed by default. That said, consider adding one with explicit Allow rules for documentation purposes and to prevent future CMS updates or plugins from adding unintended blocks.

Which AI bots are most important to allow?

GPTBot, ChatGPT-User, Google-Extended, Googlebot, and Bingbot are the five critical bots. Each one costs 2 points if blocked, compared to 1 point for non-critical bots. These five cover ChatGPT, Google AI Overviews, and Bing Copilot -- the three largest AI answer engines.

Can I block some bots and still get a good score?

Yes. Blocking one or two non-critical bots (like Bytespider) costs only 1 point each, keeping you at 8 or 9. Blocking a single critical bot such as GPTBot also drops you to 8, and blocking two critical bots puts you at 6, which falls into the partial-access range below the passing threshold of 7.

Check Your GEO Score

Run a free analysis on your website and see how you score across all 52 factors.

Analyze My Site