Fix: Blocked AI Crawlers

How to unblock GPTBot, ClaudeBot, CCBot, and Google-Extended so your content appears in AI answers.

What this means

Your robots.txt or WAF is blocking one or more AI crawlers from accessing your content. This means your pages won't appear in ChatGPT, Claude, Perplexity, or AI-powered search results. If both GPTBot and ClaudeBot are blocked, your AI Readiness Score is capped at 15 regardless of other factors.

This is the #1 AI visibility issue. Fix this before optimising anything else — no amount of good schema, structure, or rendering matters if crawlers can't access your pages.

Why it matters for AI visibility

All major AI crawlers respect robots.txt. Unlike with rogue scrapers, blocking GPTBot, ClaudeBot, or CCBot actually works — but it also means your content is excluded from AI training data, retrieval indexes, and real-time AI browsing. Once blocked, your content effectively doesn't exist for AI systems.

How SEODiff detects it

SEODiff checks your robots.txt for Disallow directives targeting 5 bot user-agents. The Crawler Health tool (Pro) also performs live HTTP requests with each bot's User-Agent to detect WAF/CAPTCHA challenges that block bots even without robots.txt rules.

Common causes

  - A robots.txt copied from a template that blocks all non-Google crawlers
  - A WAF or bot-management service (e.g. Cloudflare's Bot Fight Mode) challenging unknown crawlers by default
  - CMS or SEO plugin settings that add Disallow rules or discourage indexing

How to fix it

Step 1: Check your robots.txt

Visit https://yourdomain.com/robots.txt and look for rules targeting AI bots. Remove or modify any Disallow rules for these user-agents:

# Allow all AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: CCBot
Allow: /

User-agent: Google-Extended
Allow: /
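
After updating robots.txt, you can sanity-check the rules locally before deploying. A minimal sketch using Python's standard-library urllib.robotparser, with the allow-all snippet from Step 1 pasted in as a string:

```python
from urllib.robotparser import RobotFileParser

# The allow-all rules from Step 1, exactly as they would appear in robots.txt.
ROBOTS_TXT = """\
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: CCBot
Allow: /

User-agent: Google-Extended
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# can_fetch() applies the same user-agent matching rules crawlers use.
for bot in ["GPTBot", "ClaudeBot", "CCBot", "Google-Extended"]:
    print(bot, parser.can_fetch(bot, "/"))
```

Swap in your site's real robots.txt content to confirm none of the bots come back blocked before you push the change live.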

Step 2: Check WAF/bot management

If your robots.txt is clean but SEODiff still reports blocks, your WAF may be serving challenge pages instead of your content. Check your CDN or WAF dashboard for bot-management rules (Cloudflare's Bot Fight Mode, for example, challenges unfamiliar crawlers by default), then review the platform-specific notes below.
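
You can reproduce the check yourself by requesting a page with a bot's User-Agent and looking for challenge responses. A minimal sketch with Python's urllib (the User-Agent string and the `looks_like_waf_block` heuristic are illustrative assumptions — check each vendor's documentation for the current UA string, and note that header names like `cf-mitigated` are Cloudflare-specific):

```python
from urllib import request
from urllib.error import HTTPError

# Illustrative bot User-Agent (assumption: verify the current string
# in the crawler vendor's documentation before relying on it).
GPTBOT_UA = "Mozilla/5.0; compatible; GPTBot/1.0; +https://openai.com/gptbot"

def looks_like_waf_block(status: int, headers: dict) -> bool:
    """Heuristic: flag responses that suggest a bot-management challenge."""
    # 403/429/503 on pages that render fine in a browser usually mean
    # the WAF singled out this User-Agent.
    if status in (403, 429, 503):
        return True
    # Cloudflare marks challenge responses with a cf-mitigated header
    # (assumption: other vendors use different markers).
    lowered = {k.lower(): v for k, v in headers.items()}
    return lowered.get("cf-mitigated", "").lower() == "challenge"

def check(url: str, user_agent: str):
    """Fetch url with a bot User-Agent; return (status, headers)."""
    req = request.Request(url, headers={"User-Agent": user_agent})
    try:
        with request.urlopen(req, timeout=10) as resp:
            return resp.status, dict(resp.headers)
    except HTTPError as err:
        return err.code, dict(err.headers)

# Usage (live request, so run against your own domain):
# status, headers = check("https://yourdomain.com/", GPTBOT_UA)
# print(status, looks_like_waf_block(status, headers))
```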

WordPress

Some SEO plugins (Yoast, Rank Math) can modify robots.txt. Confirm that Settings → Reading → "Discourage search engines from indexing this site" is unchecked, then review your SEO plugin's robots.txt settings.

Shopify

Shopify manages robots.txt automatically. If you've customised it via the robots.txt.liquid template, ensure you haven't added AI bot blocks.

Next.js / Vercel

Check your public/robots.txt file or your app/robots.ts configuration (which Next.js serves as /robots.txt). Ensure no AI bots appear in its disallow rules.

How to validate the fix

  1. Run the Crawler Health tool to verify access for all 5 bots.
  2. Re-run your AI Readiness Scan — the Bot Access Score should improve immediately.
  3. The AI Readiness Score cap should be lifted (no longer stuck at 10 or 15).
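
Alongside the Crawler Health tool, you can spot-check any robots.txt for blocked bots with a small helper. A sketch using Python's urllib.robotparser and the bot names listed in this guide (fetch your live robots.txt separately and pass its text in):

```python
from urllib.robotparser import RobotFileParser

# Bot names from this guide; extend as needed.
AI_BOTS = ["GPTBot", "ClaudeBot", "CCBot", "Google-Extended"]

def blocked_bots(robots_txt: str, path: str = "/") -> list[str]:
    """Return the AI bots that the given robots.txt blocks for `path`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in AI_BOTS if not parser.can_fetch(bot, path)]

# Example: a robots.txt that still blocks two of the bots.
sample = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
"""
print(blocked_bots(sample))  # → ['GPTBot', 'CCBot']
```

An empty result means every listed bot can reach the path, and the Bot Access Score should reflect that on the next scan.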