Your robots.txt or WAF is blocking one or more AI crawlers from accessing your content. This means your pages won't appear in ChatGPT, Claude, Perplexity, or AI-powered search results. If both GPTBot and ClaudeBot are blocked, your AI Readiness Score is capped at 15 regardless of other factors.
This is the #1 AI visibility issue. Fix this before optimising anything else — no amount of good schema, structure, or rendering matters if crawlers can't access your pages.
All major AI crawlers respect robots.txt. Unlike with rogue scrapers, blocking GPTBot, ClaudeBot, or CCBot actually works, but it also means your content is excluded from AI training data, retrieval indexes, and real-time AI browsing. Once blocked, your content effectively doesn't exist for AI systems.
SEODiff checks your robots.txt for Disallow directives targeting 5 bot user-agents. The Crawler Health tool (Pro) also performs live HTTP requests with each bot's User-Agent to detect WAF/CAPTCHA challenges that block bots even without robots.txt rules.
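You can run a similar robots.txt check locally with Python's standard `urllib.robotparser`. This is an illustrative sketch, not SEODiff's implementation: the sample robots.txt is made up, and the bot list covers only the four user-agents named in this guide.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical sample; in practice, fetch https://yourdomain.com/robots.txt
SAMPLE = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

# The four AI crawler user-agents named in this guide
AI_BOTS = ["GPTBot", "ClaudeBot", "CCBot", "Google-Extended"]

def blocked_bots(robots_txt: str, path: str = "/") -> list[str]:
    """Return the AI bots that the given robots.txt disallows from `path`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in AI_BOTS if not parser.can_fetch(bot, path)]

print(blocked_bots(SAMPLE))  # ['GPTBot']
```

Note that this only inspects robots.txt rules; it cannot detect WAF or CAPTCHA blocks, which require live requests.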
Common blocking patterns include:

- `User-agent: GPTBot` / `Disallow: /` rules (often added by hosting platforms or CMS defaults)
- `User-agent: *` / `Disallow: /` rules that catch AI bots along with everything else

Visit https://yourdomain.com/robots.txt and look for rules targeting AI bots. Remove or modify any Disallow rules for these user-agents:
```
# Allow all AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: CCBot
Allow: /

User-agent: Google-Extended
Allow: /
```
If your robots.txt is clean but SEODiff still reports blocks, your WAF or CDN may be serving challenge pages to bot user-agents. Check your firewall and bot-management settings for rules that challenge or block these crawlers.
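One way to spot a WAF challenge yourself is to request a page with a bot User-Agent and inspect the HTTP status code. A minimal sketch using Python's `urllib`; the domain is a placeholder, the User-Agent string is abbreviated (real crawlers send a longer string), and treating 403/503 as "blocked" is a heuristic, not a guarantee:

```python
import urllib.request
from urllib.error import HTTPError

def fetch_status(url: str, user_agent: str) -> int:
    """Request a URL with the given User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        return err.code

def looks_blocked(status: int) -> bool:
    # 403 and 503 are the typical WAF/CAPTCHA challenge responses (heuristic)
    return status in (403, 503)

# Usage (requires network; domain and simplified UA string are placeholders):
# status = fetch_status("https://yourdomain.com/", "GPTBot/1.1")
# print("blocked" if looks_blocked(status) else "allowed")
```

Comparing the status against a request made with a normal browser User-Agent helps confirm the block is bot-specific rather than a site-wide outage.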
Some SEO plugins (Yoast, Rank Math) can modify robots.txt. Check that Settings → Reading → "Discourage search engines" is unchecked, then review your SEO plugin's robots.txt settings.
Shopify manages robots.txt automatically. If you've customised it via the robots.txt.liquid template, ensure you haven't added AI bot blocks.
Check your public/robots.txt or Next.js app/robots.ts configuration. Ensure AI bots are not in the Disallow list.