Robots + bot policy
Robots.txt disallow rules, or bot-specific policies, can block GPTBot / ClaudeBot. Fix by allowing the relevant user-agents where appropriate.
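A minimal robots.txt sketch that allows these crawlers while keeping other rules intact (the paths shown are placeholders, not a recommendation for any specific site):

```
# Allow AI crawlers explicitly; user-agent tokens per each vendor's docs.
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Rules for all other bots remain whatever your site already needs.
User-agent: *
Disallow: /private/
```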
Crawl access issues are usually policy/config problems, not content problems.
Challenge pages, geo blocks, rate limits, or unusual 403/429 patterns can also block crawlers. Fix by allowlisting the relevant user-agents, caching, or serving a stable HTML response.
Start by opening the canonical report and checking the blocked flag + reason. Then verify with a direct fetch using the bot's user-agent if needed.
Use /radar/domains/DOMAIN?format=json for the machine-readable blocked reason and for automation gates.
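A small sketch of an automation gate over that JSON. The field names (`blocked`, `blocked_reason`) are assumptions about the response shape, not a documented schema; adjust to whatever the report actually returns:

```python
import json

# Hypothetical payload from /radar/domains/DOMAIN?format=json.
# The "blocked" / "blocked_reason" keys are assumed, not documented.
sample = '{"domain": "example.com", "blocked": true, "blocked_reason": "robots_disallow"}'

def is_crawl_blocked(payload: str):
    """Parse the machine-readable report; return (blocked, reason)."""
    data = json.loads(payload)
    return bool(data.get("blocked", False)), data.get("blocked_reason")

blocked, reason = is_crawl_blocked(sample)
if blocked:
    # Fail the CI step / alert here so regressions are caught automatically.
    print(f"Crawl blocked: {reason}")
```

A check like this can run on a schedule so the monitoring step below catches accidental re-blocking without manual review.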
After you fix access, use monitoring to prevent regressions (accidental re-blocking after security changes).