Detect Accidental Noindex Tags
Ensure pages are indexable and don't have stray noindex directives.
The Problem
A single <meta name="robots" content="noindex"> tag silently removes a page from Google's index. For pSEO sites, a template-level noindex (left over from staging, set by a CMS default, or introduced by a mismerged branch) can deindex thousands of pages within hours. This assertion catches it before deployment.
The Hard Way
View the page source and search for "noindex". Then check the HTTP response headers, since a noindex directive can also arrive as an X-Robots-Tag header. For pSEO at scale, you need a crawler that checks both locations across every page.
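The manual check can be sketched in Python. This is an illustrative DIY version, not SEODiff's implementation; the function name and sample markup are made up for the example, and the regex deliberately only handles the common attribute order:

```python
import re

def is_noindexed(html: str, headers: dict) -> bool:
    """Return True if a page is blocked from indexing by either of the
    two locations a noindex directive can hide in."""
    # 1. Robots meta tag: <meta name="robots" content="... noindex ...">
    #    (assumes name= precedes content=, the common case)
    contents = re.findall(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
        html,
        re.IGNORECASE,
    )
    if any("noindex" in c.lower() for c in contents):
        return True
    # 2. X-Robots-Tag response header (header names are case-insensitive)
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    return False

# A page with a stray template-level noindex:
page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(is_noindexed(page, {}))                                      # True
print(is_noindexed("<html></html>", {"X-Robots-Tag": "noindex"}))  # True
print(is_noindexed("<html></html>", {}))                           # False
```

Even this toy version has to inspect two places per page, which is why doing it across thousands of generated URLs calls for a crawler rather than manual spot checks.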
The SEODiff Way
One API call. Results in under 2 seconds.
POST https://seodiff.io/api/v1/agent/evaluate
{"urls": ["https://example.com/product/widget-a"], "assertions": [{"rule": "no_noindex"}]}
Code Examples
Copy-paste examples in your preferred language:
cURL
See the full evaluation example in cURL →
Python
See the full evaluation example in Python →
Node.js
See the full evaluation example in Node.js →
Go
See the full evaluation example in Go →
PHP
See the full evaluation example in PHP →
Related Assertions
status_code
Ensure pages return the expected HTTP status code.
has_h1
Ensure every page has exactly one H1 heading tag.
has_meta_description
Ensure every page has a non-empty meta description tag.
Use in CI/CD
Add this assertion to your deployment pipeline. Works with any CI platform:
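As an illustration, a GitHub Actions step calling the evaluate endpoint might look like the sketch below. The secret name, Authorization header format, and job shape are assumptions for the example, not the official integration; curl's -f flag makes the step fail (and the deployment block) on a non-2xx response:

```yaml
# .github/workflows/seo-check.yml (illustrative sketch)
name: SEO checks
on: [deployment_status]
jobs:
  seo:
    runs-on: ubuntu-latest
    steps:
      - name: Assert no stray noindex tags
        run: |
          curl -sf -X POST https://seodiff.io/api/v1/agent/evaluate \
            -H "Authorization: Bearer ${{ secrets.SEODIFF_API_KEY }}" \
            -H "Content-Type: application/json" \
            -d '{"urls": ["https://example.com/product/widget-a"], "assertions": [{"rule": "no_noindex"}]}'
```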
🐙 GitHub Actions
Block bad deployments with automated SEO checks in your GitHub Actions CI/CD pipeline.
🦊 GitLab CI
Add automated SEO quality gates to your GitLab CI/CD pipelines.
▲ Vercel
Automatically validate SEO on every Vercel preview deployment before promoting to production.