🐙 SEO Regression Testing in GitHub Actions
Block bad deployments with automated SEO checks in your GitHub Actions CI/CD pipeline.
GitHub Actions is one of the most widely used CI/CD platforms among open-source and startup teams. SEODiff’s /validate endpoint integrates in a single workflow step — no marketplace action needed.
Configuration
Copy this configuration into your project:
```yaml
name: SEO Regression Check

on:
  pull_request:
    branches: [main]

jobs:
  seo-check:
    runs-on: ubuntu-latest
    steps:
      - name: Wait for preview deployment
        # Replace with your deployment step/wait logic
        run: echo "Preview at $PREVIEW_URL"

      - name: SEODiff Validate
        run: |
          RESULT=$(curl -s -w "\n%{http_code}" \
            -X POST "https://seodiff.io/api/v1/validate" \
            -H "Authorization: Bearer ${{ secrets.SEODIFF_API_KEY }}" \
            -H "Content-Type: application/json" \
            -d "{\"url\": \"$PREVIEW_URL\", \"wait\": true}")
          HTTP_CODE=$(echo "$RESULT" | tail -n 1)
          BODY=$(echo "$RESULT" | head -n -1)
          echo "$BODY" | jq .
          if [ "$HTTP_CODE" -eq 409 ]; then
            echo "::error::SEO regression detected!"
            exit 1
          fi

      - name: Evaluate pSEO pages
        run: |
          curl -sf -X POST "https://seodiff.io/api/v1/agent/evaluate" \
            -H "Authorization: Bearer ${{ secrets.SEODIFF_API_KEY }}" \
            -H "Content-Type: application/json" \
            -d '{
              "urls": ["'"$PREVIEW_URL"'/city/new-york", "'"$PREVIEW_URL"'/city/los-angeles"],
              "assertions": [
                {"rule": "has_h1"},
                {"rule": "no_placeholders"},
                {"rule": "min_word_count", "value": 300},
                {"rule": "has_schema"}
              ],
              "wait": true
            }' | jq .
```
Setup Steps
Add API key to repository secrets
Go to Settings → Secrets and variables → Actions → New repository secret. Name it SEODIFF_API_KEY.
Add the workflow file
Create .github/workflows/seo-check.yml with the config above.
Open a pull request
The workflow runs automatically. SEODiff returns 200 (pass) or 409 (fail); if you mark the check as required in your branch protection rules, GitHub blocks the merge whenever an SEO regression is found.
What Gets Checked
The /validate endpoint runs a surface scan and checks for SEO regressions. For deeper testing, combine it with the /agent/evaluate endpoint to add custom assertion rules:
no_placeholders
Find template variables like {{city}} or [TBD] that leaked into production HTML.
max_token_bloat
Detect when boilerplate overwhelms useful content for LLM crawlers.
has_schema
Ensure every page has valid JSON-LD schema markup for rich results.
min_schema_count
Require a minimum number of JSON-LD schema blocks per page.
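Both schema rules reduce to counting JSON-LD blocks in the rendered HTML. A crude local version greps for the script type attribute (real validation would also parse the JSON payloads; the sample HTML is hypothetical):

```shell
HTML='<script type="application/ld+json">{"@type":"FAQPage"}</script>
<script type="application/ld+json">{"@type":"LocalBusiness"}</script>'
# grep -c counts matching lines, which approximates one JSON-LD block per line here
COUNT=$(printf '%s' "$HTML" | grep -c 'application/ld+json')
echo "schema blocks: $COUNT"
if [ "$COUNT" -lt 1 ]; then
  echo "has_schema failed"
fi
```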