# 🤖 SEO Testing with GitHub Copilot
Integrate SEODiff evaluation into GitHub Copilot agent workflows.
GitHub Copilot's agent mode can call external APIs via MCP servers or custom instructions. Configure it to validate the SEO quality of pages you're editing using SEODiff's agentic evaluation endpoint.
## System Prompt / Configuration
Copy this prompt and paste it into your IDE’s AI configuration:
````markdown
# SEO Quality Instructions for GitHub Copilot
# Add to .github/copilot-instructions.md in your repository

## SEO Evaluation Workflow

When editing HTML pages, React components, or any template that generates web pages:

1. After making changes, evaluate SEO quality using the SEODiff API:

   ```bash
   curl -X POST "https://seodiff.io/api/v1/agent/evaluate" \
     -H "Authorization: Bearer $SEODIFF_API_KEY" \
     -H "Content-Type: application/json" \
     -d '{
       "urls": ["<page_url>"],
       "assertions": [
         {"rule": "has_h1"},
         {"rule": "has_schema"},
         {"rule": "no_placeholders"},
         {"rule": "max_js_ghost_ratio", "value": 0.1}
       ],
       "wait": true
     }'
   ```

2. Fix any failing assertions before marking the task complete.
3. Target an ACRI score of 70+ for all pages.
````

## Setup
Add the instructions above to your repository's `.github/copilot-instructions.md`. Copilot will follow them in agent mode.
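For CI scripts or other agents that cannot shell out to curl, the same request can be assembled in Python. This is a minimal sketch: the endpoint and payload mirror the curl example above, while the `evaluate` helper and its response handling are assumptions to adapt to your setup.

```python
import json
import urllib.request

SEODIFF_ENDPOINT = "https://seodiff.io/api/v1/agent/evaluate"

def build_payload(urls):
    """Assemble the same request body the curl example sends."""
    return {
        "urls": list(urls),
        "assertions": [
            {"rule": "has_h1"},
            {"rule": "has_schema"},
            {"rule": "no_placeholders"},
            {"rule": "max_js_ghost_ratio", "value": 0.1},
        ],
        "wait": True,
    }

def evaluate(urls, api_key):
    """POST the payload and return the parsed JSON response (sketch)."""
    req = urllib.request.Request(
        SEODIFF_ENDPOINT,
        data=json.dumps(build_payload(urls)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

With `wait: true` the call blocks until evaluation completes, so a CI job can fail the build directly on the result.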
## Example Interaction
**User:** "Fix the blog post template — it's not showing rich results."

**Copilot:**
1. Evaluates current template: `has_schema` FAILED (no JSON-LD found)
2. Adds Article schema with `headline`, `datePublished`, `author`
3. Re-evaluates: all assertions pass, `schema_coverage: 100%`
4. "Added Article JSON-LD schema. ACRI improved from 52 to 71."
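The fix-and-re-evaluate loop in this interaction depends on pulling failed assertions out of the API response. A sketch of that step, assuming a hypothetical response shape of `results` → `assertions` → `passed` (check the SEODiff API reference for the real schema):

```python
def failed_assertions(response):
    """Collect (url, rule) pairs for every assertion that did not pass.

    The response shape used here is an assumption for illustration,
    not SEODiff's documented schema.
    """
    failures = []
    for page in response.get("results", []):
        for check in page.get("assertions", []):
            if not check.get("passed", False):
                failures.append((page.get("url"), check.get("rule")))
    return failures
```

An agent would loop until this returns an empty list before marking the task complete.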
## Assertion Rules to Use
The best assertion rules for AI agent workflows:
- `has_h1`: Ensure every page has exactly one H1 heading tag.
- `has_schema`: Ensure every page has valid JSON-LD schema markup for rich results.
- `no_placeholders`: Find template variables like `{{city}}` or `[TBD]` that leaked into production HTML.
- `max_token_bloat`: Detect when boilerplate overwhelms useful content for LLM crawlers.
- `max_js_ghost_ratio`: Flag pages where content is rendered client-side and invisible to crawlers.
- `min_word_count`: Prevent thin content by requiring a minimum number of words per page.
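Some of these rules can be approximated locally before spending an API call. For example, a rough pre-check in the spirit of `no_placeholders` (a sketch, not SEODiff's actual implementation; the patterns come from the examples above):

```python
import re

# Placeholder shapes described by the no_placeholders rule
PLACEHOLDER_PATTERNS = [
    re.compile(r"\{\{\s*\w+\s*\}\}"),       # unrendered template vars like {{city}}
    re.compile(r"\[TBD\]", re.IGNORECASE),  # editorial placeholders like [TBD]
]

def find_placeholders(html):
    """Return every placeholder-looking fragment leaked into the HTML."""
    hits = []
    for pattern in PLACEHOLDER_PATTERNS:
        hits.extend(pattern.findall(html))
    return hits
```

Running this on rendered output in a pre-commit hook catches the most obvious leaks early; the API assertion remains the authoritative check.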