What you'll get.
Two tools — DataForSEO's API (via MCP) and Claude Code — running in a terminal. Claude pulls the data, analyzes it, builds the report, creates tasks in Notion, and deploys the whole thing. Your job is to ask the right questions and review.
- Run a full SEO audit — keywords, competitors, on-page, AI search visibility — from your terminal
- Generate interactive HTML reports automatically
- Query live SERP data, keyword volumes, and competitor metrics on demand
- Test how your brand appears across ChatGPT, Claude, Gemini, and Perplexity (GEO audit)
- Push action items directly into a Notion task database
- Do all of this for ~$3–6 per audit instead of $130–500/month per tool seat
What you need.
- Claude Code — Anthropic's CLI tool (install guide)
- DataForSEO account — pay-per-use API, no subscription (dataforseo.com)
- Node.js — for the MCP server connection
- Notion account — optional, for task management integration
- A terminal — Mac, Linux, or Windows with WSL
The setup.
Install Claude Code
```bash
npm install -g @anthropic-ai/claude-code
```

Run `claude` in your terminal and authenticate with your Anthropic account. That's it.
Get your DataForSEO credentials
Create an account at dataforseo.com. You get a username (your email) and an API password. DataForSEO is pay-per-use with no monthly fee; the API calls for a full audit typically cost well under $3.
After signing up, go to the API Dashboard → API Access to find your credentials. Your email is the username; the API password is different from your login password.
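DataForSEO's REST API authenticates every request with standard HTTP Basic auth built from these two values. A minimal sketch of how the header is assembled (the credentials are placeholders, and the balance-check endpoint mentioned in the comment is from DataForSEO's docs at the time of writing — verify it against their current reference):

```python
import base64

# Placeholders -- replace with your real credentials from the API Dashboard.
username = "your-email@example.com"   # login email
password = "your-api-key-here"        # API password, NOT your login password

# HTTP Basic auth: base64-encode "username:password".
token = base64.b64encode(f"{username}:{password}".encode()).decode()
headers = {
    "Authorization": f"Basic {token}",
    "Content-Type": "application/json",
}

# Sanity check: the token decodes back to the original credential pair.
decoded = base64.b64decode(token).decode()
print(decoded == f"{username}:{password}")   # True

# To confirm the account works without spending credits, send a GET with
# these headers to https://api.dataforseo.com/v3/appendix/user_data --
# it should return your account balance.
```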
Connect Claude Code to DataForSEO
Create a file called .mcp.json in your project directory:
```json
{
  "mcpServers": {
    "dataforseo": {
      "command": "npx",
      "args": ["-y", "dataforseo-mcp-server"],
      "env": {
        "DATAFORSEO_USERNAME": "your-email@example.com",
        "DATAFORSEO_PASSWORD": "your-api-key-here"
      }
    }
  }
}
```

When you start Claude Code in this directory, it automatically connects to DataForSEO. Claude can now call any DataForSEO API endpoint as a tool.
Connect Notion for task management
This is where the workflow goes from audit tool to action system. Claude pushes findings directly into a Notion database as prioritized tasks.
4a · Create an internal Notion integration
- Go to notion.so/my-integrations and click New integration.
- Name it (e.g. "SEO Audit Bot") and select your workspace.
- Copy the Internal Integration Secret — this is your API key.
- In your Notion workspace, share the target database with this integration.
4b · Create a task database in Notion
| Property | Type | Purpose |
|---|---|---|
| Task | Title | The action item |
| Priority | Select | Critical / High / Medium / Low |
| Status | Select | To Do / In Progress / Done |
| Category | Select | SEO / Content / Technical / GEO |
| Source | Text | Which audit finding generated this |
| Search Volume | Number | Monthly searches for related keyword |
| Notes | Rich text | Claude's analysis and recommendation |
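When Claude writes into this database, each row becomes a `POST /v1/pages` call to the Notion API. A sketch of the payload for one task, using the schema above — the property shapes (`title`, `select`, `rich_text`, `number`) follow the public Notion API, but the function and sample values are illustrative:

```python
# Build the JSON body for one task row. Property names must match the
# database schema exactly; property shapes follow the Notion API.
def task_payload(database_id: str, task: str, priority: str,
                 category: str, source: str, volume: int, notes: str) -> dict:
    return {
        "parent": {"database_id": database_id},
        "properties": {
            "Task":          {"title": [{"text": {"content": task}}]},
            "Priority":      {"select": {"name": priority}},
            "Status":        {"select": {"name": "To Do"}},
            "Category":      {"select": {"name": category}},
            "Source":        {"rich_text": [{"text": {"content": source}}]},
            "Search Volume": {"number": volume},
            "Notes":         {"rich_text": [{"text": {"content": notes}}]},
        },
    }

# Hypothetical example values.
payload = task_payload("abc123def456", "Write pillar page for 'perenner'",
                       "High", "Content", "Keyword gap audit", 480,
                       "Competitors rank top 3; we have no page.")
```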
4c · Add Notion to Claude Code
Claude Code has native Notion MCP support — connect it through the Anthropic integration settings, or add it manually in .mcp.json. Once connected, Claude can search pages, create pages, and update databases directly. Just tell Claude "create a task in Notion" and it handles the API calls.
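If you go the manual route, the entry looks much like the DataForSEO one. The package name and env variable below reflect Notion's open-source MCP server at the time of writing — check its README before copying, as these details change:

```json
{
  "mcpServers": {
    "notion": {
      "command": "npx",
      "args": ["-y", "@notionhq/notion-mcp-server"],
      "env": {
        "NOTION_TOKEN": "ntn_your-integration-secret"
      }
    }
  }
}
```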
4d · Copy your database ID
Open your Notion database as a full page. The URL looks like:
```
notion.so/your-workspace/abc123def456...?v=...
```

The `abc123def456...` part is your database ID. You'll give this to Claude when creating tasks.
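If you want to grab the ID programmatically, it's the last dash-separated chunk of the final path segment, with the `?v=...` query dropped. A small sketch:

```python
from urllib.parse import urlparse

def notion_database_id(url: str) -> str:
    """Extract the database ID from a Notion database URL.

    Handles both bare IDs (/workspace/abc123...) and slugged page
    names (/workspace/My-Tasks-abc123...); ignores the ?v=... query.
    """
    last_segment = urlparse(url).path.rsplit("/", 1)[-1]
    return last_segment.rsplit("-", 1)[-1]

print(notion_database_id("https://notion.so/acme/My-Tasks-abc123def456?v=777"))
# abc123def456
```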
Run your first audit.
Start Claude Code in your project directory:
```
claude
```

Now just ask for what you need in plain language.
Keyword research
Claude calls dataforseo_labs_google_ranked_keywords and returns a formatted table with keywords, volumes, positions, and ranking URLs.
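Under the hood, DataForSEO v3 responses share a `tasks → result → items` envelope that Claude flattens into the table. A simplified sketch of that parse — the item field names here are an assumption based on the Labs API's general shape, so check the live response for the exact schema:

```python
# Illustrative parse of a ranked_keywords-style response into table rows.
def keyword_rows(response: dict) -> list[tuple]:
    rows = []
    for item in response["tasks"][0]["result"][0]["items"]:
        kw = item["keyword_data"]
        serp = item["ranked_serp_element"]["serp_item"]
        rows.append((kw["keyword"],
                     kw["keyword_info"]["search_volume"],
                     serp["rank_absolute"],
                     serp["url"]))
    return rows

# Mocked response in the assumed envelope shape.
sample = {"tasks": [{"result": [{"items": [
    {"keyword_data": {"keyword": "perenner",
                      "keyword_info": {"search_volume": 12100}},
     "ranked_serp_element": {"serp_item": {
         "rank_absolute": 4, "url": "https://example.se/perenner"}}},
]}]}]}

for row in keyword_rows(sample):
    print(row)
```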
Competitor analysis
Claude calls dataforseo_labs_google_competitors_domain and gives you shared keywords, ETV, and competitive positioning.
On-page crawl
Claude calls on_page_instant_pages for a technical snapshot: TTFB, DOM size, content rate, missing elements, and recommendations.
AI search visibility (GEO audit)
This is something no traditional SEO tool offers:
Claude calls ai_optimization_llm_response for each model and compares: is the brand mentioned? Cited as a source? What position? You get a cross-platform visibility matrix.
Pro tip · Use real search queries
Don't make up queries like "best garden store in Sweden". Pull actual keyword data from DataForSEO first (keyword_ideas, keyword_suggestions), then test those real queries against AI models. The results are much more actionable.
Generate a report.
Claude writes a complete, self-contained HTML file with CSS, tables, and navigation. No dependencies — just open it in a browser.
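"Self-contained" concretely means the CSS and data are inlined, so the file needs no server or assets. A toy sketch of the kind of generator at work (structure and styling are my illustration, not Claude's actual output):

```python
def render_report(title: str, rows: list[tuple]) -> str:
    """Render keyword rows into a single self-contained HTML string."""
    body = "\n".join(
        f"<tr><td>{kw}</td><td>{vol:,}</td><td>{pos}</td></tr>"
        for kw, vol, pos in rows
    )
    # Inline <style> keeps the file dependency-free.
    return f"""<!DOCTYPE html>
<html><head><meta charset="utf-8"><title>{title}</title>
<style>body{{font-family:sans-serif}} td{{padding:4px 12px}}</style></head>
<body><h1>{title}</h1>
<table><tr><th>Keyword</th><th>Volume</th><th>Position</th></tr>
{body}</table></body></html>"""

html = render_report("SEO Audit", [("perenner", 12100, 4)])
# open("report.html", "w").write(html)  # then open it in any browser
```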
You can deploy it instantly:
Claude creates the repo, pushes the file, enables Pages, and gives you a shareable URL. From "audit this site" to "here's a live report" in 10–15 minutes.
Push action items to Notion.
This is where the audit becomes operational. Instead of a PDF that sits in someone's inbox, the findings become trackable tasks:
Claude creates a Notion page for each action item with:
- Clear task title (e.g. "Write /tips-rad/ article for 'tåliga perenner' — 480 searches/mo")
- Priority based on impact (search volume × competitive gap)
- Category tag (SEO, Content, Technical, GEO)
- Detailed notes with Claude's analysis and specific recommendations
- Source reference back to the audit finding
The full loop — Audit → Report → Tasks → Track
Run the audit monthly. Each time, Claude checks which Notion tasks are done, measures impact (did rankings improve?), and creates new tasks based on fresh data. The audit isn't a one-time deliverable — it's a recurring workflow.
Full audit in 5 prompts.
Here's the exact prompt chain for a complete audit:
- Pull the top 50 ranked keywords for [domain] in Sweden. Show volume, position, and URL.
- Who are their top 10 organic competitors? Show keyword overlap.
- Crawl the homepage and top 3 landing pages. Check technical SEO.
- Test these 4 real search queries against ChatGPT, Claude, Gemini, and Perplexity: [queries from step 1]
- Create an HTML report, deploy to GitHub Pages, and push the top 10 actions to our Notion database.
Total time: ~15 minutes. Total cost: ~$3–6.
What it costs.
Real cost breakdown from a recent audit of blomsterlandet.se:
| API call | Cost |
|---|---|
| Ranked keywords (30 results) | $0.05 |
| Competitor domain analysis | $0.05 |
| On-page homepage crawl | $0.02 |
| ChatGPT scraper (4 queries) | $0.08 |
| Claude LLM response (2 queries) | $0.10 |
| Gemini LLM response (2 queries) | $0.14 |
| Perplexity LLM response (2 queries) | $0.04 |
| Total DataForSEO | ~$0.50 |
| Claude Code compute | ~$2–5 |
| Total per audit | ~$3–6 |
The new stack: ~$3–6 per audit, pay-per-use. The old stack: $130–500/month per tool seat. You'd need to run 700+ audits per year to match the cost of one SEMrush seat.
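The 700+ figure is straightforward break-even arithmetic, assuming a mid-range seat price and the upper end of the per-audit cost:

```python
seat_per_month = 350   # assumed mid-range of the $130-500 seat prices above
audit_cost = 6         # upper bound of the ~$3-6 per-audit estimate
breakeven = seat_per_month * 12 / audit_cost
print(breakeven)       # 700.0 audits per year
```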
Better & worse than traditional tools.
What's better
1 · AI-native analysis, not just data
SEMrush shows you that a keyword ranks #15. Claude says: "hallon ranks #15 with 135k monthly searches. Here's a content brief, the competing pages, and what you need to write to move to top 5. Estimated traffic gain: +50,000 visits/month."
2 · GEO auditing
No traditional tool tests AI search visibility. We query ChatGPT, Claude, Gemini, and Perplexity in the same workflow. This is increasingly where buying decisions start.
3 · Custom reports in minutes
Instead of exporting CSVs and building decks, Claude generates complete interactive reports. Same data, fraction of the time.
4 · Audit → Notion tasks in one flow
The audit doesn't end at a PDF. Findings become trackable tasks with priority, category, and detailed notes. Next month, Claude checks what's done and creates new tasks.
5 · Everything is reproducible
Run the same audit next month with one prompt. Same process every time — no clicking through dashboards.
What's worse
Backlinks
DataForSEO has a backlinks API, but Ahrefs' crawler is still the gold standard. If backlink analysis is your core work, keep Ahrefs for that.
Historical tracking
No built-in rank tracking over time. You need your own pipeline (BigQuery + cron) to track positions weekly. Traditional tools do this out of the box.
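The simplest version of that pipeline is a scheduled job that appends a dated snapshot of positions each week. A minimal sketch using a local CSV in place of BigQuery (filename and structure are my own choices):

```python
import csv
import datetime
import pathlib

LOG = pathlib.Path("rank_history.csv")

def record_positions(positions: dict[str, int]) -> None:
    """Append today's keyword -> position snapshot; run weekly via cron."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "keyword", "position"])
        today = datetime.date.today().isoformat()
        for keyword, position in positions.items():
            writer.writerow([today, keyword, position])

record_positions({"perenner": 4, "hallon": 15})
```

Point a plotting notebook (or BigQuery load job) at the accumulated file and you have position-over-time for free.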
No GUI
This is a terminal workflow. If you need to hand a tool to a non-technical team member, SEMrush's interface wins.
AI reliability
LLM outputs are non-deterministic. You need human review — this is an analyst tool, not an autopilot.
Who this is for.
- Agencies — doing SEO audits for multiple clients. Cost savings compound fast.
- In-house SEO teams — with technical capacity who want deeper analysis.
- Consultants — who want sophisticated reports without enterprise tooling.
- Anyone curious about GEO — this is the easiest way to test it.
Not for: teams that need point-and-click dashboards, anyone without terminal comfort, or use cases where historical rank tracking is the primary need.
Getting started.
- Install Claude Code — `npm install -g @anthropic-ai/claude-code`
- Create a DataForSEO account and get your API credentials.
- Create `.mcp.json` in your project folder with the config above.
- Optional — set up Notion integration for task management.
- Run `claude` and ask: "Pull the top 20 keywords for [your-domain]".
- Ask Claude to build an HTML report and push tasks to Notion.