xeve.io had been live for weeks. The landing page was converting, the dashboard worked, Google had indexed 60+ pages. SEO was "fine." But "fine" is not a number, and I had no idea what I was missing.
Then I found claude-seo — an open-source skill package by Daniel Agrici that adds 13 SEO audit commands to Claude Code. I installed it, pointed it at my site, and the results were humbling.
The Setup: SEO as Code
claude-seo is a collection of Markdown skill files that install into ~/.claude/skills/. Each skill defines a specific audit domain — what to check, what severity to assign, what output format to use. Installation is two commands:
git clone --depth 1 https://github.com/AgriciDaniel/claude-seo.git
bash claude-seo/install.sh
After that, you get slash commands like /seo-technical, /seo-schema, /seo-content, and more. No external API. No SaaS dashboard. No monthly fee. The AI reads your actual source code, fetches your live pages, checks your HTTP headers, validates your structured data, and cross-references everything. It is opinionated — it knows what a CSP header should look like, what touch target sizes Google requires, what schema.org types are valid for your page type.
The key insight behind claude-seo is that Claude Code can already do everything a technical SEO tool does. It can read robots.txt, parse XML sitemaps, validate JSON-LD, check HTTP response headers, grep for meta tags, and analyze JavaScript rendering patterns. The skills just tell it what to look for and how to score it.
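As a rough sketch of the kind of check these skills drive — the function name, header list, and scoring here are my own illustrations, not claude-seo's actual rules:

```typescript
// Minimal sketch: flag missing SEO/security-relevant HTTP response headers.
// The expected-header list is an illustrative assumption.
const EXPECTED_HEADERS = [
  "content-security-policy",
  "x-content-type-options",
  "referrer-policy",
  "permissions-policy",
];

function auditSecurityHeaders(headers: Record<string, string>): string[] {
  // Normalize to lowercase, since HTTP header names are case-insensitive.
  const present = new Set(Object.keys(headers).map((h) => h.toLowerCase()));
  return EXPECTED_HEADERS.filter((h) => !present.has(h));
}

// Example: a response that sets only one of the expected headers.
const missing = auditSecurityHeaders({ "X-Content-Type-Options": "nosniff" });
console.log(missing); // the other three expected headers, still unset
```

The skill's job is essentially to encode lists like `EXPECTED_HEADERS` — the opinions — while Claude Code supplies the fetching, parsing, and codebase context.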
The Audit: 81/100
I ran /seo-technical and Claude spawned 5 parallel agents, each covering a different audit category. They ran simultaneously — one checking crawlability, one checking security headers, one analyzing URL structure and JS rendering, one measuring Core Web Vitals readiness, one validating structured data. Total wall time: about 2 minutes.
The report came back with a score of 81/100 across 9 categories:
- JS Rendering: 98/100 — all public pages server-rendered, zero SEO content behind client JS. This was validating — it meant the Next.js App Router architecture was doing its job.
- URL Structure: 95/100 — clean URLs, logical hierarchy, consistent trailing slashes.
- Structured Data: 92/100 — 35 JSON-LD blocks across 19 files, but missing `availability` on download page schemas.
- Mobile: 70/100 — this was the surprise. Multiple touch targets below Google's 48x48px minimum. The hamburger button was 24x24px. Footer links were 11px text with 8px spacing.
- Security: 78/100 — no Content-Security-Policy header at all, leaving the site without its main defense-in-depth layer against XSS.
What It Found: The Hits
Some findings were things I would never have caught manually:
1. Every image on the site used raw <img> tags. Fifteen instances across 13 files. No next/image, no automatic WebP/AVIF conversion, no responsive srcset, no width/height attributes for CLS prevention. I had been building the dashboard for months and just never thought about it — the images were small avatars and album art, easy to overlook.
2. The live robots.txt was missing 4 rules. I had added blocking rules for CCBot, anthropic-ai, and cohere-ai in the source code, but never redeployed. Training crawlers had been scraping the site unrestricted for days while I thought they were blocked.
3. The Reddit Pixel was importing the entire Supabase SDK on every page. It did this to check if the current user had an email for advanced matching. On the landing page — where users are not logged in — this was pure waste. An unnecessary dynamic import on every single page load.
4. The Inter font was imported but never used. The design system uses monospace everywhere (JetBrains Mono). Inter was loaded, adding to the font download, but zero elements referenced it. A leftover from an earlier design iteration.
5. An internal IP address was exposed in client-side code. A placeholder in a form field contained http://157.245.104.156:8140 — a real server IP visible to anyone who inspected the page source.
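The robots.txt drift in finding 2 is worth dwelling on: in Next.js, a robots.ts file only takes effect once it is deployed. A minimal sketch of the blocking rules — the crawler names are the ones from the audit; the file layout follows the standard Next.js convention, and the type and sitemap URL are stand-ins:

```typescript
// app/robots.ts — sketch of blocking AI training crawlers.
// These rules only reach production on deploy, which is exactly
// the source-vs-live drift the audit caught.
type RobotsRule = {
  userAgent: string | string[];
  allow?: string;
  disallow?: string;
};

function robots(): { rules: RobotsRule[]; sitemap: string } {
  return {
    rules: [
      { userAgent: "*", allow: "/" },
      // Block AI training crawlers (names as listed in the audit).
      { userAgent: ["CCBot", "anthropic-ai", "cohere-ai"], disallow: "/" },
    ],
    sitemap: "https://xeve.io/sitemap.xml", // placeholder URL
  };
}
```

A source-level audit compares this file against the live /robots.txt, which is how the undeployed rules surfaced.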
The Fix: One Session, 20 Files
After the audit, I told Claude to fix all high and low priority issues. It made 20 file changes in one pass:
- Added a Content-Security-Policy header to `next.config.ts` with whitelisted sources for scripts (Google Analytics, Reddit Pixel), images (Supabase, Spotify, GitHub), and connections.
- Replaced all 11 `<img>` tags with `next/image` across 10 files, adding explicit width/height props and the `avatars.githubusercontent.com` domain to `remotePatterns`.
- Fixed touch targets — hamburger button to 48x48px, theme toggle to 48x48px on mobile, mobile nav links with proper vertical padding.
- Added `availability` to all three download page schemas and added the missing `downloadUrl` to the iOS page.
- Added BreadcrumbList to the integrations index (the only index page missing it).
- Removed the unused Inter font import.
- Expanded the Permissions-Policy to also deny payment, USB, and display-capture APIs.
- Deferred the Reddit Pixel's Supabase import to only load on dashboard pages where the user is actually authenticated.
- Replaced the internal IP with a generic placeholder.
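For reference, here is a sketch of how a CSP value for next.config.ts can be assembled. The whitelisted origins are illustrative guesses for this stack, not the exact policy that shipped:

```typescript
// Sketch: build a Content-Security-Policy header value from a
// directive map. Origins below are illustrative, not the real policy.
const cspDirectives: Record<string, string[]> = {
  "default-src": ["'self'"],
  "script-src": ["'self'", "https://www.googletagmanager.com", "https://www.redditstatic.com"],
  "img-src": ["'self'", "data:", "https://*.supabase.co", "https://avatars.githubusercontent.com"],
  "connect-src": ["'self'", "https://*.supabase.co", "https://www.google-analytics.com"],
};

function buildCsp(directives: Record<string, string[]>): string {
  // CSP directives are semicolon-separated; sources are space-separated.
  return Object.entries(directives)
    .map(([name, sources]) => `${name} ${sources.join(" ")}`)
    .join("; ");
}

// In next.config.ts this value would be returned from headers(), e.g.:
// { key: "Content-Security-Policy", value: buildCsp(cspDirectives) }
```

Generating the policy from a map keeps each third-party integration a one-line change instead of a string edit buried in config.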
Build passed. Deployed. The robots.txt and sitemap synced to production in the same push.
What I Learned About SEO Tooling
Traditional SEO tools scan the output. This scans the source. Tools like Ahrefs and Screaming Frog crawl your rendered pages — they see what Googlebot sees. That is valuable. But they cannot tell you that your Reddit Pixel is importing the Supabase SDK unnecessarily, or that your robots.ts source has rules that are not live, or that your ThemeToggle component has a 24px tap target. Source-level analysis catches a different class of issues.
The audit-to-fix loop matters more than the audit. Getting a report that says "no CSP header" is easy. Getting the CSP header written correctly — with the right sources whitelisted for your specific stack (Google Analytics, Supabase, Reddit, Spotify CDN) — requires understanding the codebase. Because Claude Code already has full context of the project, the fix is informed by the actual code, not a generic recommendation.
Parallel agents make comprehensive audits fast. Running 5 specialists in parallel (crawlability, security, performance, URL structure, structured data) takes 2 minutes instead of 10. Each agent focuses on its domain and returns a structured report. The parent agent merges them into a unified scorecard.
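The fan-out/merge shape is simple to sketch. The specialist names and the plain average below are my assumptions — claude-seo's actual subagents and score weighting may differ:

```typescript
// Sketch: run independent audit "specialists" concurrently and merge
// their reports into one scorecard. Categories and the unweighted
// average are illustrative.
type Report = { category: string; score: number; findings: string[] };

async function runAudit(specialists: Array<() => Promise<Report>>) {
  // All specialists start at once; total wall time is the slowest one.
  const reports = await Promise.all(specialists.map((run) => run()));
  const overall = Math.round(
    reports.reduce((sum, r) => sum + r.score, 0) / reports.length,
  );
  return { overall, reports };
}

// Usage with stubbed specialists:
const stub = (category: string, score: number) => async (): Promise<Report> => ({
  category,
  score,
  findings: [],
});

runAudit([stub("crawlability", 90), stub("security", 78)]).then((card) => {
  console.log(card.overall); // 84
});
```

The independence of the domains is what makes the parallelism free: a security check never needs to wait on a structured-data check.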
The 80/20 of technical SEO is boring. The highest-impact fixes were not clever. They were: add a security header that should have been there from day one, use the image component that Next.js provides, make buttons big enough to tap on a phone. None of this is novel. But it was not done, and it would have stayed undone without a systematic audit.
The Tool: claude-seo
claude-seo by Daniel Agrici is open source (MIT license, 2.8k stars) and includes 13 skills with 7 specialized subagents. The commands I used on xeve.io:
- `/seo-technical` — full 9-category technical audit (crawlability, security, Core Web Vitals, JS rendering, etc.)
- `/seo-schema` — detect, validate, and generate Schema.org structured data
- `/seo-content` — E-E-A-T analysis, readability, thin content detection
- `/seo-geo` — AI Overviews, ChatGPT search, Perplexity optimization
- `/seo-sitemap` — sitemap validation and generation
- `/seo-hreflang` — international SEO and hreflang validation
- `/seo-audit` — full site audit that delegates to all specialists
It also has skills for image optimization, programmatic SEO, competitor comparison page generation, and strategic planning with industry-specific templates. There is an optional DataForSEO MCP integration for SERP data if you want it, but the core audit runs with zero external dependencies.
No API keys required for the base functionality. Just your codebase, Claude Code, and a two-command install. If you are building with Next.js (or any framework), the AI already has access to everything it needs — your routes, your meta tags, your structured data, your config files, your HTTP headers. The skills just point it in the right direction.
The xeve.io technical SEO score went from 81 to an estimated 93 in one session. Twenty files changed, zero regressions, build passed on the first try. That is the workflow I wanted — audit, fix, ship, move on.