Stephen
Hackett-Delaney
Full Stack Software Engineer
Adding Google Search Console to Claude Code
February 11, 2026
This is Part 4 of my Agentic Workflows series. Part 1 covers the workflow pattern, Part 2 walks through package audits, Part 3 covers article generation.
I built an SEO audit workflow a few sessions ago. It checks meta tags, structured data, sitemaps, robots.txt — everything you can verify from code. But it had a blind spot: what does Google actually see?
You can have perfect meta tags and still not show up in search results. The only way to know is to ask Google directly. That's what Google Search Console does, and now Claude Code can query it.
The Gap
My SEO audit workflow (/workflows/seo-audit.md) was covering about 40% of what a professional tool like Screaming Frog would check. The missing pieces included live crawl data, indexing status, and search performance — all things that live in Google Search Console.
The question was: could I plug GSC directly into Claude Code so it becomes part of the audit?
MCP: The Bridge
Model Context Protocol (MCP) lets you give Claude Code access to external tools. Instead of me logging into the GSC web UI, copying data, and pasting it into the conversation, an MCP server exposes GSC's API as tools Claude can call directly.
I used mcp-server-gsc, which wraps the Google Search Console API into 8 tools:
| Tool | What It Does |
| --- | --- |
| `list_sites` | Discover verified properties |
| `search_analytics` | Clicks, impressions, CTR, position |
| `enhanced_search_analytics` | Same, with a 25K row limit and regex filtering |
| `detect_quick_wins` | Pages ranking 4–10 with low CTR |
| `index_inspect` | Check whether a URL is indexed |
| `list_sitemaps` / `get_sitemap` | Sitemap status in GSC |
| `submit_sitemap` | Submit a sitemap |
Setting It Up
1. Create a Google Service Account
You need a service account with Search Console API access:
- Go to Google Cloud Console
- Create a project (or use an existing one)
- Enable the Google Search Console API
- Create a Service Account under IAM & Admin
- Download the JSON key file
- In Google Search Console, add the service account's email as a Full user on your property
2. Configure the MCP Server
Create a `.mcp.json` in your project root:

```json
{
  "mcpServers": {
    "gsc": {
      "command": "npx",
      "args": ["-y", "mcp-server-gsc"],
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/your/service-account.json"
      }
    }
  }
}
```

Restart Claude Code, and the GSC tools appear automatically.
What I Found in 5 Minutes
Here's where it gets interesting. I ran `list_sites` to confirm the connection:

```json
{
  "siteEntry": [{
    "siteUrl": "sc-domain:stephenhd.com",
    "permissionLevel": "siteFullUser"
  }]
}
```

Working. Then I ran `index_inspect` on every public route:
| URL | Status | Last Crawled |
| --- | --- | --- |
| `/` | Redirect → `/about` | Feb 9, 2026 |
| `/about` | Indexed | Feb 9, 2026 |
| `/projects` | Server error (5xx) | Nov 22, 2025 |
| `/articles` | Unknown to Google | Never |
Three issues I didn't know I had:

1. `/projects` was returning a 5xx to Googlebot. The page worked fine in my browser, but Google's last crawl attempt, back in November 2025, hit a server error, and Google never retried.
2. `/articles` didn't exist in Google's index at all. I'd added the articles section recently, but Google had never discovered it.
3. My sitemap wasn't registered in Google Search Console. Running `list_sitemaps` returned an empty response. The sitemap existed on the site (`next-sitemap` generates it on every build), but I'd never told Google about it.
I also pulled search analytics over a 90-day window:
```json
{
  "rows": [{
    "keys": ["https://www.stephenhd.com/about"],
    "clicks": 1,
    "impressions": 8,
    "ctr": 0.125,
    "position": 15.875
  }]
}
```

One click. Eight impressions. For a portfolio site that's been live for months. The data confirmed what the indexing issues already suggested: Google barely knows this site exists.
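Reading those numbers takes two small formulas: CTR is clicks over impressions, and average position maps to a results page at roughly ten organic results per page. A throwaway sketch of the arithmetic (`serpPage` is my own helper name):

```javascript
// CTR is clicks divided by impressions.
const ctr = (clicks, impressions) => clicks / impressions;

// Google shows roughly 10 organic results per page, so an
// average position of 15.875 lands on page 2 of the results.
const serpPage = (position) => Math.ceil(position / 10);

console.log(ctr(1, 8));        // 0.125, matching GSC's reported ctr
console.log(serpPage(15.875)); // 2
```

Page 2 with eight impressions means almost nobody is even seeing the listing, which is why fixing indexing comes before worrying about CTR.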
The OAuth Scope Bug
I tried to fix the sitemap issue programmatically:
```
> submit_sitemap("https://www.stephenhd.com/sitemap.xml")
Error: 403 Insufficient Permission
```

My service account has `siteFullUser` access in GSC, which should allow sitemap submission. So why the 403?
I traced it to the MCP server's source code. In dist/search-console.js:
```javascript
this.auth = new google.auth.GoogleAuth({
  keyFile: credentials,
  scopes: ['https://www.googleapis.com/auth/webmasters.readonly'],
});
```

The read-only OAuth scope is hardcoded. The package exposes a `submit_sitemap` tool but never requests the permission to use it. A classic case of the API wrapper not matching the API it wraps.
The fix is simple — change webmasters.readonly to webmasters — but it requires either patching the cached package, forking the repo, or contributing a PR upstream. For now, I submitted the sitemap manually in the GSC web UI.
This is worth knowing about when using MCP servers: property-level permissions and OAuth scopes are different things. Your account can have full access to a service, but if the MCP server only requests a read-only token, write operations will fail.
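The general check is worth encoding: compare the scopes a server actually requests against the scope each tool needs before trusting it with writes. A hypothetical sketch — the tool-to-scope mapping is my own, based on the Search Console API's two `webmasters` scopes, not anything the package publishes:

```javascript
const READ_ONLY = 'https://www.googleapis.com/auth/webmasters.readonly';
const READ_WRITE = 'https://www.googleapis.com/auth/webmasters';

// Hypothetical mapping of tools to the minimum scope each needs.
const REQUIRED_SCOPE = {
  list_sites: READ_ONLY,
  search_analytics: READ_ONLY,
  submit_sitemap: READ_WRITE, // write operation
};

// A read-write grant satisfies a read-only requirement, not vice versa.
function toolWillWork(tool, grantedScopes) {
  if (REQUIRED_SCOPE[tool] === READ_ONLY) {
    return grantedScopes.includes(READ_ONLY) || grantedScopes.includes(READ_WRITE);
  }
  return grantedScopes.includes(READ_WRITE);
}

// mcp-server-gsc hardcodes the read-only scope:
console.log(toolWillWork('submit_sitemap', [READ_ONLY]));   // false: the 403
console.log(toolWillWork('search_analytics', [READ_ONLY])); // true
```

The asymmetry in `toolWillWork` is the whole bug in miniature: property-level permission said yes, the granted token said no.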
Integrating Into the Workflow
With the tools tested, I added a new Step 2i: Search Console Data to my SEO audit workflow. It runs after the code-based checks and covers:
Indexing status — Run index_inspect on every public route. Flag pages with server errors, "unknown to Google" status, or stale crawl dates.
Sitemap verification — Run list_sitemaps to confirm Google knows about the sitemap. If not, flag for manual submission.
Search performance — Run search_analytics with a 90-day window to see which pages get impressions. For high-traffic sites, detect_quick_wins surfaces pages ranking 4–10 with low CTR — easy optimization targets.
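The quick-wins idea is simple enough to sketch by hand against raw `search_analytics` rows. This is my own illustration of the filter, not the package's implementation, and the CTR threshold is a guess:

```javascript
// Rows as returned by search_analytics:
// { keys, clicks, impressions, ctr, position }
function quickWins(rows, { minPosition = 4, maxPosition = 10, maxCtr = 0.05 } = {}) {
  // Keep pages that already rank on page one (positions 4-10)
  // but convert poorly; a title/meta tweak can move these.
  return rows.filter(
    (r) => r.position >= minPosition && r.position <= maxPosition && r.ctr <= maxCtr
  );
}

const rows = [
  { keys: ['/about'], clicks: 1, impressions: 8, ctr: 0.125, position: 15.9 },
  { keys: ['/projects'], clicks: 2, impressions: 120, ctr: 0.017, position: 6.2 },
];

console.log(quickWins(rows).map((r) => r.keys[0])); // [ '/projects' ]
```

On a site this small the filter returns almost nothing, which is itself the signal: there's no ranking to optimize until the indexing problems are fixed.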
The workflow doc also includes interpretation guidance. An empty search_analytics response doesn't mean something is broken — it's normal for new or low-traffic sites. Focus on getting pages indexed first, then worry about ranking.
What I Learned
Code-based SEO checks are necessary but not sufficient. My meta tags, structured data, and sitemap config were all correct. But /projects was invisible to Google because of a transient server error three months ago, and /articles was invisible because the sitemap was never submitted. No amount of code auditing would have caught those.
MCP servers need scope auditing. Don't assume a tool that's exposed by an MCP server actually works. The submit_sitemap tool existed, appeared in the tool list, and accepted parameters — but silently lacked the OAuth scope to execute. Test every tool before relying on it.
Submitting your sitemap is not automatic. I assumed that having a sitemap.xml was enough. It's not. Google can discover it eventually through crawling, but explicitly submitting it in Search Console is faster and gives you visibility into processing status.
What's Next
The SEO audit workflow now covers ~50% of a professional audit, up from ~40%. The remaining gaps:
- Heading hierarchy and alt text — code checks that can be added to the existing workflow
- PageSpeed Insights API — free, no auth, easy to integrate via `curl`
- Fix the MCP OAuth scope — contribute a PR upstream or fork
- Content strategy — keyword research and content planning (separate workflow)
The pattern from this series keeps holding: build a workflow, extend it with an MCP, discover real issues, document it. Each iteration makes the system more capable.
This article was written by Claude Code using /workflows/article-builder.md.
© 2026. All rights reserved