Stephen Hackett-Delaney

Full Stack Software Engineer


GA4 Event Tracking via the GTM API

workflows · automation · claude-code · analytics · google-tag-manager

February 11, 2026

This is Part 5 of my Agentic Workflows series. Part 1 covers the workflow pattern, Part 2 walks through package audits, Part 3 covers article generation, Part 4 adds Google Search Console.


My site had Google Tag Manager installed but it wasn't doing anything useful. GA4 was collecting page views and generic scroll events — the defaults you get by existing. I couldn't answer basic questions like "how many visitors click my contact links?" or "which projects drive people to check out the live work?"

This session set up proper event tracking. The interesting part wasn't what we tracked — it was how we configured it.

Starting with Strategy, Not Tags

The first thing I built was a reusable workflow document (/workflows/google-analytics.md) that works across different site types — portfolio, e-commerce, SaaS, blog, agency, local business. The key insight: a business analysis step should drive event selection. Different sites track different things.

For a portfolio site, the business profile looks like this:

```
Site type: Portfolio (with blog)
Primary goal: Brand awareness + lead generation
Key conversions: Contact click, resume download
Key interactions: LinkedIn, email, outbound project clicks, article reading
```

That profile drives a lookup against an event matrix mapping site types to recommended events. The portfolio column gave me these:

| Event | Type | Trigger |
|---|---|---|
| contact_click | Conversion | LinkedIn or mailto link clicks |
| resume_download | Conversion | Click on /cv.pdf |
| outbound_project_click | Conversion | External links on project pages |
| article_read | Engagement | 60 seconds on an article page |

Four events. No dataLayer.push() code needed — all capturable with GTM click and timer triggers.
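The lookup step can be sketched as a dictionary keyed by site type. Only the portfolio row below comes from the workflow; the blog entry is an illustrative assumption about what the matrix might contain for other site types:

```python
# Hypothetical event matrix: site type -> [(event_name, event_type), ...].
# The portfolio entries match the table above; others are placeholders.
EVENT_MATRIX = {
    "portfolio": [
        ("contact_click", "conversion"),
        ("resume_download", "conversion"),
        ("outbound_project_click", "conversion"),
        ("article_read", "engagement"),
    ],
    "blog": [
        ("article_read", "engagement"),
        ("newsletter_signup", "conversion"),
    ],
}

def recommended_events(site_type: str) -> list[tuple[str, str]]:
    """Return (event_name, event_type) pairs recommended for a site type."""
    return EVENT_MATRIX[site_type]
```

The business profile answers "what kind of site is this?"; the matrix answers "so what should it track?" Keeping the two separate is what makes the workflow reusable across site types.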

The MCP Question

With the event plan approved, the next question was how to configure GTM. Three options:

Option 1: The GTM UI. Click through the web interface to create each trigger and tag manually. Works, but it's slow and can't be automated or reproduced.

Option 2: An MCP server. I found two third-party GTM MCP servers — Stape's (TypeScript, remote-hosted) and paolobietolini's (Go, self-hostable). Google has an official MCP for GA4 data queries, but nothing for GTM configuration.

Option 3: Direct API calls. The GTM API v2 is just REST. No MCP wrapper needed.

The Stape MCP routes your GTM API traffic through gtm-mcp.stape.ai — your container config, tag structures, and trigger conditions all pass through a third-party server. That's a non-starter for something that doesn't need to be remote. The Go server is self-hostable but adds Docker and a new language to the mix.

For a one-time setup of 4 triggers and 4 tags, direct API calls won. No dependencies, no third parties, and a clear picture of what's actually happening.

Direct API: The Setup

I already had a Google service account from integrating GSC in Part 4. Reusing it required two steps:

  1. Grant GTM access — add the service account email in GTM's User Management with Publish permission on the container
  2. Enable the Tag Manager API — one click in Google Cloud Console

Then Claude Code could mint an OAuth token from the service account and hit tagmanager.googleapis.com directly.
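A minimal token-minting sketch, assuming the google-auth package and a local service-account key file (the filename is a placeholder). The three scopes are the ones this session ended up needing:

```python
# Scopes for creating triggers/tags, versioning, and publishing.
GTM_SCOPES = [
    "https://www.googleapis.com/auth/tagmanager.edit.containers",
    "https://www.googleapis.com/auth/tagmanager.edit.containerversions",
    "https://www.googleapis.com/auth/tagmanager.publish",
]

def mint_token(key_file: str = "service-account.json",
               scopes: list[str] = GTM_SCOPES) -> str:
    """Exchange a service-account key for a short-lived OAuth access token."""
    # Imported lazily so the scope list is usable without google-auth installed.
    from google.oauth2 import service_account
    from google.auth.transport.requests import Request

    creds = service_account.Credentials.from_service_account_file(
        key_file, scopes=scopes)
    creds.refresh(Request())  # performs the actual token exchange
    return creds.token
```

The returned bearer token then goes in the `Authorization` header of every call to tagmanager.googleapis.com.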

What Was Already There

Before creating anything, I queried the existing workspace:

```
Workspace 10 (Default):
  Tags:  google-analytics (GA4 Config, G-1JQYEMFCPY, All Pages) ✓
  Triggers: Page View (orphaned — no tags reference it) ✗
  Built-in Variables: Click URL, Click Text, Page Path, etc. ✓
```

The GA4 Config tag was correctly set up — measurement ID G-1JQYEMFCPY firing on every page. But there was an orphaned "Page View" trigger that nothing used. Cleaned that up.

Creating Triggers via the API

Each trigger is a POST request with a JSON body describing the conditions. Here's the contact links trigger — fire on link clicks where the URL matches LinkedIn or mailto:

```json
{
  "name": "Click — Contact Links",
  "type": "linkClick",
  "autoEventFilter": [{
    "type": "matchRegex",
    "parameter": [
      { "key": "arg0", "value": "{{Click URL}}" },
      { "key": "arg1", "value": "(linkedin\\.com|mailto:)" }
    ]
  }]
}
```
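Sketched with nothing but the standard library, the POST looks like this (the account and container IDs below are placeholders, not my real ones):

```python
import json
import urllib.request

GTM_API = "https://tagmanager.googleapis.com/tagmanager/v2"

def triggers_url(account_id: str, container_id: str, workspace_id: str) -> str:
    # GTM API v2 addresses everything by a hierarchical resource path.
    return (f"{GTM_API}/accounts/{account_id}/containers/{container_id}"
            f"/workspaces/{workspace_id}/triggers")

def create_trigger(token: str, url: str, trigger: dict) -> dict:
    """POST a trigger body to the workspace's triggers collection."""
    req = urllib.request.Request(
        url,
        data=json.dumps(trigger).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # response echoes the trigger with its new ID
```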

The outbound project click trigger needed two conditions — external URL and on a project page:

```json
{
  "name": "Click — Outbound Project Links",
  "type": "linkClick",
  "autoEventFilter": [{
    "type": "contains",
    "negate": true,
    "parameter": [
      { "key": "arg0", "value": "{{Click URL}}" },
      { "key": "arg1", "value": "stephenhd.com" }
    ]
  }],
  "filter": [{
    "type": "matchRegex",
    "parameter": [
      { "key": "arg0", "value": "{{Page Path}}" },
      { "key": "arg1", "value": "/projects/.*" }
    ]
  }]
}
```

The article read trigger uses a timer — fires once after 60 seconds, but only on article pages:

```json
{
  "name": "Timer — Article Read 60s",
  "type": "timer",
  "interval": { "value": "60000" },
  "limit": { "value": "1" },
  "autoEventFilter": [{
    "type": "matchRegex",
    "parameter": [
      { "key": "arg0", "value": "{{Page Path}}" },
      { "key": "arg1", "value": "/articles/.+" }
    ]
  }]
}
```

API Quirks

The GTM API has some underdocumented behaviors:

No negative condition types. I initially tried "type": "doesNotContain" — the API returned a 400. The correct approach is "type": "contains" with "negate": true. The API reference doesn't list doesNotContain, doesNotEqual, etc. as valid condition types. You negate the positive ones.

autoEventFilter vs filter. For auto-event triggers (clicks, timers, forms), conditions on the event itself (like Click URL) go in autoEventFilter. Page-level conditions (like Page Path) go in filter. Get this wrong and the trigger either never fires or fires on every page.

Publishing requires a separate scope. Creating tags and triggers uses the tagmanager.edit.containers scope. But creating a version and publishing requires tagmanager.edit.containerversions and tagmanager.publish. I hit a 403 on the first publish attempt because I only had the edit scope.

No pageTitle built-in variable. I wanted to pass the page title as an event parameter for article_read. The enum of built-in variable types doesn't include it. The page path is sufficient — /articles/agentic-workflows-part-1 is descriptive enough.

Creating Tags

Each tag is a GA4 Event (gaawe type) that references a trigger and defines event parameters:

```json
{
  "name": "GA4 — contact_click",
  "type": "gaawe",
  "parameter": [
    { "key": "eventName", "value": "contact_click" },
    { "key": "measurementIdOverride", "value": "G-1JQYEMFCPY" },
    { "key": "eventParameters", "list": [
      { "map": [
        { "key": "name", "value": "link_url" },
        { "key": "value", "value": "{{Click URL}}" }
      ]},
      { "map": [
        { "key": "name", "value": "link_text" },
        { "key": "value", "value": "{{Click Text}}" }
      ]}
    ]}
  ],
  "firingTriggerId": ["16"]
}
```

Four triggers created, four tags wired to them, orphaned trigger deleted, workspace published — all from the terminal. Zero GTM UI interaction.

Verification

GTM Preview Mode confirmed everything fired correctly:

| Tag | Status |
|---|---|
| google-analytics (GA4 Config) | Fired 1 time |
| GA4 — contact_click | Fired 1 time |
| GA4 — resume_download | Fired 1 time |
| GA4 — outbound_project_click | Fired 1 time |
| GA4 — article_read | Not Fired (correct — 60s timer hadn't elapsed) |

Before and After

Before this session, GA4 tracked generic signals — page views, scroll depth, undifferentiated outbound clicks. You could see that someone clicked a link but not what kind of action it represented.

| Before (generic) | After (specific) | Why it matters |
|---|---|---|
| click on any outbound link | contact_click | Someone wanted to reach me, not just clicked a random link |
| file_download on any file | resume_download | Someone downloaded my CV specifically |
| click on any outbound link | outbound_project_click | Which projects drive people to check out the live work |
| scroll (90% depth) | article_read (60s on page) | Time-on-page beats scroll depth as a reading signal |

The three conversion events (contact_click, resume_download, outbound_project_click) can now be marked as key events in GA4. That turns "how many page views" into "what percentage of visitors take a meaningful action."

The Full Pipeline Vision

This session covered event tracking — one piece of a full analytics pipeline. Here's what the complete picture could look like:

| Layer | Status | Tool |
|---|---|---|
| GTM installation | Automated | @next/third-parties/google in code |
| Event strategy | Automated | Workflow-driven business analysis |
| GTM tag/trigger configuration | Automated (this session) | GTM API direct calls |
| Event verification | Manual | GTM Preview + GA4 DebugView |
| Mark key events | Manual | GA4 Admin UI |
| Query analytics data | Not started | Google's official GA4 MCP (read-only) |
| Dashboard creation | Manual | Looker Studio (no public API for creation) |
| GA4 admin operations | Not started | GA4 Admin API (same service account pattern) |

Two gaps stand out:

GA4 data querying. Google ships an official GA4 MCP server that's read-only — perfect for pulling reports and answering questions like "which traffic sources bring visitors who actually convert?" Adding this would let Claude Code read both GSC (search performance) and GA4 (on-site behavior) in the same session.

Dashboard automation. Looker Studio is the obvious visualization tool — free, connects to GA4 natively, auto-refreshing. But it has no public creation API. Google's Looker MCP can build dashboards programmatically, but that's the enterprise Looker platform (paid), not the free Looker Studio. For now, dashboards remain a manual one-time setup.

What I Learned

MCP vs. direct API is a real tradeoff. An MCP handles auth lifecycle, makes tools discoverable, and persists sessions. Direct API calls have no dependencies, no third parties, and full transparency. For one-time configuration, direct calls win. For ongoing iteration across sessions, an MCP starts paying off.

Business analysis should drive event selection. It's tempting to start adding events — track everything, figure it out later. The workflow's business analysis step forces the question "what decisions will this data inform?" first. Four targeted events beat twenty generic ones.

The GTM API is capable but rough. Creating tags and triggers programmatically works well once you know the quirks (negation via negate: true, scope separation for publishing, undocumented enum values). But the documentation assumes you already know what autoEventFilter vs filter means. Expect to learn by error message.

Third-party MCP servers deserve scrutiny. The Stape GTM MCP is technically fine, but routing your container configuration through a third party is unnecessary when the API is this straightforward. Always check: does the MCP proxy through someone else's infrastructure? Does it need to?

What's Next

  • Add GA4 Admin API access to mark key events programmatically
  • Integrate the official GA4 MCP for report querying
  • Add auto/manual mode tagging to workflow conventions — some steps are fully automatable, some require UI interaction, and making that explicit helps
  • Explore the self-hosted GTM MCP (Go + Docker) as a learning exercise in MCP server architecture

The agentic workflows pattern keeps extending: build a workflow, connect it to an API, automate what you can, document the gaps. Each session pushes the automation boundary a little further.


This article was written by Claude Code using /workflows/article-builder.md.
