An analytics-led studio for companies that want to make marketing decisions on evidence, not instinct.
01 / Diagnosis
What we keep finding
Most companies we work with describe the same situation in different words.
The dashboards exist. The data gets collected. The team works hard. Each channel looks reasonably healthy when checked on its own. But the question that actually matters — what is driving our growth? — has no clear answer. Decisions get made on instinct because the data underneath isn't trustworthy enough to act on.
A few years ago, this was solvable with better tools. It isn't anymore. Three things have shifted in ways most measurement systems weren't designed for.
Attribution is broken in ways the dashboards don't show. Privacy changes, ad platform restrictions, and the gap between server and client tracking mean that most reports are quietly missing or double-counting a meaningful share of what they claim to measure. Teams act on numbers that look precise and aren't.
Search has fragmented across surfaces that don't behave like Google did. AI Overviews, answer engines, and in-product assistants are answering questions before users ever reach a website. The traffic graph still moves up or down. But what shows up in analytics is increasingly decoupled from what's actually happening in the channels.
Content production has become close to free, which means content alone is no longer a moat. The companies that grow now are the ones with clear measurement, sharp distribution, and judgment about which AI workflows are useful and which are theater.
The dashboards still look the same. What they measure has quietly stopped working.
The studio exists for companies that have noticed some version of this and want to fix the underlying system, not just add another tool to it.
02 / Method
How the studio works
The studio works in three engagement shapes. Each starts with a clear scope, a fixed price, and senior delivery — no juniors, no retainers without a defined outcome.
-
Diagnostic
A focused audit of how your marketing measurement, search architecture, and AI-readiness actually work — and where the leaks are. Findings, prioritized recommendations, and a clear next step.
-
Build
Implementing what the Diagnostic surfaces. Measurement infrastructure, search architecture, content systems, AI workflows. Scoped after the Diagnostic, never before.
-
Advisory
Ongoing strategic counsel for founders and operators who need senior judgment available, not more headcount. Limited slots. By referral and selection.
03 / Selected work
Recent engagements
-
Recovering attribution after a stack migration
A B2B SaaS company had grown to $4M ARR on measurement that had quietly broken eighteen months earlier. The Diagnostic surfaced where reporting had drifted from reality. The rebuild took six weeks.
-
Why content was working but the dashboard said otherwise
Eighteen months of content was being measured by traffic, not by revenue contribution. Reframing measurement around revenue shifted roughly 40% of the budget allocation and clarified which formats were actually compounding.
-
Search architecture for the AI Overview era
A B2B services company watched organic clicks fall while brand mentions in AI-generated answers rose. Restructuring around answer-engine readiness recovered most of the practical value lost to the click decline.
04 / Notes
Recent thinking
-
The marketing measurement model most companies still operate on stopped working in 2023
A practitioner's account of where attribution is leaking and why most fixes don't address the underlying problem.
April 2026
-
What llms.txt is, and why we publish one
A short note on the emerging convention for AI-friendly sites, and what we put in ours.
March 2026
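For context on the note above: llms.txt, as proposed, is a plain markdown file served at a site's root that gives AI crawlers a concise, curated map of the site — an H1 title, a blockquote summary, and H2 sections listing links with short descriptions. A minimal sketch of what such a file might contain (the studio name and URLs here are placeholders, not our actual file):

```markdown
# Example Studio

> An analytics-led studio for companies that want to make
> marketing decisions on evidence, not instinct.

## Notes

- [Why most attribution dashboards are mathematically incoherent](https://example.com/notes/attribution.md): The hidden assumptions inside multi-touch attribution.

## Optional

- [Selected work](https://example.com/work.md): Summaries of recent engagements.
```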
-
Why most attribution dashboards are mathematically incoherent
The hidden assumptions inside multi-touch attribution, and what to use when the math underneath doesn't hold up.
February 2026