
Beyond the A/B Test: The Four Measurement Frameworks Every Modern Marketer Needs

Most brands trust platform numbers. Here’s why that’s a problem — and the four-stage framework to fix it.


Most marketers I speak to can tell you their click-through rate. Some can tell you their ROAS. But ask them whether their campaigns are truly driving incremental revenue, and the room goes quiet.


The truth is, after nearly two decades in digital advertising, I’ve seen the same pattern play out repeatedly: brands invest heavily in media, trust what the platforms tell them, and build strategies on measurement foundations that have fundamental cracks in them.

The solution isn’t to pick one measurement methodology and defend it. It’s to understand the hierarchy of four ways to measure impact, and to know when to use each.


Stage 1: A/B Testing — The Entry Point

A/B testing is where most marketers begin their measurement journey, and for good reason. It’s intuitive, relatively straightforward, and most major platforms have it baked into their toolsets. You create two variants — an ad, a landing page, a call-to-action. You then split your audience, and measure the difference in performance.


Within its boundaries, A/B testing is genuinely powerful. It gives you directional, data-backed answers to tactical questions: which creative performs better? Which audience segment responds more favourably? Which bidding strategy generates stronger returns?
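To make "directional, data-backed" concrete, here is a minimal sketch of the statistics behind a two-variant test: a two-proportion z-test on conversion rates. All the figures are hypothetical, and real platforms run more sophisticated versions of this under the hood.

```python
# Hypothetical A/B test: did variant B's conversion rate beat variant A's?
from math import sqrt
from statistics import NormalDist

conv_a, n_a = 120, 10_000   # conversions / visitors, variant A (illustrative)
conv_b, n_b = 150, 10_000   # conversions / visitors, variant B (illustrative)

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled rate under H0
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) # standard error
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))           # two-sided test

print(f"lift: {(p_b - p_a) / p_a:.1%}, z = {z:.2f}, p = {p_value:.3f}")
```

Note what this example quietly illustrates: a 25 per cent relative lift on these sample sizes still doesn't clear the conventional 5 per cent significance threshold. Sample size matters as much as the headline difference.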

But here’s the problem. When you run an A/B test inside Meta or Google, you’re operating entirely within that platform’s ecosystem. The “winner” is the winner according to that platform’s attribution model, which is inherently self-serving. Platform-reported conversions bundle in view-through attributions, assisted conversions, and last-click credits that may have nothing to do with your ad actually influencing a purchase decision.


A/B testing is an excellent tool for creative and copy optimisation. It is a poor tool for measuring true business impact. Think of it as your “micro testing optimisation” layer.


Stage 2: Incrementality Testing — Measuring What Actually Matters

If A/B testing asks “which variant is better?”, incrementality testing asks a fundamentally different question: “Would this person have converted anyway, without ever seeing my ad?”

This is the question that keeps CMOs up at night — and rightly so.


Incrementality testing typically works through holdout experiments. You take a proportion of your target audience and deliberately withhold the variable you want to test: your advertising, an incremental budget, or a new advertising platform. You then compare the conversion rate (or the uplift in other key metrics) of the exposed group against the unexposed holdout group. That difference is your true incremental impact.
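The arithmetic of a holdout comparison can be sketched in a few lines. Every figure below is illustrative, not from a real study:

```python
# Hypothetical holdout experiment: 90% of the audience sees the ads,
# 10% is deliberately held out. All numbers are synthetic.
exposed_users, exposed_convs = 90_000, 2_700   # saw the advertising
holdout_users, holdout_convs = 10_000, 100     # deliberately unexposed

rate_exposed = exposed_convs / exposed_users   # 3.0%
rate_holdout = holdout_convs / holdout_users   # 1.0% baseline

# Conversions the advertising actually caused, beyond the organic baseline
incremental = (rate_exposed - rate_holdout) * exposed_users
lift = (rate_exposed - rate_holdout) / rate_holdout

# Share of platform-attributed conversions that would have happened anyway
baseline_share = rate_holdout * exposed_users / exposed_convs

print(f"incremental conversions: {incremental:.0f}, "
      f"would have converted anyway: {baseline_share:.0%}")
```

In this invented scenario the platform would happily claim all 2,700 conversions, yet a third of them are baseline demand the ads never touched.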


What brands consistently discover when running rigorous incrementality studies is eye-opening.


Platform-reported ROAS numbers are inflated, often significantly. In campaigns I’ve worked across, we’ve seen incrementality rates that suggest 30 to 40 per cent of platform-attributed conversions were people who would have converted regardless of the ad spend.


That’s not a small rounding error. That’s wasted budget at scale.

Incrementality testing is the most honest measurement tool a brand can deploy at a channel level. It removes the flattery and tells you what’s real. The challenge is that it requires discipline. You have to willingly withhold advertising from a portion of your audience in the short term to understand your true efficiency over the long term. It also often requires large sample sizes and significant investment to reach the statistical confidence needed to trust the result.


Stage 3: Marketing Mix Modelling — The Strategic View

A/B testing and incrementality give you granular, channel-level insights. But they can’t tell you the full story of how all your marketing activity is working together across the entire business. That’s where Marketing Mix Modelling (MMM) comes in.


MMM is a statistical approach that uses aggregated data: your weekly or monthly spend across all channels, your sales figures, seasonal indices, competitor activity, and macroeconomic factors are modelled together to estimate the contribution of each element to overall revenue. It doesn’t rely on cookies or user-level tracking. It operates entirely at an aggregate level, which makes it both privacy-safe and uniquely suited to a world where signal loss is accelerating.
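At its simplest, the modelling step is a regression of sales on spend and external factors. The sketch below uses entirely synthetic weekly data and a plain least-squares fit; production MMMs add adstock, saturation curves, and Bayesian priors on top of this idea.

```python
# Minimal MMM-style sketch: weekly sales regressed on channel spend
# plus a crude seasonal index. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
weeks = 104
tv = rng.uniform(0, 100, weeks)      # weekly TV spend (£k), synthetic
search = rng.uniform(0, 50, weeks)   # weekly search spend (£k), synthetic
season = np.sin(2 * np.pi * np.arange(weeks) / 52)  # yearly seasonality

# "True" process the regression should recover: base + channels + season
sales = 200 + 1.5 * tv + 3.0 * search + 40 * season + rng.normal(0, 10, weeks)

X = np.column_stack([np.ones(weeks), tv, search, season])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
base, beta_tv, beta_search, beta_season = coef

print(f"estimated return per £ of TV: {beta_tv:.2f}, search: {beta_search:.2f}")
```

Because we generated the data ourselves, we can check the model recovers the planted coefficients: roughly £1.50 per £ of TV and £3.00 per £ of search, separated cleanly from the seasonal swing.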


Unlike A/B testing or incrementality studies, MMM shows you the bigger picture: diminishing returns on individual channels, the halo effects of brand advertising on performance campaigns, the true payback period of your investment, and how external factors outside your control are influencing your results.


The limitation of MMM is its feedback loop. Good models take weeks to build and require months of consistent data to be reliable. They are not tools for in-flight optimisation, which performance marketers often require. They are tools for strategic decision-making: annual budget allocation, channel mix shifts, and board-level investment justifications.


The resurgence of MMM over the past two years is no coincidence. As third-party cookies have eroded and platform attribution has become less reliable, the industry has returned to this more fundamental, privacy-first methodology. The brands that built MMM capability early are now operating with a significant competitive advantage.


Stage 4: Econometrics — The Foundation of It All

Econometrics rarely features in marketing conversations, which is a shame, because it underpins everything above. It is the foundation on which your entire measurement framework rests.

At its core, econometrics is the application of statistical and mathematical methods to understand causal relationships in economic data. MMM is applied econometrics. Good incrementality testing is built on econometric principles. Even a clean A/B test, at its most rigorous, is a basic econometric experiment.


Understanding econometrics helps you ask better questions of your data. It explains why correlation is not causation, why your attribution model is a proxy and not a truth, and why the assumptions baked into any statistical model matter enormously.


You don’t need to be an econometrician. But developing enough literacy in the principles (regression analysis, causal inference, controlling for confounding variables) will make you a significantly more critical consumer of the measurement outputs that your agencies, platforms, and data teams serve up.
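Confounding is the easiest of those principles to demonstrate. In the synthetic example below, underlying demand drives both ad spend (because budgets chase demand) and sales, so a naive regression badly overstates the ad effect; controlling for the confounder recovers the truth. The numbers and the scenario are invented purely for illustration.

```python
# Illustrating confounding with synthetic data: demand drives both
# ad spend and sales, so "sales vs spend" alone is misleading.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000
demand = rng.normal(0, 1, n)                # unobserved confounder
spend = 2 * demand + rng.normal(0, 1, n)    # budget chases demand
sales = 5 * demand + 1.0 * spend + rng.normal(0, 1, n)  # true ad effect = 1.0

# Naive: regress sales on spend alone (slope of the best-fit line)
naive = np.polyfit(spend, sales, 1)[0]

# Controlled: include the confounder in the design matrix
X = np.column_stack([np.ones(n), spend, demand])
controlled = np.linalg.lstsq(X, sales, rcond=None)[0][1]

print(f"naive estimate: {naive:.2f}, controlled estimate: {controlled:.2f}")
```

The naive slope lands near 3.0 against a true effect of 1.0, a threefold overstatement. This is exactly the trap that last-click attribution falls into: ads get shown to people who were already likely to buy.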


The Four Stages Working Together

These methodologies aren’t competitors. They are complementary, and the most sophisticated advertisers in the world use all four in combination.


Use A/B testing for tactical creative and audience optimisation. Deploy incrementality testing to validate the true efficiency of individual channels. Build an MMM capability to inform strategic budget allocation. Develop enough econometric literacy to interrogate what you’re being told.


Each methodology illuminates a different part of the truth. None of them, on their own, gives you the complete picture. The skill, and the competitive advantage, lies in knowing which tool to reach for, and what question each one is actually equipped to answer for your business.


The brands that will win the next decade of advertising are not necessarily those with the biggest budgets. They are the ones who understand their numbers better than everyone else in the room.



