Attribution, Incrementality, and Other Performance Myths

For years, digital marketing has measured performance using metrics that gave a false sense of control. Attribution models and ROAS figures were accepted without much scrutiny of their accuracy, even in large companies with surprisingly manual processes. But understanding incrementality—what we actually generate by making a change and wouldn’t have achieved by leaving things unchanged—is crucial to avoid confusing activity with real, measurable impact.

🕒 Reading time: 5 minutes

How to Measure Without Self-Deception

In marketing, we measure to understand, although too often we do it to confirm what we already want to believe.
The need to attribute results to specific channels or campaigns has turned measurement into an exercise in self-affirmation: if something works, we declare it “efficient”; if it doesn’t, we write it off. In the process, we lose sight of what really matters: true causality and the actual incrementality of our actions.

The Mirage of Perfect Attribution

For years, attribution models promised something that sounds wonderful: knowing exactly which channel “convinced” the customer.

“Last-click was the first culprit: assigning all the credit to the last touchpoint is as convenient as it is misleading.”

Then came time-decay, position-based, and algorithmic models. More sophisticated, yes, but equally illusory if we don’t understand that attribution doesn’t measure real impact—it only measures the observable path.
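
To make the contrast concrete, here is a minimal sketch of how three common rules would split credit for the same conversion path. Everything in it is illustrative: the path, the revenue figure, and the 40/20/40 position-based split are invented for the example, not drawn from any specific tool.

```python
# Illustrative only: one conversion path, three attribution rules.
# The path and revenue figure are invented for the example.
path = ["display", "social", "email", "paid_search"]
revenue = 100.0

# Last-click: all credit goes to the final touchpoint.
last_click = {ch: revenue if i == len(path) - 1 else 0.0
              for i, ch in enumerate(path)}

# Linear: equal credit to every touchpoint.
linear = {ch: revenue / len(path) for ch in path}

# Position-based (40/20/40-style): 40% to the first touch, 40% to the
# last, and the remaining 20% split across the middle touchpoints.
middle_share = revenue * 0.2 / max(len(path) - 2, 1)
position = {ch: middle_share for ch in path}
position[path[0]] = revenue * 0.4
position[path[-1]] = revenue * 0.4

for name, credit in (("last-click", last_click),
                     ("linear", linear),
                     ("position-based", position)):
    print(f"{name:>14}: " + ", ".join(f"{ch}={v:.0f}" for ch, v in credit.items()))
```

Same path, three confident-looking numbers, and none of them says anything about whether a given touchpoint actually caused the sale.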

The problem is that the visible path is not always the real one. Human decisions are diffuse, influenced by prior stimuli, accumulated perceptions, and contexts that are difficult to trace.

Trying to reduce all of this to a linear model of “channels that push” is a dangerous simplification.

Incrementality as an Uncomfortable Truth

Beyond attribution, there is incrementality: what would have happened if we hadn’t done anything?

The question is simple, but the consequences of its answer can be striking. It forces us to accept that part of our conversions would have occurred anyway. For many marketing teams—and agencies—that admission is almost heretical.

Measuring incremental impact involves comparing real and counterfactual scenarios—through A/B tests or econometric models—to understand which part of the result we actually generated and which part we merely supported. It’s less glamorous than boasting about ROAS, but far more honest.
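
As a minimal sketch of what reading such a comparison looks like (all numbers invented for illustration):

```python
# A holdout comparison, with invented numbers: the control group never
# sees the campaign, so its conversion rate approximates the
# counterfactual "what would have happened anyway".
exposed_users, exposed_conversions = 50_000, 1_500   # saw the campaign
control_users, control_conversions = 50_000, 1_200   # held out

rate_exposed = exposed_conversions / exposed_users   # 3.0%
rate_control = control_conversions / control_users   # 2.4%

# Incremental conversions: what the campaign added on top of baseline.
incremental_rate = rate_exposed - rate_control
incremental_conversions = incremental_rate * exposed_users

print(f"exposed rate: {rate_exposed:.2%}")
print(f"control rate: {rate_control:.2%}")
print(f"incremental:  {incremental_conversions:.0f} conversions "
      f"({incremental_rate / rate_exposed:.0%} of the exposed total)")
```

Here only about 20% of the exposed group’s conversions are incremental; the other 80% would likely have happened anyway. A real test would also check statistical significance before drawing conclusions.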

Performance Is Not Profitability

Another frequent mistake is equating performance with profitability.
A channel may appear highly “performant”—because it converts faster or cheaper—but still erode margin or cannibalize organic sales.
Misunderstood efficiency can distort decisions: we optimize what’s visible while ignoring what’s truly valuable.
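
A back-of-the-envelope sketch, with invented figures, shows how a flattering ROAS can coexist with value destruction once gross margin and cannibalization enter the picture:

```python
# Invented numbers to make the margin point concrete: a channel can
# report a healthy ROAS and still destroy value once gross margin and
# cannibalized organic sales are counted.
ad_spend = 10_000.0
attributed_revenue = 40_000.0          # ROAS = 4.0, looks great
roas = attributed_revenue / ad_spend

gross_margin = 0.25                    # product margin on revenue
cannibalized_share = 0.60              # sales that would have come organically

incremental_revenue = attributed_revenue * (1 - cannibalized_share)
incremental_profit = incremental_revenue * gross_margin - ad_spend

print(f"ROAS:               {roas:.1f}x")
print(f"incremental profit: {incremental_profit:,.0f}")  # -6,000: value destroyed
```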

“Sometimes the best performance comes from investing less in what already works by inertia and more in what opens new ground.”

But this requires a longer-term perspective focused on incremental value, not just operational metrics.

When Numbers Confirm Biases

The “clear and present danger” (to paraphrase Tom Clancy) isn’t measuring; it’s how we interpret what we measure.
When data is used to validate pre-made decisions, dashboards fill up with KPIs that comfort the team but reveal little about the real business.

A high CTR or low cost per lead isn’t a victory if we don’t understand what type of lead we’re generating, its impact on retention, or whether it contributes net margin. A data point without context is just a number.
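
A hypothetical comparison of two lead sources makes the point (all figures invented): the channel with the lower cost per lead loses once close rate and margin per customer enter the calculation.

```python
# Hypothetical lead economics: the "cheaper" channel wins on CPL
# but loses on net margin once lead quality is counted.
channels = {
    #            spend,  leads, close_rate, margin_per_customer
    "channel_a": (5_000,  500,   0.02,       600.0),
    "channel_b": (5_000,  200,   0.10,       600.0),
}

for name, (spend, leads, close_rate, margin) in channels.items():
    cpl = spend / leads
    customers = leads * close_rate
    net = customers * margin - spend
    print(f"{name}: CPL={cpl:.0f}, customers={customers:.0f}, net margin={net:,.0f}")
```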

The Challenge of Measuring Without Losing Perspective

Measurement is a matter of judgment. Attribution models help—and so do incremental tests—but none replaces critical thinking.
It’s not about finding the perfect model—there isn’t one—but about understanding what we want to learn and why.

And it’s worth remembering: many companies we’d call “large” would surprise you with how they handle certain tasks. From manually uploading spreadsheets to send an email, to working with measurement models riddled with deficiencies. Technological sophistication doesn’t guarantee analytical maturity or solve structural problems overnight.

The most important question isn’t “which channel converts the most?” but “which channel generates value that wouldn’t exist without it?”

Measuring Incrementality

Identifying incrementality—the portion of revenue we wouldn’t have gained if we’d left things unchanged—is one of modern marketing’s greatest challenges. To do it rigorously, it’s best to combine experimental and observational analysis.

  • Holdout tests are a starting point: creating control groups that don’t receive the action (e.g., a paid campaign or an email) and comparing results against exposed groups isolates the true effect (the exposed-versus-control comparison sketched earlier).
  • Additionally, econometric or marketing mix models (MMM) help estimate the marginal contribution of each channel in contexts where experimentation isn’t feasible, integrating historical data, seasonality, and external variables (see the first sketch after this list).
  • Finally, propensity or uplift models estimate the likelihood of conversion with and without a stimulus, offering a more granular view of impact (see the second sketch after this list).
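
As a rough illustration of the MMM idea (not a production model), the following sketch regresses synthetic weekly sales on channel spend and a seasonality term to recover each channel’s marginal effect. Real MMMs add adstock, saturation curves, and careful priors; none of that is modeled here.

```python
# A deliberately tiny MMM-style sketch on synthetic weekly data.
import numpy as np

rng = np.random.default_rng(0)
weeks = 104
tv = rng.uniform(0, 100, weeks)
search = rng.uniform(0, 50, weeks)
season = np.sin(2 * np.pi * np.arange(weeks) / 52)

# Ground truth used to generate the data (unknown in real life).
sales = 200 + 0.8 * tv + 1.5 * search + 30 * season + rng.normal(0, 10, weeks)

# Ordinary least squares: sales ~ intercept + tv + search + season.
X = np.column_stack([np.ones(weeks), tv, search, season])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"estimated marginal effects: tv={coef[1]:.2f}, search={coef[2]:.2f}")
```

And a minimal sketch of the two-model uplift approach, again on synthetic data: train one response model on treated users and one on held-out users, then score each user as the difference in predicted conversion probability. The features, effect sizes, and the choice of scikit-learn’s LogisticRegression are assumptions for illustration.

```python
# Two-model uplift sketch on synthetic users.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 10_000
X = rng.normal(size=(n, 2))
treated = rng.integers(0, 2, size=n).astype(bool)

# Conversion depends on the features, plus a treatment effect that
# only helps some users (heterogeneous uplift).
base = 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.3 * X[:, 1] - 1.0)))
lift = 0.05 * (X[:, 0] > 0)
y = rng.random(n) < np.clip(base + treated * lift, 0, 1)

# One response model per group.
model_t = LogisticRegression().fit(X[treated], y[treated])
model_c = LogisticRegression().fit(X[~treated], y[~treated])

# Estimated uplift per user: P(convert | treated) - P(convert | control).
uplift = model_t.predict_proba(X)[:, 1] - model_c.predict_proba(X)[:, 1]
print(f"mean estimated uplift: {uplift.mean():.3f}")
print(f"top-decile uplift:     {np.sort(uplift)[-n // 10:].mean():.3f}")
```

In both cases the logic is the same: the estimate of interest is a difference against a counterfactual, not a raw conversion count.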

Together, these methodologies don’t aim to show that everything works—they reveal what truly moves the needle. Precision is never guaranteed, but the approach reduces guesswork.

In Summary

For years, we accepted performance metrics—especially ROAS—as absolute truths. Over time, we’ve seen that many of those results were, at a minimum, imprecise and should have been questioned.

Measuring return is not the same as understanding impact, and many decisions were made based on data we would now reinterpret with more skepticism.

Attribution and contribution have long been conflated. Meanwhile, incrementality was always there, that elusive “something” we sensed but rarely measured. As in medicine, the first step is always to diagnose and acknowledge the condition, not to ignore it.

About the author

Oriol Guitart is a seasoned Business Advisor, Digital Business & Marketing Strategist, In-company Trainer, and Director of the Master in Digital Marketing & Innovation at IL3-Universitat de Barcelona.
