
Words of Wisdom: The Future of Marketing Measurement

October 30, 2023

Take it from the experts who have lived and operated through multiple marketing evolutions. The erosion of data quality today is jeopardizing the accuracy of your entire MarTech stack, including the measurement models underpinning it.

Chloe from RescueMetrics sat down with the data jedis, Phil Zohrab and Alex Langshur, to riff about the future of marketing measurement. The unique insights shared in this call were too good to keep to ourselves.

During the call they reflect on measurement models of the past and the limitations of MMM (Marketing Mix Models). Fast-forwarding to digital attribution’s single-touch and multi-touch models, the guys turn to the state of signal loss today.

Combine cookie deprecation with the rapid uptake of adblockers and aggressive new browsers, and you get the perfect storm for discrepant analytics. With consented data facing the same erosion, what does this mean for the models and algorithms that digital marketers trust with their budgets? Put frankly, it’s bleak. But Phil and Alex were brimming with optimism about the new evolutions underway, spurred by generative AI and consented data protection.

Introducing the guys

It’s a small world out there and the marketing community is no exception. Alex and Phil sport different accents but were once simultaneously active in the Dentsu ecosystem. 

Phil Zohrab is the Managing Director of Ground Control Data, which specializes in technology to help advertisers better understand, predict and optimize advertising outcomes.

Alex Langshur is the founder and former Co-CEO of Cardinal Path, a Merkle company, and once led Dentsu's global Google Practice. Alex now leads the North American sales division for RescueMetrics.

Measurement models - a flawed system

Alex captures the genesis of measurement models’ undoing perfectly: we have been blessed with an explosion of new media opportunities, meaning that the data sources and channels available for allocating media budget have become incredibly broad.

We once relied on MMM and econometrics for budget allocation decisions, until the models could no longer reflect the degree of variation possible in a customer journey.

Multi-touch attribution stepped up to give marketers visibility over the impact of various touchpoints in the journey. However, as Phil points out, this model fails to consider offline and brand marketing impacts. Without these elements, models end up with misleading, over-inflated reporting on direct traffic, which can make for biased predictive insights.

What’s more, multi-touch models are notoriously reliant on cookie data to map and analyze the customer journey. Shortening cookie windows now limit how many touchpoints the models can attribute to one user. The result is returning visitors being reported as new users, breaking attribution and journey insights. Algorithms starved of robust, comprehensive data fail to credit the impact that higher-funnel activity has on driving conversions.
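To make the cookie-window problem concrete, here is a minimal sketch in Python. The seven-day window and the journey itself are invented purely for illustration (browser tracking-prevention features cap cookie lifetimes around that mark), but the mechanic is the one described above: once the cookie expires, the same person comes back as a brand-new ‘user’.

    from datetime import datetime, timedelta

    COOKIE_WINDOW = timedelta(days=7)  # illustrative cap on cookie lifetime

    # One real person's touchpoints over three weeks
    journey = [
        (datetime(2023, 10, 1), "paid_social"),      # upper-funnel first touch
        (datetime(2023, 10, 3), "organic_search"),
        (datetime(2023, 10, 20), "direct"),          # returns after cookie expiry
        (datetime(2023, 10, 21), "direct/purchase"),
    ]

    def stitch(events, window):
        """Group events the way a cookie-based tracker would: mint a new
        anonymous ID whenever the previous cookie has expired."""
        users, current = [], []
        for ts, channel in events:
            if current and ts - current[-1][0] > window:
                users.append(current)  # cookie expired -> journey fragments
                current = []
            current.append((ts, channel))
        users.append(current)
        return users

    fragments = stitch(journey, COOKIE_WINDOW)
    print(f"{len(fragments)} 'users' reported (really 1 person)")
    # The purchase lands in a fragment whose only touches are 'direct',
    # so paid_social gets zero credit and direct traffic is inflated.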

Here’s where Alex completes the picture on how today’s models and other algorithms are causing media budget wastage and neglecting key audience profiles during targeting and optimization.

The problem is signal loss. Attribution models and the broader AdTech stack need a quality data signal in order to deliver value. If the data set is skewed in any way, then over-indexing is a given.

The guys reflect on Google’s refactor of Universal Analytics (UA) into GA4 as evidence of the scale of impact that quality signal mass has on effective modeling. The GA4 migration pains have been felt globally, yet Google still risked industry uproar in the pursuit of better conversion modeling.

So why is quality data so elusive and why are the trusted models of today at risk?

Alex points to popular new browsers like Brave, along with VPNs and ad-blocking software, as primary sources of signal loss. These third-party players block AdTech pixels and tracking tags from firing, even within the browser of a consented user. The widespread use of this software, along with iOS interventions, is disrupting over 50% of consented data in some cases.

Not only is this disruption eroding the signal mass serving measurement models, it is also breeding bias. Research finds that specific audience profiles are partial to the software causing signal loss: young, male, tech-savvy and affluent users are more likely to use adblockers, iOS devices, VPNs and privacy-centric browsers. This is why algorithms are under-indexing this audience.

Audiences impacted may have consented to tracking onsite, or be active in walled gardens like Facebook and Google, but their data is still getting blocked. Alex explains how this audience is being excluded from A/B tests, and how algorithms are being skewed away from optimizing toward these high-value segments.

Alex gets statistical with us and explains the concept of a bi-modal data set. Because Android and iOS user data is managed very differently, regardless of user consent preferences, measurement models are being fed a bi-modal data set. Research tells us that young, affluent users with higher levels of education will be over-represented in the iOS data distribution.

The danger here, he explains, is that the average generated from the combined Android-iOS data pool will fall in the trough between the two modes. And if your AdTech is optimizing for a trough, then it’s targeting the profiles least likely to convert and increase ROI.
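Alex’s trough argument is easy to reproduce. The sketch below (with entirely hypothetical numbers) pools two segments whose ‘customer value’ scores cluster around different modes, then checks where the pooled average lands: squarely between the humps, where almost no real users sit.

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical value scores for two differently-tracked segments
    android = rng.normal(loc=30, scale=5, size=5000)
    ios = rng.normal(loc=70, scale=5, size=5000)
    pool = np.concatenate([android, ios])

    mean = pool.mean()  # ~50: between the modes at 30 and 70
    counts, edges = np.histogram(pool, bins=40)
    users_near_mean = counts[np.searchsorted(edges, mean) - 1]

    print(f"pooled mean = {mean:.1f}")
    print(f"users in the mean's histogram bin: {users_near_mean} of {len(pool)}")
    # Optimizing toward the 'average' user targets a profile that
    # barely exists in either segment -- the trough, not a mode.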

Regardless of the model used, a meagre data input lacking representation of audiences inclined toward ad-blocking extensions will result in models amplifying bias.

The Future of Measurement

This is the part of the chat where Phil really gets excited, because the conversation turns to generative marketing attribution as the future of measurement.

Generative marketing attribution leverages people-based data and demographics, working back from captured conversions. The engine draws on consumer decision theory, factoring in the predisposition and propensity for conversion based on demographics. Multiple models drive the data analysis, delivering reports and predictive insights.

Phil speaks to one of the greatest strengths of generative attribution: it doesn’t rely on historical data. As he points out, the vast variability in the people-based data feeding the algorithm means that we don’t need to go back in time to get signal mass.

COVID-19 showed us how quickly online and offline consumer behavior can change. Suddenly the habits of our past are not reflective of today’s customer journeys. Phil coined this the ‘Covid taint’, relaying the frustration of marketers that today’s AdTech is optimizing on data representing customer journeys that are far less prevalent now. Generative attribution solves this problem by omitting redundant data of the past.

This future of measurement is holistic and omni-channel. It boasts intra-campaign optimization and the ability to test and learn, live. Plus, with only two weeks required to adopt the tool, no wonder Phil is excited.

But as Alex points out, even the smartest measurement model cannot deliver accurate insights and optimization without a quality input signal, free of bias.

The guys agree on another point: marketers should be auditing their tags as part of good data hygiene practices.

Final words of wisdom if you are spending on media

From Alex: In a budget tightening climate, it’s crucial to be curious about your data quality.

From Phil: We are seeing increased industry investment in CX, UX and A/B testing tools, but marketers need to be aware that signal loss is skewing data in these tools. An estimated 1 in 5 users will be excluded from tests because of signal loss.

“You are leaving money on the table” by allowing this audience to be excluded, says Phil.

Alex agrees, highlighting Boston Consulting Group research finding that companies investing in these tools see better returns, because accurate targeting and optimization pay off.
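To put Phil’s 1-in-5 figure into back-of-the-envelope numbers: the conversion rates in the sketch below are purely hypothetical (only the 20% exclusion rate comes from his estimate), but they show how a test that never sees the blocked, higher-converting segment understates reality.

    # Only the 20% exclusion rate comes from Phil's estimate;
    # the conversion rates are hypothetical, for illustration.
    visible_share, blocked_share = 0.80, 0.20
    visible_cr, blocked_cr = 0.020, 0.050  # blocked segment skews high-value

    true_cr = visible_share * visible_cr + blocked_share * blocked_cr
    measured_cr = visible_cr  # the test only ever sees visible users

    print(f"true conversion rate:     {true_cr:.1%}")    # 2.6%
    print(f"measured conversion rate: {measured_cr:.1%}")  # 2.0%
    # Winners are picked on the measured rate, so the variant is tuned
    # to the visible 80% and the highest-value fifth never shapes it.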
