The Importance of Data-Driven Creative

Nathan Hugenberger
Published in Known.is
Apr 6, 2022

Getting the creative right is the most important driver of advertising effectiveness.

[Photo: the word “data” on a screen reflecting a house. Claudio Schwartz/Unsplash]

At Known, we’re dedicated to science and art working together to solve any marketing challenge. While we are obviously big fans of data-driven approaches to media tactics like advanced targeting and micro-segmentation, reach optimization, sequencing, and bidding optimization, we ultimately know that creative is king, and it should be optimized at least as carefully as media typically is.

In the coming weeks, we’ll share insights on the best ways to optimize creative, from the role of survey-based testing to multi-armed bandit optimization. But first, let’s examine why King Creative needs data to rule the roost.

[Infographic: percent sales contribution by advertising element]

Creative accounts for at least half of the variation in campaign effectiveness.

Scientific research consistently shows that getting the creative right is the single most important driver of advertising effectiveness and efficiency. Multiple studies, including an excellent one from Nielsen, have found that differences in creative efficacy account for half of the variation in effectiveness, more than all other media planning factors put together. Excellent research published over the past few years consistently lands on this point.

More recently, marketing analytics firm DJV Insights found that less likable or less distinct TV creative required about 4 times as many gross rating points (GRPs) to achieve the same effectiveness.

What does that mean? It means that an advertiser’s top priority should be ensuring they use great creative: finding the right mix of imagery, story, tone, message, and CTA for the audience.

The difference between great and mediocre creative is huge.

Not only is creative efficacy or quality the most important part of the ad effectiveness equation, it turns out that the swing between bad and good is huge. Many data points show that great creative is many times better than mediocre creative. A study from OVK found that “high-quality creative increases ad viewing time six times (5.8x) and nearly doubles purchase intent (+93%) vs. low-quality creative.” This is not just true for the digital ads OVK studied; Persado found the same pattern in email CRM and other text-heavy campaigns. Their 2019 whitepaper shows a 400% difference in effectiveness between the best and worst copy.¹ It stands to reason that the gap would be especially pronounced for new campaigns or upstart brands, where there is little historical data on advertising effectiveness to guide decision making.

At Known, we’ve seen in testing that the best performing creative assets can score up to six times higher than the worst performing assets. This graphic shows the relative difference between the best performing and worst performing static display and social ads across several recent campaigns.

[Infographic: bar graph of relative KPI performance, best-performing vs. worst-performing creative]

The huge deltas make sense when you dig deeper. We found that changing a single word in a recent campaign led to a 21% reduction in the client’s cost per lead. If changing a single word can do that, imagine what changing all the words and all the imagery could do.

Don’t trust human judgment. You need data to pick good creative.

Unfortunately, academic research also shows that humans are not very good at judging the best creative without good data and testing, and that includes marketers, experts, and agencies. A 2016 Ehrenberg-Bass Institute study found that marketers could not reliably predict which television ads would be more or less effective at driving sales; in fact, they chose no better than random guessing. The lesson is clear: don’t trust your intuition when deciding which ads to put your media dollars behind.

Even supposed experts are poor judges of creative. A 2019 report by Peter Field for the Institute of Practitioners in Advertising (IPA)² found that award-winning campaigns are no more effective than other campaigns, the endpoint of a decline in the awards’ effectiveness advantage that has been underway for decades. This trend is especially concerning because it suggests that what the creative industry most values, namely what wins awards, is not what drives the most value. Brands and marketers therefore need testing and data-driven processes in place to counteract the implicit incentives in the ecosystem.

We can illustrate this point with a few further examples.

Sometimes our intuition gets the direction wrong — we are surprised to find out that what worked better is the opposite of what we expected. In a recent Known campaign, we found that black-and-white images outperformed their color counterparts by 23%. This surprising result — which may have been very specific to the particulars of that brand and that campaign — would not have been discovered without careful testing, an evidence-based culture, and a tight collaboration between the media, data, and creative teams.

Sometimes our intuition gets the magnitude wrong — we are surprised at how big the difference is. Look at the two ads below. At a high level, they are very similar. They use the same color, the same layout, and the same branding. The only differences are a handful of words and the person shown. You might suspect there will be a performance difference between the two, but can you guess which is better? And by how much?

[Image: two display ads, one featuring a man and one featuring a woman]

Would it surprise you to learn that one of these ads performed twice as well as the other? Twice! That is like doubling your media budget. Without careful and comprehensive testing — and the creation of assets to test — that opportunity would be missed.
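
How do you know a gap like that is real and not noise? One standard approach, sketched below in Python, is a two-proportion z-test on each variant’s conversion counts. To be clear, this is a minimal illustration, not Known’s actual methodology, and the conversion counts are hypothetical.

```python
# A minimal two-proportion z-test for comparing two ad variants.
# The conversion counts below are hypothetical, not campaign data.
from scipy.stats import norm

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Return the z-statistic and two-sided p-value for H0: rate_a == rate_b."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    return z, 2 * norm.sf(abs(z))

# Hypothetical: variant A converts at twice the rate of variant B.
z, p = two_proportion_ztest(conv_a=120, n_a=10_000, conv_b=60, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.2g}")  # a tiny p-value says the 2x gap is unlikely to be noise
```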

Skipping testing is dangerous, but purely robotic testing misses opportunities.

Given the evidence, it’s clear that every brand should test its creative assets, either before putting them in market or live, using a real-world, test-and-learn experimental approach.

But if you are thinking “we already do the testing we need; we do dynamic creative optimization (DCO),” think again. DCO can solve many of these creative feedback problems, but it can be slow because it tests too many permutations early on, which extends the time it takes to find the right direction for further optimization and further asset/element ideation.
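
To see why the number of permutations matters, consider a back-of-the-envelope calculation. The element counts and per-variant impression floor below are illustrative assumptions, not figures from any Known campaign:

```python
# Hypothetical full-factorial DCO math: the variant count grows
# multiplicatively with every creative element added to the rotation.
headlines, images, ctas, colors = 5, 4, 3, 2
variants = headlines * images * ctas * colors        # 120 permutations

impressions_per_variant = 5_000                      # rough floor to detect a modest lift
total = variants * impressions_per_variant
print(f"{variants} variants x {impressions_per_variant:,} impressions each "
      f"= {total:,} impressions before every variant has a readable signal")
```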

You also don’t get information on which assets to start with or how to start small, a gap that hits new campaigns and brands hardest. In addition, you can’t merely leave things to the robots, AI, and machine learning: performing DCO really well requires plenty of human oversight and strategic thinking.

Instead of conducting a random test without any hypotheses, we think it is critical to structure well-defined hypotheses and statistically valid tests to measure the effectiveness of creative assets. By having a team of data scientists design and drive these experiments, we allow art and science to work together to produce top-performing campaigns that yield deep insights. Bringing the best of data and creative together is much smarter than going with your gut. The key is to find the optimal path quickly, and to ensure you focus further testing and creative budgets on the most promising directions.
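
As a concrete sketch of “finding the optimal path quickly,” the multi-armed bandit approach mentioned earlier shifts impressions toward stronger variants as evidence accumulates, rather than splitting traffic evenly for the whole test. Below is a minimal Thompson-sampling simulation; the “true” click-through rates are invented solely to drive the simulation:

```python
# A minimal Thompson-sampling bandit: serve each impression to the
# variant most likely to be best, given the clicks observed so far.
import random

true_ctrs = [0.010, 0.013, 0.020]           # unknown in practice; used here to simulate clicks
wins = [1, 1, 1]                            # Beta(1, 1) prior per variant
losses = [1, 1, 1]

for _ in range(20_000):                     # one impression per iteration
    # Draw a plausible CTR for each variant from its posterior,
    # then serve the variant with the highest draw.
    samples = [random.betavariate(w, l) for w, l in zip(wins, losses)]
    arm = samples.index(max(samples))
    if random.random() < true_ctrs[arm]:    # simulated click
        wins[arm] += 1
    else:
        losses[arm] += 1

served = [w + l - 2 for w, l in zip(wins, losses)]
print("impressions per variant:", served)   # traffic concentrates on the strongest variant
```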

Read more from Known on media buying.

Want to learn about our testing capabilities? Reach out.
