With message testing, popularity does not equal persuasiveness

By David Shor

Civis Analytics
The Civis Journal
Jun 13, 2018


Following the news sometimes feels like a full-time job. There’s always something new breaking, an election happening, or a tweet making headlines. It is harder than ever for campaigns to know which messages work and which don’t (or worse, which backfire).

You would think that, as a campaign consultant, I would know roughly which messages will work best in any given race. But neither I nor the campaign team knows for sure. That’s because the way campaigns have historically tested messages measures popularity, not persuasiveness. It’s easy to make an ad that’s well-liked, but just because something is well-liked doesn’t mean it changes people’s minds.

A client recently wanted to understand which of a candidate’s accomplishments would most improve voters’ opinion of the candidate. We tested 20+ different accomplishments, asking voters whether they approved of each one. The results made sense: the most popular accomplishments were in line with what we expected. At the same time, we also measured how much learning about each accomplishment changed voters’ support for the candidate. Strikingly, we found zero correlation between what people said they approved of and what actually persuaded them to support the candidate.
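To make the distinction concrete, here is a minimal sketch of that comparison in Python. The message names, approval rates, and persuasion lifts are invented for illustration, not actual results from the study; the point is only that stated approval and measured persuasion are two separate columns that can be compared directly.

```python
# Hypothetical illustration: comparing stated approval with measured persuasion
# lift for a set of tested messages. All numbers here are made up.
import numpy as np

# One entry per tested accomplishment/message (illustrative names only).
messages = ["infrastructure", "education", "healthcare", "jobs"]
approval_rate = np.array([0.72, 0.55, 0.68, 0.61])   # share saying they approve
persuasion_lift = np.array([0.2, 2.1, -0.4, 1.3])    # pct-point change in support
                                                      # (treated minus control)

# Pearson correlation between what people say they like and what moves them.
r = np.corrcoef(approval_rate, persuasion_lift)[0, 1]
print(f"correlation between approval and persuasion lift: {r:.2f}")
```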

Relying exclusively on focus groups and messaging pollsters is no longer enough. In those settings, respondents will tell you what’s popular, but that doesn’t mean it will be persuasive. The only way to find out whether something is persuasive is to show people the ad in a randomized controlled trial and then measure the change in support afterward.
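In outline, the trial logic is simple: randomly assign respondents to see the ad or not, then compare support between the two groups. The sketch below uses simulated data (the sample size, baseline support, and effect size are all assumptions made up for the example) purely to illustrate how the persuasion estimate falls out of that comparison.

```python
# A minimal sketch of the randomized-controlled-trial logic described above,
# using simulated respondents. All parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

n = 2000
saw_ad = rng.integers(0, 2, size=n).astype(bool)   # random assignment to treatment
baseline_support = 0.48                             # hypothetical control-group support
true_effect = 0.03                                  # hypothetical "true" lift from the ad

# Simulated outcome: 1 if the respondent supports the candidate after exposure.
supports = rng.random(n) < (baseline_support + true_effect * saw_ad)

treated, control = supports[saw_ad], supports[~saw_ad]
lift = treated.mean() - control.mean()
se = np.sqrt(treated.var(ddof=1) / len(treated) + control.var(ddof=1) / len(control))
print(f"estimated persuasion effect: {lift:+.3f} (s.e. {se:.3f})")
```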

We saw this first-hand when working on the Doug Jones campaign in Alabama last December. His opponent, Roy Moore, was mired in controversy and scandal, so it seemed intuitive to focus Jones’s ads on Moore’s reported misconduct. But when we tested the messaging, we found that ads emphasizing Jones’s ability to improve education in the state were in fact the most persuasive. While people might have responded favorably to ads focused on the Moore scandal, the media had publicized it enough that additional messaging didn’t actually change minds. Interestingly, we also found that messages aligning Moore with Trump didn’t persuade voters to support Jones; instead, they discouraged certain groups from voting at all.

Getting it right matters, and it can’t be done on intuition alone. In a meta-analysis of 100+ general election ads we’ve tested over the last four years, we found that the top 10% of ads were, on average, more than five times as persuasive as a typical ad. We also found that about one-third of the ads tested actually caused a backlash, reducing support in key subgroups. With the right testing, campaigns can optimize their budgets; without it, they might be paying for ads that actually hurt their cause.

[Image: Creative Focus demo results for Parks and Recreation’s Leslie Knope]

If you’re interested in the technology we used for the Jones campaign, drop us a line here: https://www.civisanalytics.com/creative-focus/
