Can Instagram ads fight disinformation about climate and COVID vaccines?

RealityTeam
11 min read · Mar 21, 2022


Measuring the impact of ads on knowledge and opinions.

In brief: Reality Team runs ads on Instagram designed to limit the influence of disinformation. We developed a method for running randomized controlled trials to test the impact on knowledge and opinions about climate and COVID vaccines. We saw substantial increases in knowledge 24–72 hours after a single viewing of a 10-second video ad, and shifts in opinions 7–18 days later, within a specific audience of Passive Information Consumers.

Figure 1: The first 5 seconds of the “treatment” video.

Disinformation is a serious problem and the solutions aren’t here yet.

Disinformation and disinformers are actively confusing and deluding millions of Americans and far more around the globe. The overwhelming flood of lies and distortions leaves people uncertain and anxious. Anxious people are easily manipulated. Their anxiety degrades trust in experts and institutions. Even if they don’t believe any specific lie, they live in a fog of uncertainty.

Alarmingly, research shows that a person who believes one of these lies is far more likely to believe others. They become prey for extremists. They act against their own best interests — or fail to act at all.

The entities behind the thickening fog of disinformation promote rancor and lies, exaggerating and distorting societal ills while making it nearly impossible to solve them. Their actions threaten to dismantle democracy by discouraging participation, degrading governance and debate, and leveraging that erosion to usurp control of democratic institutions.

This is a ubiquitous, interconnected issue that needs holistic solutions. Effective counter-influence strategies are needed along with new regulations and anti-disinformation technology.

Reality Team came together to see what we could do right now, with the tools we can get our hands on today. Is it possible, we asked, to make it harder for disinformation to loosen people from reality and pull them toward false ideologies?

Figure 2: The second 5 seconds of the “treatment” video.

What can we do right now?

The goal is simple: make it harder for “disinfluencers” to frighten, confuse, and delude people. That required an approach that could be put into play quickly. It had to be effective, and it couldn’t depend on help from the platforms, exotic technology, celebrity, or large sums of cash. It had to be ethical and legal.

As digital marketers, we turned to digital marketing tools. We knew this approach was unlikely to de-radicalize those already lost to delusional or extreme ideologies. People well anchored in reality didn’t need us. So we focused on the group we thought was both vulnerable and reachable. Some call them the “informationally adrift.” We call them Passive Information Consumers (PICs). They aren’t the only vulnerable group, but they turned out to be easy to identify, and they immediately showed an appetite for our content. So we pushed ahead. We learned what we could about them from research, surveys, community leaders, observation, and conversation. It turns out most of us know some PICs.

We know they see almost no news, but a lot of disinformation, in the form of memes, opinions, TikTok videos and other content that requires little attention or meaningful thought. We needed to compete for that attention with similarly frictionless, but credible, content. They aren’t going to read 1200 (or 500) words on anything.

Our strategy is pretty simple. We use targeted ads to inject truth-bombs and inoculation messages into social media feeds. This gives reality a chance to compete against disinformation for mindshare.

In other words, we make sure people who rarely see credible information, see some.

This accomplishes three things:

  1. At a minimum, these campaigns give people the opportunity to consider info that contradicts disinformation.
  2. They address the underlying fears that disinfo exploits and build resiliency; less fearful people are harder to manipulate.
  3. They displace disinformation; we’re literally buying back territory. If someone is looking at our post, then at that moment they’re not looking at disinformation.

This approach can’t solve the disinfo problem, but it can slow down its influence — in proportion to the intensity of the effort.

Our campaigns ensure that non-news consumers are exposed to key facts, answers to anxiety-producing questions (Is climate change hopeless? Are vaccines dangerous?), and tips for identifying manipulative content. We strive to build assets that are simple and visual, and that respect our audience’s concerns and intelligence rather than patronizing them. We want to make people feel good, not bad. Smart, not dumb. Confident, not vulnerable. We include our carefully curated references in each post.

Does it work?

High levels of engagement — but we needed more info.

These campaigns began running in September 2020. The click-through rates were fantastic: averaged across all campaigns over the last 20 months, the click-through rate was 5.7% at a cost per click of $0.18. It seemed that, if nothing else, the ads were pretty good at getting attention. But we weren’t sure whether we were actually having a meaningful impact on viewers’ knowledge or opinions. We spent a year trying to find a robust way to measure what impact, if any, these campaigns had on hearts and minds. We didn’t have the resources to contract with a public polling organization, nor the time, money, or expertise required to study voter-turnout effects. Lab simulations were interesting, but it was hard to recruit our quirky audience onto panels, and it wasn’t clear whether an effect measured in a lab would resemble what happens in real life.
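As a quick sanity check on what those two averages imply, the arithmetic below derives the effective cost per impression and CPM. The derived figures are our own back-of-the-envelope numbers, not reported campaign metrics:

```python
# Back-of-the-envelope arithmetic from the two reported averages.
ctr = 0.057   # 5.7% average click-through rate (reported)
cpc = 0.18    # $0.18 average cost per click (reported)

cost_per_impression = ctr * cpc   # dollars/impression = clicks/impression * dollars/click
cpm = cost_per_impression * 1000  # cost per 1,000 impressions

print(f"Cost per impression: ${cost_per_impression:.4f}")  # ~$0.0103
print(f"Effective CPM: ${cpm:.2f}")                        # ~$10.26
```

In other words, at these rates a modest budget buys a very large number of feed placements.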

But does it impact people’s thinking?

We consulted with academics, NGOs, and anyone willing to share expertise, data, or a good idea. We needed something simple, cheap, and reliable. It had to stand up to scrutiny. Ultimately, the digital team at the Yale Center for Climate Communication and Dr. John Cook, a prolific researcher on countering climate disinformation at Monash University’s Climate Change Communication Research Hub, pointed out hooks and approaches that allowed us to devise a new method to measure impact.

Randomized Controlled Trials.

The Instagram Video-Poll (IVP) method is a reasonably robust approximation of a randomized controlled trial that can be conducted in the field. It uses a platform where our audience spends a lot of time, and it’s fast and relatively cheap.

While it took a year to figure out the methodology, it took only three weeks to run the first set of experiments. We leveraged research showing that perceived scientific consensus on climate change is a “gateway” belief: people who accept this one fact are much more likely to accept other facts about the topic. We used a quiz format, which research shows earns slightly more attention than simple statements.

Figure 3: Single-question Instagram poll.

We targeted the control group with a one-question poll ad built with Instagram’s interactive poll feature. We targeted the treatment group with a 10-second video ad showing just two images: the climate-consensus quiz and its answer. People who watched that video were then targeted with the same poll ad, which they saw 24–72 hours after seeing the video. Each control and treatment group included roughly 200 people per trial of the experiment.

The poll widget let us maximize engagement, including from people reluctant to leave their Insta experience to take a survey. The poll asks a True/False question about the single fact presented in the video.
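For intuition about what groups of roughly 200 can resolve, here is a minimal power-calculation sketch in Python. The 50% control baseline and the 14-point candidate effect are illustrative assumptions on our part, not reported data:

```python
# Minimal power sketch for a two-proportion comparison with ~200 per arm.
# Assumptions (ours): alpha = 0.05, control correct-rate = 50%,
# candidate treatment correct-rate = 64% (a 14-point gain).
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

n_per_arm = 200
h = proportion_effectsize(0.64, 0.50)  # Cohen's h for 64% vs 50%

power = NormalIndPower().power(effect_size=h, nobs1=n_per_arm,
                               alpha=0.05, ratio=1.0)
print(f"Power to detect a 14-point gain: {power:.2f}")  # ~0.81
```

Effects in the 20-to-40-point range, like those reported below, are comfortably within reach of samples this size.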

Yes, it does.

Bingo. The video watchers answered correctly at a significantly higher rate than the control group. In our Michigan trial, watchers’ knowledge improved by 22 points. In Pennsylvania there was a 43(!) point gain. But did that change their belief in climate change? Tricky. We decided to ask a follow-up question to gauge their level of support for the general idea of “government investment in climate solutions.” Video watchers saw this poll 1–2 weeks after they viewed the video. We saw a notable increase of 17 points. We’ve since repeated the experiment with vaccine-safety messages and saw similarly dramatic results.
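Gains of this size at roughly 200 respondents per arm sit far outside sampling noise. Below is a sketch of that check for the Pennsylvania-style result; the 40% control baseline and the exact group sizes are our illustrative assumptions, since only the point gains are reported:

```python
# Rough significance check of a 43-point gain with ~200 per arm.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

n_treated, n_control = 200, 200
p_control = 0.40               # assumed control correct-answer rate (ours)
p_treated = p_control + 0.43   # reported +43-point gain

counts = np.array([round(p_treated * n_treated),   # correct answers, treatment
                   round(p_control * n_control)])  # correct answers, control
nobs = np.array([n_treated, n_control])

z_stat, p_value = proportions_ztest(counts, nobs)
print(f"z = {z_stat:.1f}, p = {p_value:.1e}")  # z ~ 8.8, p far below 0.001
```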

Figure 4: Correct survey responses in control vs video watchers.

Experiment 2: COVID vaccine safety campaign.

Our first COVID vaccine safety message drew on research by the de Beaumont Foundation showing that people concerned about vaccine safety were most open to messages from certain trusted sources, including doctors.

We found that the American Medical Association had surveyed its members on their vaccination status, and we hoped this information would be effective for similar reasons.

Our prior work showed that many people have real concerns about vaccine safety; it’s not merely political posturing. Vaccine fears are very real, and the disinformation ecosystem exploits them with completely false or wildly exaggerated risks. Message 2 was designed to give people a sense of how big the real risks are. Fortunately, the CDC had good data on serious side effects, and the National Weather Service had stats on the likelihood that an individual will be struck by lightning in their lifetime.

Figure 5: IVP Experiment 2: Message 1
Figure 6: IVP Experiment 2: Message 2

COVID Vaccine Safety Campaign Results.

Results were again striking. As with the climate messages, the vaccine messages were very successful with this audience, both in teaching them a fact and in shifting their stated desire to get vaccinated. Note, of course, that we don’t know which of our respondents were already vaccinated, or which may have sought out a vaccine after this campaign.

Figure 7: Data from Experiment 2, showing strong increases in knowledge as well as vaccine intent.

Data limitations: Opt-In Bias and Black Box Optimization

It is important to consider the issues that could be skewing our results. We admit that we were surprised by the size of the impact we measured after a single exposure to a 10-second video. There are two potential sources of skew.

Differential Opt-In Bias.

Every experiment that depends on voluntary human participation has some kind of opt-in bias. People who choose to participate are different from those who do not. Some differences may be obvious, and others are not. In our case, we depend on two separate opt-ins that may be different from one another.

The first opt-in is watching the treatment video; the second is taking the impact poll(s). People who opt in to one may differ from those who opt in to the other. When we controlled for age and gender, however, the results held. There may be other differences between the opt-in populations that we don’t yet understand.
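As a minimal sketch of what “controlling for age and gender” can look like on per-respondent data, assuming a hypothetical flat export with made-up column names; this is our illustration, not Reality Team’s actual analysis code:

```python
# Logistic regression sketch: does the treatment effect survive
# demographic controls? File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ivp_responses.csv")  # one row per poll respondent (hypothetical export)

# correct:  1 if the respondent answered the poll correctly, else 0
# treated:  1 if the respondent was in the video arm, else 0
# age_band, gender: demographic buckets from the ad platform
model = smf.logit("correct ~ treated + C(age_band) + C(gender)", data=df).fit()
print(model.summary())

# A large, significant coefficient on `treated` after adding the
# demographic terms means the gain is not explained by age or gender
# imbalances between the two opt-in populations.
```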

Meta Ad Server Algorithm

The Facebook/Instagram ad manager is a black box. We know Meta optimizes whom it shows ads to in order to maximize the desired campaign outcome for the client (us) and revenue for itself.

Hence, we know the distribution is not truly random, but we have no insight into how that optimization works. We don’t know when or whether the algorithm changes.

That said, the experiments are distorted by the same algorithms as the campaigns. So while this limits generalizability to other platforms or contexts, we believe the impact we measure is a legitimate indicator of how effective these campaigns are for this audience on Instagram.

Going forward

  1. We continue to test and examine this method for measuring campaign impact. We should soon have results from a parallel lab panel that may give us some insight into the differential opt-in problem.
  2. We plan to ramp up activity in 2022, especially in states expected to have competitive elections. In 2020, disinformation exploded in those states, and it is likely to do so again in the 2022 midterms. Topic areas will include Election Integrity, Climate Change, Pandemic/COVID Vaccines, Dirty Disinfo Tricks, and Fear-mongering: the issues that some disinfluencers promote to frighten people and distract them from other issues.
  3. Thanks to the Knight Foundation’s 100 Million Project, we know that Passive Information Consumers are often non-voters. We have begun experiments running campaigns that target PICs and tell them how and when to register to vote. Early data suggests that they are very willing to be led to their state’s registration site. We look forward to keeping you updated on this effort.

We seek:

  1. Nonprofits or academics interested in replicating these results or applying our methodology.
  2. Funding for 2022 counter-disinformation campaigns on Climate, Elections, Covid-19 and other topics, depending on the situation and context.
  3. Organizations that represent and serve communities targeted by disinformation who would like our assistance in fighting it. In particular, we are looking to become a resource for organizations representing Spanish-speaking Americans, People of Color, and rural and other under-represented groups.

For more information about Reality Team, the IVP method, or our data, please email info@realityteam.org or visit our website.

References

MIT Technology Review. (2021). Guess which states saw the most election disinformation in 2020.

Pew Research Center. (2020). Americans Who Mainly Get Their News on Social Media Are Less Engaged, Less Knowledgeable. 18% of adults use social media as their primary news source (the same percentage as local and cable news); they are less knowledgeable and more likely to see false information.

Knight Foundation. (2020). The 100 Million Project: The Untold Story of American Non-Voters.

de Beaumont Foundation. (2021). Americans Who Get COVID-19 Information from Social Media More Likely to Believe Misinformation, Less Likely to Be Vaccinated.

Harvard Kennedy School Misinformation Review. (2020). The causes and consequences of COVID-19 misperceptions: Understanding the role of news and social media.

American Psychological Association. (2019). The gateway belief model: A large-scale replication.

Advanced Science News. (2017). Inoculating the Public against Misinformation about Climate Change. “Misinformation on climate change has a significant impact on public perception.”

Knight Foundation. (2020). How Media Habits Relate to Voter Participation. For younger adults (25–29), social media plays a lead role in shaping political knowledge; of those who do not vote, 54% don’t seek out news.

PLOS ONE. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. J. Cook et al.

The International Journal of Press/Politics. (2020). Testing the effectiveness of correction placement and type on Instagram. E. K. Vraga et al. To best correct misinformation on Instagram or other visual social media platforms, use logic-focused humor corrections.

Global Challenges. (2017). Inoculating the Public against Misinformation about Climate Change. Van der Linden et al. Prebunking, or “inoculating” by exposing people to a small dose of misinformation and explaining the lie or fallacy, can neutralize misinformation on climate change, vaccines, and other issues.

Kaiser Family Foundation. (2021). COVID-19 Misinformation is Ubiquitous: 78% of the Public Believes or is Unsure About At Least One False Statement, and Nearly a Third Believe At Least Four of Eight False Statements Tested.
