Prebunking to Build Defenses Against Online Manipulation Tactics in Germany

Oct 25, 2023 · 5 min read

As false and manipulative information continues to spread online, our team at Jigsaw has been working to scale the impact of the work we’ve designed to build resilience to online harms. While many misleading claims and narratives proliferate online, a small number of manipulation techniques are used repeatedly to spread them. Building on our previous work and this insight, we launched a video campaign in Germany this summer on Facebook, Instagram, and YouTube to help educate German audiences about three of these manipulation techniques commonly used to spread disinformation in their country. This “prebunking” approach has been found to build resilience to future manipulation. We’re excited to share that over 50% of the online audience on each platform saw our videos, and a higher proportion of viewers learned to spot manipulation than in any prebunking campaign to date.

Prebunking, an evolution of work pioneered in the 1960s by social scientist William McGuire, seeks to reduce the effectiveness of misinformation by preemptively warning individuals of attempts to manipulate them and providing a microdose of the false claims or tactics likely to be used to influence them. Together with Google Germany, our implementation partner Moonshot, and local organizations Correctiv, Alfred Landecker Foundation, Amadeu Antonio Foundation, Das NETTZ, klicksafe, and Neue Deutsche Medienmacher, we built on promising findings from our previous prebunking projects to address growing manipulation concerns in Germany. We achieved over 42 million views across all platforms, including YouTube, Facebook, and Instagram, reaching 58% of YouTube users in Germany aged 18–54 and 54% of users on both Facebook and Instagram.

To identify current and emerging misinformation narratives in the country, we took a mixed-methods approach, interviewing a series of local experts, with further contextual validation provided through cross-platform social media analysis. Our research found many topics that are frequent targets of misinformation — among them the energy crisis, climate change, migration issues, and the war in Ukraine. These topics are not always directly related, but they often feed into a meta-narrative of a ‘poly-crisis,’ or the idea of simultaneous catastrophic events. Disinformants often reference a poly-crisis to evoke fear and manipulate audiences when they’re feeling overwhelmed. Interestingly, there are a number of manipulation techniques that are used repeatedly across these misleading narratives. From our expert interviews and social media analysis, we found that three manipulative techniques — decontextualization, fearmongering, and whataboutism — were some of the most prevalent across topics.

We chose to take a technique-based approach to prebunking in Germany, focusing on one technique per video. By not tackling the swirling misinformation narratives in our videos, we were able to help viewers understand and identify the techniques that are used across narratives without politicizing our content.

We raised awareness of manipulation techniques across Germany

The primary goal of the campaign was to enable more Germans to correctly identify manipulation techniques online. We measured this using an innovative approach: within 48 hours of watching a prebunking video, social media users saw a custom Brand Lift survey on YouTube. This survey asked viewers, as well as a control group who had not seen the prebunking video, to identify a manipulation technique being used in a sample social media post. We found that, on average, viewers were 5.4% better at identifying any of the three manipulation techniques than Germans who had not seen our videos. With a reach of 21M unique viewers on YouTube, that means over 1.1 million people in Germany demonstrated an improved ability to spot manipulation. The fearmongering video consistently improved viewers’ ability to detect examples of fearmongering, with viewers scoring 4.5–8% better than those who hadn’t seen the video. Similarly, viewers got 3.8–5.8% better at identifying whataboutism and 4.2–6.2% better at spotting decontextualization.
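To illustrate how a lift figure like this is derived, here is a minimal sketch of comparing correct-identification rates between viewers and a control group. All function names and numbers are illustrative assumptions for explanation, not Jigsaw’s actual survey data or methodology.

```python
def identification_lift(treated_correct, treated_total,
                        control_correct, control_total):
    """Percentage-point difference in the rate of correctly identifying
    a manipulation technique between viewers and a control group."""
    treated_rate = treated_correct / treated_total
    control_rate = control_correct / control_total
    return (treated_rate - control_rate) * 100

# Hypothetical example: 54.0% of viewers answer correctly vs. 48.6%
# of the control group, yielding a 5.4 percentage-point lift.
lift = identification_lift(540, 1000, 486, 1000)
print(round(lift, 1))  # 5.4
```

Expressing the result as a difference against a randomized control group, rather than a raw score, is what lets a campaign attribute the improvement to the videos themselves.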

The results were even greater among younger audiences. Viewers under 34 consistently demonstrated higher gains in their ability to identify techniques, relative to those who had not seen the video, peaking at a 10.8% increase in 25–34 year-old viewers’ ability to identify misinformation techniques after seeing our “whataboutism” video. The average increase for whataboutism among 18–34 year-olds was 7.1%, compared to the overall average lift of 5%, suggesting that this type of education about online manipulation is most needed for younger audiences.

“This was our most impactful prebunking campaign so far, not only in absolute reach but in building resilience to manipulation,” said Beth Goldberg, Jigsaw’s Head of Research and Development. “Across all three techniques — fearmongering, decontextualization, and whataboutism — viewers of our videos were consistently better at spotting these common types of manipulation. As a researcher, this type of consistency and large effect size at scale is really rare to see, so this suggests that prebunking is incredibly promising as a scalable approach to counter disinformation.”

Next steps: Iterating to build scale and impact

This is the latest milestone in our ongoing prebunking efforts, which Jigsaw will continue to deploy and scale in additional markets in the coming months. We have found local partnerships to be a key component of our efforts. In addition to helping ensure our campaign creative was locally tailored and robust, our local partners played a crucial role in reviewing and providing feedback on campaign content at key stages, including production development, storyboarding, and content finalization. These contributions improved our campaign’s relevance and effectiveness, and we will continue to work with local partners on future campaigns to ensure our content remains tailored, sensitive, and responsive to the concerns of local audiences.

As we continue to iterate on our approach, there are a few new areas of exploration we are pursuing in the coming months, including identifying terminology that translates well into our audiences’ native language. In Germany, we found that easily translated concepts performed better. For example, our video for “fearmongering,” a concept that translates directly to “Panikmache,” saw stronger results, whereas “decontextualization” and “whataboutism” are more technical terms and loanwords that may not have been as easy for the general population to understand. When concepts do not directly translate, we have to work to educate audiences on the definition of the concept as well as how to identify it.

We are eager to learn how local influencers can share prebunking messages effectively with their communities to help scale these messages with trustworthy messengers. We are also partnering with academics to develop a set of metrics that will help others measure resilience to manipulation more holistically.

The results of this campaign will now be shared with partners and other institutions to encourage similar approaches and help internet users in Germany and beyond proactively build resilience against manipulation.



Jigsaw is a unit within Google that explores threats to open societies, and builds technology that inspires scalable solutions.