Psych! Why We Fall For Disinformation

Hannah Kruglikov
Foundation for a Human Internet
5 min read · Jul 15, 2020

While there is a lot of effort and coordination that goes into a successful disinformation campaign, the final–and most crucial–player in a disinformation scheme is us. After all, the goal of a disinformation campaign is to influence the conversations we are having, to manipulate our thinking, and to blur the line between fact and fiction.

Why can’t we just...not fall for disinformation, then? Unfortunately, this is not as simple as you might think.

While many of us may believe that we are above trusting disinformation, the reality is that our likelihood of believing false information is not a matter of intelligence or education, but of human psychology.

Here are some of the ways in which our brains can be tricked into believing disinformation:

Cognitive Laziness

To begin, studies have shown that we generally do not critically consider new statements or information we hear, but rather make a quick, intuitive judgment about their validity, particularly if they sound familiar or “fluent” to us. The easier it is for us to process a piece of information, the less time we spend thinking about and evaluating it, which makes it less likely that we will notice a small (but mighty) inaccuracy in it.

Features that can make a piece of information more cognitively fluent to us–and thus increase our likelihood of engaging in lazy thinking–include visuals and similarity to common (familiar) statements. For example, a study from Colorado State University found that people are more likely to believe findings from cognitive research when the report includes an image of a brain, while another study from The Australian National University found that this effect holds for images across a wide variety of topics.

Convinced? Source: Sydney Neuroimaging Analysis Centre (SNAC)

Similarly, the mention of a reputable source in a fake news story or headline makes it sound more like a real news story, which increases its fluency and makes us more likely to deem it true at a cursory glance. By exploiting these features, those spreading disinformation can ensure that many of us will register it as true without ever giving it a closer look.

Confirmation Bias

Another reason why people are more susceptible to disinformation is confirmation bias, a phenomenon wherein people are more likely to believe a piece of information if it aligns with or supports their preexisting beliefs.

For example, a 1979 study conducted at Stanford University found that students holding a particular belief, when presented with two equally compelling fabricated studies–one supporting their belief and one opposing it–were more likely to accept the data that supported their belief and to deem it credible, while being far more critical of the data that opposed it.

It is easy to see how this carries over into disinformation: people are inclined to believe a piece of information that supports their preexisting beliefs, and will subject it to far less scrutiny than one that does not. Thus, a piece of disinformation has a better chance of catching on with a group of people whose beliefs it supports–better yet if its creators can target those people in particular.

The Illusory Truth Effect

The Illusory Truth Effect simply refers to the phenomenon in which we hear a piece of information so many times that eventually, we come to accept it as a fact.

This effect is present all over the media–perhaps most notably among extremely vocal communities such as climate deniers and anti-vaxxers, which are often given platforms on which to present their views. While those who staunchly believe these views to be false might never be swayed no matter how many times they are exposed to them, those who are less certain may find themselves nudged towards these alternative “facts” with each exposure until eventually, they come to view them simply as facts, sans conditions (and sans quotations).

This effect is closely tied to cognitive fluency, because fluency comes from familiarity: having heard a piece of information before makes it feel more fluent to us, allowing us to process it more quickly and easily and reducing the likelihood that we will approach it with any sort of critical thinking. Disinformation campaigns (political or otherwise) exploit this by being orchestrated for maximum possible reach–and with the same information popping up on your feed over and over, hey, it must be true.

What can we do about it?

Some of you might be thinking that another name for this article could well have been “Why We Don’t Look Closely at New Information”–and you’d be right.

The answer, then? Learn to look closely.

Some people are naturally and constantly analytical, suspicious, and careful–but many of us aren’t (yes, even smart people). We take in a huge amount of information each day, and we pick and choose when to apply our full brain power to thoughtfully and carefully evaluating its validity. To combat this, we have to make a conscious effort to think critically when looking at a new piece of information, rather than falling back comfortably on our intuition. It may take some effort, but we hope you’ll agree that it’s worth it to combat the spread of disinformation.

If you’re wondering, “What even is disinformation?”, then check out our disinformation series, where we cover everything from the problem, to the anatomy, to the solution.

If you like what you’re reading, be sure to applaud this story (did you know that you can hold down the applaud button and it’ll keep adding claps–it’s addictive!) and follow our channel!

What’s humanID?

humanID is a new anonymous online identity that blocks bots and social media manipulation. If you care about privacy and protecting free speech, consider supporting humanID at www.human-id.org, and follow us on Twitter & LinkedIn.
