Fake News & Cognitive Biases 3

Why Are People Susceptible To Misinformation?

YS Chng
9 min read · Oct 20, 2019
Photo by Julia Raasch on Unsplash

In the previous post on Fake News & Cognitive Biases, we discussed how people are most susceptible to misinformation when it rides on a trending topic or arrives during periods of chaos. But why are people susceptible to misinformation in the first place? Does it have anything to do with intelligence?

In the final part of this mini-series, I will talk about how even intelligent people can fall for misinformation, as long as the misinformation concerns something they care about. I will share some examples of when this can happen, and some of the cognitive biases that amplify this effect.

Better Be Safe Than Sorry

It is a common perception that only the less tech-savvy elderly or the not-so-bright are vulnerable to misinformation, and most people think that they themselves are smart enough to avoid getting tricked. This is known as illusory superiority, where individuals overestimate their own abilities relative to others. In one particular study, researchers found that 80% of participants rated themselves as above-average drivers (McCormick et al., 1986), a figure that is impossible in a normally distributed sample, where only half can sit above the mean, and implausible even in a sample heavily skewed towards good drivers.

Illusory Superiority. (Image by xkcd.com)
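
As a quick sanity check on that statistic, here is a minimal Python sketch. The distributions are illustrative assumptions, not the study's data; they just show why 80% above-average is so implausible:

```python
import numpy as np

rng = np.random.default_rng(0)

# In a symmetric (e.g. normal) distribution, the mean equals the median,
# so only about half of any sample can be above average.
normal_skill = rng.normal(loc=50, scale=10, size=100_000)
print((normal_skill > normal_skill.mean()).mean())   # ~0.50

# Even a heavily left-skewed sample (a long tail of very bad drivers
# dragging the mean down) falls well short of 80% above the mean.
skewed_skill = 100 - rng.exponential(scale=10, size=100_000)
print((skewed_skill > skewed_skill.mean()).mean())   # ~0.63
```

Even with a long tail of terrible drivers pulling the average down, only about 63% of this sample beats the mean, well short of the 80% who believed they did.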

The truth is, anyone can be susceptible to misinformation, as long as the misinformation concerns them. Take parents, for example: they are especially prone to misinformation about the health and safety of kids, because they often adopt the “better be safe than sorry” mentality. Believing that withholding an unverified piece of advice does more harm than sharing it, parents spread health and medical misinformation through their communities, convinced that they are protecting their own kids and the kids of other parents.

The Momo Challenge Hoax, which spread through 2018 and peaked in March 2019, perfectly illustrates the problem with this mentality. Momo is actually a sculpture called Mother Bird, created by Japanese artist Keisuke Aiso in 2016. Unfortunately, the image of the sculpture was used to portray a character called Momo, who was allegedly hijacking the videos that children and teens consume and encouraging them to engage in self-harm or suicide. As it turned out, no such challenge was actually spreading on social media. But because of the “better be safe than sorry” mentality of parents, the widespread sharing of the warning probably did more harm than good: it may have piqued the interest of curious children, or given ideas to malicious individuals, turning the hoax into a self-fulfilling prophecy.

Japanese artist Keisuke Aiso (not responsible for the Momo Challenge Hoax) and his sculpture Mother Bird. A picture of the sculpture’s face was used to represent the character of Momo.

Similarly, as Halloween approaches each year, the urban legend about tampered candy being handed out during trick-or-treating starts to spread as a warning to parents. However, no case of a stranger killing or permanently injuring a child this way has ever been proven. Joel Best, a sociologist from the University of Delaware, combed through newspaper reports from 1958 to 1983 in search of evidence of candy tampering, and found that very often these were actually cases of children fabricating stories to get their parents’ attention (Best & Horiuchi, 1985). These false warnings have become so commonplace that mischievous children or unsavoury characters may get ideas about committing copycat crimes. While some readers may remember reports of needles found in Halloween sweets in 2018, there have been no follow-ups on those incidents at all. In other words, the issue is probably not as serious as the warnings make it sound.

Watch this Adam Ruins Everything video where Adam Conover debunks the myth of the poisoned Halloween candy:

Some may wonder where the harm is in sharing this information, just in case it is true. The point is that the people who are susceptible to such misinformation are usually also the ones who have a stake in it, yet they seldom fact-check the information to make sure it is true. To find out more about the psychology of the Momo Challenge Hoax and other child-safety internet hoaxes, check out this other insightful article:

Fighting For A Cause

The “better be safe than sorry” mentality is not the only way one can have a stake in sharing information that is untrue. Very often, individuals who are passionate about fighting for a cause are susceptible to misinformation as well.

People who suffer from a terminal illness like cancer, or who have loved ones going through it, are most susceptible to health and medical misinformation. Because the effects of conventional treatments such as chemotherapy are not always apparent, and because these treatments often leave the patient with a poorer quality of life, patients and their loved ones can become desperate for alternatives. During this period, they are most receptive to information about alternative treatments being shared with them. The problem is that they often fail to consider that if a more effective treatment were available, the medical industry would have already adopted it. Sadly, patients who get distracted by the hope of alternative treatments may start to neglect the conventional treatments that have been tried and tested, putting their own lives in danger.

Watch this McGill Office for Science and Society video showing how videos of alternative treatments trick us into believing that they are credible:

The people who share health and medical misinformation often do it with good intentions, even though they may not realise that the consequences can be quite severe. Similarly, people who share information about environmental issues may be well-meaning in wanting to do something about climate change. Unfortunately, the information is not always correct, and the people sharing it may not have done their fact-checking. For example, the recent forest fires in the Amazon drew plenty of attention through dramatic photos shared on social media, and through the retweeting of sensational quotes like “the Amazon produces 20% of the planet’s oxygen”. As it turned out, the photos used to depict the fires were taken from past incidents, and the claim that the Amazon is the Earth’s lungs is also factually incorrect.

French President Emmanuel Macron tweeting an outdated photo of a burning forest from 1989, and quoting the incorrect statistic that the Amazon produces 20% of Earth’s oxygen.

The point of raising these examples is not to say that conventional medical treatments are always effective, or that all information about climate change is false. They illustrate how fallible we become as consumers of information when an issue concerns us. I would even dare to say that, as someone passionate about fighting fake news, I could very well be susceptible to misinformation about how to solve the problem of fake news.

Dunning-Kruger Effect & Apophenia

At the start of this post, we saw that illusory superiority gives people the impression that they are immune to misinformation. A related bias, the Dunning-Kruger effect, extends illusory superiority: an individual’s confidence in a topic grows disproportionately to the actual knowledge they have. People who are concerned about a topic usually possess a little knowledge about it, but do not realise that what they know is just the tip of the iceberg, and this complacency renders them vulnerable to misinformation.

Dunning-Kruger Effect. (Image by smbc-comics.com)

To quote Alexander Pope, “a little learning is a dangerous thing”. The gaps in people’s knowledge of a topic leave room for conspiracy theories to take hold. To fill those gaps, some people engage in speculation and guesswork instead of simply looking for the answers. They may even believe that they are intelligent enough to see patterns in the little information they have. This is known as apophenia: the tendency to mistakenly perceive connections and meaning between unrelated things. Fuelled by the Dunning-Kruger effect and a lack of actual knowledge, believers of conspiracy theories can become deeply entrenched in their beliefs.

Conspiracy Theories. (Image by TheAwkwardYeti.com)
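
Apophenia is easy to demonstrate with a toy experiment. The sketch below, purely illustrative and using randomly generated data, scans a batch of completely unrelated series and reliably turns up a pair that looks meaningfully correlated:

```python
import numpy as np

rng = np.random.default_rng(42)

# 40 completely unrelated random series, 30 observations each.
data = rng.normal(size=(40, 30))

# Correlate every pair of series and find the strongest coincidental match,
# ignoring each series' perfect correlation with itself.
corr = np.corrcoef(data)
np.fill_diagonal(corr, 0.0)
i, j = np.unravel_index(np.abs(corr).argmax(), corr.shape)
print(f"series {i} vs series {j}: r = {corr[i, j]:.2f}")
# With 780 pairs to pick from, an |r| above 0.5 typically turns up
# by chance alone.
```

Scan enough unrelated data and a “pattern” will always appear; conspiracy theorists do this kind of scanning intuitively.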

This dynamic describes the situation of anti-vaxxers exactly. As parents, anti-vaxxers are naturally concerned for their children, and they adopt a “better be safe than sorry” mentality. But because they are ill-informed about how vaccines work and what vaccines actually do to the human body, they end up attributing unrelated symptoms to vaccine side-effects. As the community of anti-vaxxers grows, its members feel empowered by their cause, which perpetuates the belief that vaccines are harmful, despite the fact that the 1998 study claiming that vaccines cause autism has long been exposed as fraudulent.

Watch this Vox video explaining how the anti-vaccine movement came about, and why it is really just a misconception:

Conclusion

In the three posts of this series on Fake News and Cognitive Biases, I have tried to answer three questions: what types of misinformation people are susceptible to, when people are most susceptible to misinformation, and why people are susceptible to it in the first place. In hindsight, it all seems quite obvious and unsurprising: people are susceptible to misinformation that contains some truth, especially when the topic concerns them and when the topic is trending.

Perhaps a computer scientist working on detecting misinformation with artificial intelligence will ridicule the general and unsophisticated nature of this analysis. But finding general features is exactly the objective here, in the hope of capturing the essence of all types of misinformation, whether political disinformation, a conspiracy theory, or just a scam. Because misinformation of all types takes advantage of our biases in a similar way, any solution that protects us against misinformation has to be equally generalizable.

Using such general features, a machine learning program would probably fail at detecting what is and is not misinformation, because there would be far too many false alarms (a back-of-the-envelope illustration follows below). But if we use these general features as a guide for people, we may be able to develop nudges that warn users whenever they are in a situation where they are most susceptible. For example, based on the interests of their users, social media platforms could gently prompt them to consume a variety of viewpoints on those topics. Or, when a controversial topic is trending, platforms could remind users that misinformation is often rampant during such periods.
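
On the false-alarm point: the calculation below shows why even a good classifier drowns in false positives when misinformation is rare. The numbers are assumptions for illustration, not measurements of any real system:

```python
# Hypothetical detector: right 95% of the time either way (an assumed
# figure for illustration), on a feed where 1% of posts are misinformation.
prevalence = 0.01    # fraction of posts that are actually misinformation
sensitivity = 0.95   # chance the detector flags a misinformation post
specificity = 0.95   # chance it stays silent on a legitimate post

true_alarms = prevalence * sensitivity               # 0.0095
false_alarms = (1 - prevalence) * (1 - specificity)  # 0.0495

# Of everything flagged, how much is genuinely misinformation?
precision = true_alarms / (true_alarms + false_alarms)
print(f"precision = {precision:.0%}")  # ~16%: most alarms are false
```

At these assumed rates, roughly five out of six flagged posts would be legitimate, the kind of false-alarm rate that erodes users’ trust in any warning system.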

The nudging suggestions above are still vague and unrefined, but it is the principle behind them that matters. Much effort has gone into detecting fake news algorithmically, but instead of focusing only on the information itself, perhaps more can be done to counter people’s biases and change the way they consume information online.

To find out what other nudging solutions have been suggested for solving fake news, check out this Guardian article:

If you would like to know more about the types of misinformation that people are susceptible to, check out the first part of this series:

If you would like to know more about when people are most susceptible to misinformation, check out the second part of this series:

References

  • Best, J., & Horiuchi, G. T. (1985). The razor blade in the apple: The social construction of urban legends. Social Problems, 32(5), 488–499.
  • McCormick, I. A., Walkey, F. H., & Green, D. E. (1986). Comparative perceptions of driver ability: A confirmation and expansion. Accident Analysis & Prevention, 18(3), 205–208.


YS Chng

A curious learner sharing knowledge on science, social science and data science. (learncuriously.wordpress.com)