Pseudoscience

Catherine Morris
Media Studies COM520
Oct 5, 2021


The EAVI chart describes pseudoscience as fake news that “misrepresents real scientific studies with exaggerated or false claims” and “often contradicts experts.” Pseudoscience has the potential for high impact, and it is usually motivated by money and power. It also has extensive relevance to current events: we have the chance to investigate pseudoscience not only during a global pandemic but immediately after a nationwide vaccination mandate was met with huge resistance. We can trace the long-lasting effects of pseudoscience through the example of vaccine hesitancy. In 1998, Andrew Wakefield published fraudulent research claiming that vaccines cause autism. Jenny McCarthy became a spokesperson for this pseudoscience, causing it to spread like wildfire. The claim has been debunked again and again. And yet, almost 25 years later, vaccine hesitancy is as widespread as ever, easily demonstrated by the vaccination rate in the US, which is still only 57% despite over 703,000 deaths from COVID-19 (“Coronavirus in the U.S.”).

One of the defining features of pseudoscience is that it presents itself as scientifically and academically valid while ignoring what makes scientific research credible: methodology, peer review, and expertise. This is what separates pseudoscience from a simple difference of opinion; pseudoscience intentionally disguises itself, claiming more credibility than it really has. Even though pseudoscientific theories lack the credibility of real science, “on user-generated content platforms like YouTube, these are often presented as facts, regardless of whether they are supported by facts and even though they have been widely debunked” (Papadamou et al. 2020). The accessibility of public-facing sources makes them more popular than challenging, jargon-laden scientific research. If there are easy-to-understand “facts” on YouTube and harder-to-understand facts that require interpretation coming from more credible sources, why wouldn’t the YouTube facts be more popular? The problem comes down to the fact that real science and pseudoscience present themselves as equally valid.

In the age of the internet, pseudoscience has more opportunity than ever to present itself as valid, factual information, and once it develops some traction, it spreads quickly. Since the beginning of the COVID-19 pandemic, the CDC has made every effort to guide people toward safety, but “along with these official recommendations, people are exposed to pseudoscientific information and unverified content pertaining to COVID-19, which have proliferated rapidly through social media” (Teovanovic et al. 2020). People have justified these false claims by attacking the CDC’s perceived credibility. For example, when the CDC changed its stance on mask-wearing in the early stages of the pandemic, people began propagating the idea that this reversal was a clear sign that even the experts don’t know what’s going on. This reasoning reflects an unwillingness to accept that valid scientific knowledge changes as research develops. Nonetheless, by disparaging the valid scientific information we all received, these pieces of pseudoscientific misinformation created a false appearance of credibility for themselves, and they spread rapidly online. Part of that online spread comes from algorithmic amplification: algorithms don’t judge whether information is true so much as they judge people’s interest in it. One study found “a non-negligible number of pseudoscientific videos on both the video recommendations section and the users’ homepage” (Papadamou et al. 2020). Once a piece of information, true or otherwise, becomes popular, it only becomes more likely to spread, because social media platforms assume that people want more of the same content.

The existence and spread of pseudoscience aren’t where its impact ends; when pseudoscience spreads, even if it is later debunked, it creates the illusion that there are two valid sides to a story. This breeds confusion and mistrust toward legitimate scientific sources because they don’t seem to tell the ‘whole story’. This is the philosophy behind Russia’s interference tactics, which date back to the spread of disinformation about HIV/AIDS in the 1980s (Strudwicke & Grant 2020). In an analysis of tweets from Russia’s Internet Research Agency posing as American users, researchers found that climate change was 8.6 times more likely to appear in the Russian tweets than in a control sample of ordinary Twitter users, and vaccination was 11.7 times more likely (Strudwicke & Grant 2020). This creates the appearance that there are more conflicting opinions on a controversial topic than there really are. Amid this manufactured disagreement and contradiction, valid sources have a harder time convincing the public of their credibility because dissent seems so widespread and popular.

“Coronavirus in the U.S.: Latest Map and Case Count.” The New York Times, The New York Times, 3 Mar. 2020, https://www.nytimes.com/interactive/2021/us/covid-cases.html.

Papadamou, K., Zannettou, S., Blackburn, J., De Cristofaro, E., Stringhini, G., & Sirivianos, M. (2020). “It is just a flu”: Assessing the effect of watch history on YouTube’s pseudoscientific video recommendations. arXiv preprint arXiv:2010.11638.

Strudwicke, I. J., & Grant, W. J. (2020). #JunkScience: Investigating pseudoscience disinformation in the Russian Internet Research Agency tweets. Public Understanding of Science, 29(5), 459–472. doi:10.1177/0963662520935071

Teovanovic, P., Lukic, P., Zupan, Z., Lazić, A., Ninković, M., & Zezelj, I. (2020). Irrational beliefs differentially predict adherence to guidelines and pseudoscientific practices during the COVID-19 pandemic. doi:10.31234/osf.io/gefhn
