Lurking in the Internet’s Shadows

Nick Agar
4 min read · Nov 18, 2021


Caroline Keller

The loser living in his mom’s basement is no longer the monolithic internet troll casually crusading from his couch. The socially awkward basement dweller archetype has been replaced by the bored housewife, the spunky social media influencer, the unhinged politician. Basically, anyone with opposable thumbs and access to the internet can ignite a conspiracy theory or propagate it. A mere dozen people were responsible for 65% of COVID disinformation, and none of them are picking Cheeto crumbs out of their beard hairs.

Scrapping this archetypal perception is important because it de-weaponizes the ‘other’ and positions it within arm’s reach — ‘it could be anyone’. It is much easier to point a finger and discredit a manufactured idea that comes from ‘them’ than one that comes from one of ‘us’. No one would suspect their neighbor Barbara, who crochets sweaters for her Shih Tzu, of being a misinformation super-spreader.

In light of the pandemic and the concurrent “infodemic”, naming these ‘foreign’ actors as our colleagues, chiropractors, trainers…and maybe even ourselves is more critical than ever. A malignant idea has no roots unless continually nurtured. Thus, the source of the problem is not the mere existence of disinformation but its cultivation by those who reproduce and spread it, most of whom, I like to believe, are not malevolent. A 2021 Pearson Institute/AP-NORC poll found that about half of Americans are concerned they have unintentionally shared misinformation.

If you spend some time on social media dizzily scrolling through pages of content, a pattern begins to emerge. Innocuous chat rooms on holistic medicine are breeding grounds for anti-vaxxers. Instagram acts as a playground for entertainment-hungry pandemic endurers rebroadcasting any dubious Wuhan lab meme that makes them LOL. Why are these social networks so susceptible to deceit? And why do they act as conduits for disseminating misinformation?

According to the work of Dr. Kathleen Carley at Carnegie Mellon’s Center for Computational Analysis of Social and Organizational Systems, there is a reproducible framework underlying the successful spread of disinformation and misinformation. It starts with a legitimate concern. Think: I don’t want my child to be a test subject for a novel vaccine. Next, images, tweets, and forums emerge, building and bridging disparate groups. The rightfully concerned mommy bloggers then become indistinguishably linked to anti-vaxxers. Finally, the message becomes distorted and shifts to the rhetorical “my rights” argument Americans will die on. The legitimate fear has blossomed into the defense that it is a right to choose whether to vaccinate. Game over. The inevitable ideological battle ensues on an infinite timeline.

This orchestrated homophily makes it easy for us to consume and perpetuate misinformation. When one friend buys into the propaganda with a justified concern like the one mentioned above, it becomes easier to rationalize their beliefs. Or it becomes easier to look the other way. A dismissive — oh, they’re just worried.

So how do we stop sowing seeds of misinformation and nip these growing weeds? In a polarized country like America, with its antipathy toward authority, we may not be able to change the way people behave online through legislative regulation or de facto law. Initiating a conversation with a spreader may meet the same doomed fate. Instead, we must focus on education and encourage introspection. Research shows the promise of media literacy in teaching people how to identify the signs of misinformation. Online games simulating fake social media feeds and the development of educational resources have demonstrated success, but whether those debunking skills transfer to the real world, when one is bombarded by the misinformation frenzy playing out on a personal screen, has yet to be tested.

Equally important as understanding how not to fall victim to misinformation (i.e., understanding the anatomy of misinformation and how it maneuvers to exploit our emotions and sense of identity) is learning how not to contribute to its amplification. An elucidating example is understanding how humor in memes is used to conceal disinformation. On the surface, memes are funny. However, a Dr. Anthony Fauci meme posted by your MAGA-loving aunt sits differently than the one posted by your scientifically grounded coworker. As soon as the context shifts, so does the implication. It is then easy to see how those we view as our tribe distort our perception. Pausing and refraining from the instantly gratifying dopamine release of reposting may slow the spread of misinformation to a trickle.

Now, it’s time to stop othering. We need to stop fooling ourselves that culpability resides with the basement gremlin or the nefarious foreign actor. It’s about time we shift the narrative and bring the propagators out from the shadows. Hopefully, the sun’s rays will act as a natural antiseptic instead of abetting a flourishing photosynthetic exchange of misinformation.
