Exploiting Our Minds and Feeds: Why Disinformation Is Empowered by Social Media

Matthew Castle
Published in SISDRT · Jul 12, 2020

Over the last two decades, social media platforms have revolutionized the way the public consumes and shares information. With missions promising to “bring the world closer together” and to give everyone the “power to create and share ideas and information”, these platforms have democratized the dissemination of information, fundamentally disrupting the role of the traditional media establishment.

Unfortunately, bad actors working to spread disinformation have also taken notice of this changing information ecosystem. Disinformation campaigns are designed to spread false or misleading information to deliberately deceive and influence public opinion. These campaigns, which once took months or years to produce a discernible effect, can now be spread digitally with astounding speed and reach. But beyond this heightened speed and reach, what makes modern disinformation campaigns particularly successful in influencing the public? Simply put, these campaigns are empowered by the core structures and features of social media platforms, and they exploit certain cognitive limitations of the human mind.

Lofty mission statements aside, social media platforms are of course competitive firms with a profit incentive. Attention is the commodity of this digital age, and engagement its currency.[i] Because attention is the principal commodity, social media platforms are designed to capture, manipulate, and measure it.[ii] To that end, these platforms endeavor to be one-stop shops for information, offering content ranging from updates by family and friends, to advertisements for the latest gadget, to the most important news stories of the day. Therein lies a core problem: these efforts to capture attention across so many disparate areas also divide that attention, pushing users into a state of what writer and former tech executive Linda Stone calls “continuous partial attention”.[iii] This cognitive state can significantly diminish the ability to think slowly and deeply about any one item, such as a provocative headline from a suspect source. As the brain bounces from topic to topic, from a funny cat video to an election article shared by a friend, users are primed for manipulation.

The market for information has also been transformed in the digital era. As the availability of information has exploded with the rise of the internet, the costs of producing, storing, and disseminating it have fallen. For the consumer, high-quality information can be found with minimal effort and cost.[iv] As a result, how arresting a piece of information is has become one of the most important differentiators in the fierce competition for attention. Some researchers even contend that high-quality information simply has no competitive advantage.[v] Media websites of varying credibility now emphasize the production and spread of attention-grabbing content to compete for clicks and the valuable advertising dollars they represent. A website and corresponding social media accounts can be set up in a matter of hours with low-cost, professional web hosting tools, giving rise to a vast number of content producers who can replicate the look and feel of trusted media sources. For example, the website Nile Net Online, which amassed over one hundred thousand followers across its social media accounts, looked the part of a normal outlet and promised Egyptians “true news”. In reality, it was part of an influence operation based in Iran.[vi]

Beyond these issues of quality, the sheer volume of content encountered on social media is proving to be a cognitive challenge for humans. Research shows that individuals are poorly equipped to assess and cope with so much information, and will quickly discard whatever they feel is irrelevant or unwanted.[vii] To complicate matters further, “selective exposure” leads individuals to seek out information that confirms their existing beliefs, while “confirmation bias” makes such belief-consistent information more persuasive.[viii] The very design of social media sites can amplify these biases and encourage users to accept information at face value. Content recommendation algorithms are at the heart of these platforms, and generally draw on a user’s previous engagement data, coupled with the data of millions of other users, to curate personalized content. The content is refined further based on a user’s friends and family, as well as the pages they “like” and “follow”. The result can be an echo chamber in which users’ beliefs are continually reinforced and the perceived believability of content is high. When it comes to critical evaluation of important news and information, this is a major problem.
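To make that mechanism concrete, here is a minimal sketch of the kind of engagement-driven, socially boosted feed ranking described above. It is purely illustrative: the toy data, the similarity-weighted scoring, and the FRIEND_BOOST weight are assumptions for demonstration, not any platform’s actual algorithm.

```python
# Illustrative sketch of engagement-driven feed ranking (NOT any platform's
# real algorithm). It scores unseen posts for a user by combining the
# behavior of similar users with a boost for posts their friends engaged with.
import numpy as np

# Rows = users, columns = posts; values = past engagement (clicks, likes, etc.).
engagement = np.array([
    [5, 3, 0, 0, 1],
    [4, 0, 0, 1, 0],
    [0, 1, 4, 5, 0],
    [0, 0, 5, 4, 2],
], dtype=float)

friends = {0: [1], 1: [0], 2: [3], 3: [2]}  # hypothetical social graph
FRIEND_BOOST = 1.5  # assumed weight; real systems tune factors like this

def rank_feed(user: int) -> list:
    # Cosine similarity between this user and every other user.
    norms = np.linalg.norm(engagement, axis=1)
    sims = engagement @ engagement[user] / (norms * norms[user] + 1e-9)
    sims[user] = 0.0  # ignore self-similarity

    # Predicted interest = similarity-weighted average of others' engagement.
    scores = sims @ engagement / (sims.sum() + 1e-9)

    # Posts a friend engaged with get boosted, reinforcing the echo chamber.
    for f in friends.get(user, []):
        scores[engagement[f] > 0] *= FRIEND_BOOST

    # Recommend only posts this user has not yet engaged with, best first.
    unseen = np.flatnonzero(engagement[user] == 0)
    return [int(p) for p in sorted(unseen, key=lambda p: -scores[p])]

print(rank_feed(0))  # posts ranked for user 0, e.g. [3, 2]
```

Even this toy ranker exhibits the feedback loop at issue: the more a user and their friends engage with one kind of content, the more of it rises to the top of the feed, regardless of its accuracy.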

Sharing this curated information is also designed to be quick and easy, allowing users to propagate an appealing photo or article with a single click. Research shows that individuals depend on their social networks as trusted news sources and are more likely to share a post if it originates from a trusted friend.[ix] Additionally, platforms often spotlight the sharer of information more prominently than its source, further priming users to judge the credibility of the messenger rather than the message. In one study of story sharing on social media, researchers found that even after individuals were informed that a story had been misrepresented, more than a third still shared it.[x] This traces back to the reality that social media platforms are designed to optimize engagement; as Stanford Internet Observatory research manager Renee DiResta put it, “our political conversations are happening on an infrastructure built for viral advertising.”[xi]

With all of this in mind, it should not be surprising that social media platforms are ripe for exploitation by bad actors using disinformation to shape narratives, create confusion, and sow division. Some of the most powerful disinformation efforts have also preyed upon emotion, another dimension that profoundly affects how disinformation spreads. Examples include the Russian Facebook campaign in the United States designed to inflame racial tensions, Bahraini disinformation promoting conflict between Shi’a and Sunni Muslims, and Chinese disinformation intended to undermine political trust in Taiwan.[xii] In all of these instances, the issues were emotionally charged and the subsequent spread of disinformation was more pronounced. A large study of disinformation on Twitter between 2006 and 2017 found that false information “diffused significantly farther, faster, deeper and more broadly than accurate information — propelled by emotional reactions such as fear and disgust.”[xiii] Bot and troll accounts controlled by foreign actors amplify divisive narratives from fringe sites, while foreign advertisement purchases push desired narratives to target audiences. As social media platforms continue to attract users and serve as a primary news feed for many, these problems will only grow.

Disinformation is a global threat to discourse and democratic institutions. The public has finally begun to take notice, and widespread study of this phenomenon is in progress. Luckily, social media firms are working to address this issue by better detecting disinformation campaigns and adding new features to flag objectionable content. However, it should be evident that the problem runs deep. It is incumbent upon all social media users to better manage their biases and search for the truth, while social media firms must continue to explore how to balance their business interests with fostering genuine discourse.

Endnotes:

[i] Kris Shaffer. Data Versus Democracy: How Big Data Algorithms Shape Opinions and Alter the Course of History. Apress Publishing. 2019.

[ii] Kris Shaffer. Data Versus Democracy: How Big Data Algorithms Shape Opinions and Alter the Course of History. Apress Publishing. 2019.

[iii] Linda Stone. “Beyond Simple Multi-Tasking: Continuous Partial Attention.” LindaStone.net (blog). 2009. https://lindastone.net/2009/11/30/beyond-simple-multi-tasking-continuous-partial-attention/

[iv] Kris Shaffer. Data Versus Democracy: How Big Data Algorithms Shape Opinions and Alter the Course of History. Apress Publishing. 2019.

[v] Brandon Barnett. “Countering Disinformation Campaigns on Facebook and Twitter Against U.S. Citizens”. Utica College, ProQuest Dissertations Publishing. 2019.

[vi] Jack Stubbs & Christopher Bing. “Special Report: How Iran Spreads Disinformation Around the World”. Reuters. 2018. https://www.reuters.com/article/us-cyber-iran-specialreport/special-report-how-iran-spreads-disinformation-around-the-world-idUSKCN1NZ1FT

[vii] Christina Nemr & William Gangware. “Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age”. Park Advisors. 2019. https://www.state.gov/wp-content/uploads/2019/05/Weapons-of-Mass-Distraction-Foreign-State-Sponsored-Disinformation-in-the-Digital-Age.pdf

[viii] Christina Nemr & William Gangware. “Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age”. Park Advisors. 2019. https://www.state.gov/wp-content/uploads/2019/05/Weapons-of-Mass-Distraction-Foreign-State-Sponsored-Disinformation-in-the-Digital-Age.pdf

[ix] Christina Nemr & William Gangware. “Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age”. Park Advisors. 2019. https://www.state.gov/wp-content/uploads/2019/05/Weapons-of-Mass-Distraction-Foreign-State-Sponsored-Disinformation-in-the-Digital-Age.pdf

[x] Christina Nemr & William Gangware. “Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age”. Park Advisors. 2019. https://www.state.gov/wp-content/uploads/2019/05/Weapons-of-Mass-Distraction-Foreign-State-Sponsored-Disinformation-in-the-Digital-Age.pdf

[xi] Renee DiResta. “Free Speech in the Age of Algorithmic Megaphones”. Wired. 2018. https://www.wired.com/story/facebook-domestic-disinformation-algorithmic-megaphones/

[xii] Erik Nisbet & Olga Kamenchuk. “The Psychology of State-Sponsored Disinformation Campaigns and Implications for Public Diplomacy”. The Hague Journal of Diplomacy 14, 65–82. 2019.

[xiii] Erik Nisbet & Olga Kamenchuk. “The Psychology of State-Sponsored Disinformation Campaigns and Implications for Public Diplomacy”. The Hague Journal of Diplomacy 14, 65–82. 2019.
