Op-Ed: Social media algorithms & their effects on American politics

By Angela Colabella, MEng ’22 (IEOR)

--

This op-ed is part of a series from E295: Communications for Engineering Leaders. In this course, Master of Engineering students were challenged to communicate a topic they found interesting to a broad audience of technical and non-technical readers. As an opinion piece, the views shared here are neither an expression of nor endorsed by UC Berkeley or the Fung Institute.


When was the last time you had a conversation with someone who held opposing political views? In those conversations, did either party bring up a disputed “fact”? Have you noticed anyone you know becoming increasingly entrenched in their political views?

According to a 2020 study conducted by economists at Brown University, the United States is becoming politically polarized at a faster rate than most other major democracies. This can be attributed, in part, to the rise of the internet and social media over the last decade. In recent years, social media sites have come to rely on recommendation algorithms designed to increase user participation and time spent on their apps. Unfortunately, the side effects of these algorithms include informational echo chambers, the spread of false information, and even the introduction of users to communities with extreme, potentially harmful ideologies. These factors have contributed to growing nationwide distrust of media and government, a rise in domestic terrorism, and deepening political polarization within the US over the past decade.

Most social media sites gather data from your browsing history to personalize your experience on their platforms. They use recommendation and prediction algorithms to suggest pages you may like based on which Netflix shows you watch, who you follow on Instagram, or what you have been searching on Google. While these algorithms help customize the user experience, there has been considerable ethical debate over how they operate and how they shape modern information consumption.
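To make the mechanism concrete, here is a minimal sketch of the content-based idea behind such recommenders: average the characteristics of what a user has engaged with, then rank new items by similarity. The Python below is illustrative only; the item titles and topic tags are invented, and real platform systems layer engagement prediction, collaborative signals, and deep models on top of this basic loop.

```python
# A minimal, illustrative content-based recommender. Real platform
# systems are far more complex; this only shows the basic feedback loop.
from collections import Counter

def build_user_profile(history):
    """Average the topic tags of everything the user engaged with."""
    profile = Counter()
    for item in history:
        for topic, weight in item["topics"].items():
            profile[topic] += weight / len(history)
    return profile

def score(profile, item):
    """Dot product between the user's profile and an item's topic tags."""
    return sum(profile[topic] * w for topic, w in item["topics"].items())

def recommend(profile, candidates, k=2):
    """Rank candidate items by similarity to the user's history."""
    return sorted(candidates, key=lambda it: score(profile, it), reverse=True)[:k]

# Hypothetical items; titles and tags are invented for illustration.
history = [
    {"title": "weeknight pasta", "topics": {"cooking": 1.0}},
    {"title": "sourdough basics", "topics": {"cooking": 0.8, "baking": 0.6}},
]
candidates = [
    {"title": "knife skills", "topics": {"cooking": 0.9}},
    {"title": "campaign rally clip", "topics": {"politics": 1.0}},
    {"title": "croissant lamination", "topics": {"baking": 0.9}},
]
profile = build_user_profile(history)
for item in recommend(profile, candidates):
    print(item["title"])  # the politics clip never surfaces
```

The important property is the feedback loop: everything you engage with reshapes the profile, the profile reshapes what you see next, and what you see next constrains what you can engage with.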

In a Pew Research Center article, authors Rainie and Anderson delve into how algorithms create echo chambers that deepen political divides as people lose the ability to converse with others who hold different political opinions. The authors attribute this to the fact that algorithms “create filter bubbles and silos shaped by corporate data collectors; they limit people’s exposure to a wider range of ideas and reliable information and eliminate serendipity.” Not only are we becoming more politically divided as a nation, but some people are becoming so radicalized that participation in hate groups and domestic terrorism has risen, to the point where domestic terrorism researchers and veteran security officials have become extremely concerned.

Algorithms create filter bubbles and silos shaped by corporate data collectors; they limit people’s exposure to a wider range of ideas and reliable information and eliminate serendipity.


One recent study conducted on the app TikTok demonstrated how quickly a user’s feed can be steered toward extremist content on the basis of a limited amount of engagement. The most striking outcome concerned transphobic content: when a test account interacted only with transphobic videos, the TikTok algorithm rapidly increased the volume and variety of far-right video recommendations. The research team coded about 450 videos fed into the “for-you page,” the app’s algorithmically generated home feed, and found it rapidly populated by videos promoting misogyny, racism, white supremacist beliefs, anti-Semitism, conspiracies, hate symbols, and other generally hateful or violent content. At an average length of 20 seconds per video, those 450 videos amount to roughly two and a half hours on the app; hypothetically, a user’s feed can be saturated with radicalizing content in under three hours. Ironically, transphobic content is supposed to be banned from the site, as it violates TikTok’s community guidelines.
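The dynamic the researchers observed, in which engagement with one category snowballs until it dominates the feed, can be reproduced with a toy simulation. The sketch below is not TikTok’s actual system; the topic names, the 5% reinforcement rule, and the sampling scheme are all invented to illustrate the general mechanism.

```python
# Toy simulation of an engagement feedback loop. Hypothetical topics and
# update rule; not a model of any real platform's algorithm.
import random

random.seed(0)
TOPICS = ["sports", "cooking", "music", "fringe"]

weights = {t: 1.0 for t in TOPICS}  # the feed starts out balanced
served = []

for _ in range(450):  # roughly the number of videos coded in the study
    # The feed samples the next video in proportion to learned weights.
    video = random.choices(TOPICS, weights=[weights[t] for t in TOPICS])[0]
    served.append(video)
    # The simulated user engages only with "fringe" content, and every
    # engagement nudges the feed toward serving that category again.
    if video == "fringe":
        weights["fringe"] *= 1.05

first = served[:50].count("fringe") / 50
last = served[-50:].count("fringe") / 50
print(f"'fringe' share: first 50 videos {first:.0%}, last 50 videos {last:.0%}")
```

Even a small per-engagement boost compounds: a category that starts as a quarter of the feed crowds out everything else within a few hundred videos, the same shape of drift the study documented.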

Meta, the company that owns not only its original namesake app, Facebook, but also Instagram, WhatsApp, and several other apps, has enormous reach in the realm of social media. In the past five years, Meta has faced many allegations regarding its algorithms, its privacy practices, and its usage of consumer data. Around the time of the 2016 presidential election, Meta (then Facebook) conducted an internal study that found its algorithms were responsible for much of the growth in extremist content engagement: its recommendation tools accounted for 64% of all extremist group joins, most of them driven by the “Groups You Should Join” feature and “Discover” page algorithms. A direct link between Facebook’s recommendation systems and the growth of extremist groups is a serious finding in light of our current political divide.

In October 2021, several former Meta employees turned whistleblowers released internal documents detailing major ethical problems within the company. These documents, known as “The Facebook Papers,” reveal how the social media giant has been meticulously tracking global harms stemming from its platforms and algorithms. The papers also show that the company lifted measures implemented in 2020 to limit the spread of misinformation as soon as the election ended. The company is well aware of how its algorithms work, of the reach of its data mining, and of the impact of its neglect, whether intentional or not, yet it rarely acknowledges that its platforms enable misinformation to spread on a large scale. Instead, it has weighed benefiting society against advancing its own goals, and it often sides with the latter.

I have discussed the ways extremist political ideas and groups have been allowed to grow and spread on social media platforms, due in part to the design of their recommendation and prediction algorithms. The companies behind these platforms are fully aware of the effects of their willful negligence, yet they have done very little to improve the way these apps and algorithms are designed. Since social media is now intricately tied into so many facets of everyday life, it would be nearly impossible to abandon these platforms entirely as sources of information. Instead, there should be greater public awareness of and transparency about how these companies handle our data and the information spread on their sites, along with stronger legislation to protect the public from harm. In the meantime, users must stay aware of the many potentially hazardous effects of social media on their personal lives and on society.

There should be greater public awareness of and transparency about how these companies handle our data and the information spread on their sites, along with stronger legislation to protect the public from harm.

References

Allam, Hannah. “Right-Wing Embrace Of Conspiracy Is ‘Mass Radicalization,’ Experts Warn.” NPR, NPR, 15 Dec. 2020, https://www.npr.org/2020/12/15/946381523/right-wing-embrace-of-conspiracy-is-mass-radicalization-experts-warn

Auxier, Brooke, and Monica Anderson. “Social Media Use in 2021.” Pew Research Center: Internet, Science & Tech, Pew Research Center, 7 Apr. 2021, https://www.pewresearch.org/internet/2021/04/07/social-media-use-in-2021/

Boxell, Levi, et al. “Cross-Country Trends in Affective Polarization.” National Bureau of Economic Research, Jan. 2020, https://doi.org/10.3386/w26669

Cosentino, Gabriele. “From Pizzagate to the Great Replacement: The Globalization of Conspiracy Theories.” Social Media and the Post-Truth World Order: The Global Dynamics of Disinformation, Palgrave Pivot, Basingstoke, 2020, https://doi.org/10.1007/978-3-030-43005-4_3

Culliford, Elizabeth, and Diane Bartz. “U.S. State Attorneys General Probe Instagram’s Effect on Kids.” Reuters, Thomson Reuters, 18 Nov. 2021, https://www.reuters.com/technology/new-york-attorney-general-others-opens-probe-into-facebook-promoting-instagram-2021-11-18/

Feezell, Jessica T., et al. “Exploring the Effects of Algorithm-Driven News Sources on Political Behavior and Polarization.” Computers in Human Behavior, vol. 116, Mar. 2021, p. 106626., https://doi.org/10.1016/j.chb.2020.106626

Horwitz, Jeff, and Deepa Seetharaman. “Facebook Executives Shut Down Efforts to Make the Site Less Divisive.” The Wall Street Journal, Dow Jones & Company, 26 May 2020, https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499

Jones, Seth G., and Catrina Doxsee. “The War Comes Home: The Evolution of Domestic Terrorism in the United States.” Center for Strategic and International Studies, 14 Dec. 2021, https://www.csis.org/analysis/war-comes-home-evolution-domestic-terrorism-united-states

Lima, Cristiano. “A Whistleblower’s Power: Key Takeaways from the Facebook Papers.” The Washington Post, WP Company, 26 Oct. 2021, https://www.washingtonpost.com/technology/2021/10/25/what-are-the-facebook-papers/

Little, Olivia, and Abbie Richards. “TikTok’s Algorithm Leads Users from Transphobic Videos to Far-Right Rabbit Holes.” Media Matters for America, 5 Oct. 2021, https://www.mediamatters.org/tiktok/tiktoks-algorithm-leads-users-transphobic-videos-far-right-rabbit-holes

Rainie, Lee, and Janna Anderson. “Algorithmic Categorizations Deepen Divides.” Pew Research Center: Internet, Science & Tech, Pew Research Center, 8 Feb. 2017, https://www.pewresearch.org/internet/2017/02/08/theme-5-algorithmic-categorizations-deepen-divides/

Connect with Angela.

Edited by Alison Huh.
