Do algorithms create polarization?

Siddhesh Bangar
5 min read · Apr 3, 2022

This blog is based on a news story that captivated the internet for quite some time. The "Facebook Papers," leaked by a former Facebook employee, raise a slew of questions for people all around the world. What exactly are they? What motivated the leak, and why might the exposure of these papers make us reconsider how we use social media?

"If you are not paying for the product, then you're the product." — The Social Dilemma

Frances Haugen (37) is an engineer who has worked as a product manager for several large global organisations, including Google+, Pinterest, and Yelp, over an impressive IT career. She completed her management studies at Harvard and co-founded a dating app, according to her LinkedIn page. When Facebook approached her in 2018, Frances accepted, but only on the condition that the role be tied to democracy and deal with bogus propaganda and news. In 2019, Facebook agreed to her conditions and offered her a job on Civic Integrity, a team that dealt with election interference, which was shut down in 2020 after the US elections. Despite this, Frances stayed on at Facebook. Over time, she noticed that the company was taking no steps to clean up the misinformation that was spreading and polarising the public. Finally, in May 2021, enraged by Facebook's inaction, Frances resigned from her post. Before she left, she downloaded a trove of documents from Facebook's internal archives showing how the platform let misinformation spread. Frances Haugen became a whistleblower when she shared those documents with the Wall Street Journal; they were later shared with 17 US news organisations, and the company's biggest leak was exposed to the public.

This huge leak of the company's internal documents became known as the "Facebook Papers." The papers indicate that Facebook's algorithms have endangered democracy in many countries, that hatred and violence have spread on a large scale, and that children and young people are at risk.

As I discussed in one of my previous blogs, "internet bots," which are now deployed all over the internet, can sway public opinion on many different and crucial topics.

This leak can make all of us reconsider how we consume the internet. Is it possible that we have become entangled in these algorithms' loops? Or, in today's environment, have the tech behemoths begun to exert control over our thoughts?

Researchers at Facebook created three fake profiles in 2019 to examine the platform's News Feed recommendation system. The first was for a user in India, the company's most important market. The company then generated two more test accounts: one for a conservative American user and the other for a liberal one.

Within a few weeks, all three accounts were engaging only with content that Facebook's algorithms recommended. "I've seen more images of dead people in the past 3 weeks than I've seen in my entire life total," the Facebook researcher who was running the Indian test user's account wrote in a report that year, adding that "the graphic content was recommended by [Facebook] via recommended groups, pages, videos, and posts."

Several points emerged from the examination of the papers, including:

  • Facebook spends about 87% of its budget for fighting fake news and misinformation in the United States, while the rest of the world receives only the remaining 13% of the funding.
  • Even when harmful content was discovered, Facebook was often unable to fight it, simply because it lacked the technology to detect hate speech in many languages (so-called language classifiers; a sketch of what such a classifier looks like follows this list).
  • The company fails to remain neutral in the face of some critical hate content, causing the algorithms to degenerate further, as documented in the internal study "Adversarial Harmful Networks: The India Case Study."
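
For context, a "language classifier" here is simply a text-classification model trained separately for each language. Below is a minimal sketch of one in Python using scikit-learn; the training examples and labels are invented for illustration, and a real system would need large, carefully labelled datasets in *each* language it must cover, which is exactly what the papers say was missing.

```python
# A toy text classifier: the general shape of a "language classifier"
# for hate speech. The training data is tiny and invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "I hope you all have a great day",
    "what a lovely photo",
    "those people are vermin and should disappear",
    "we must drive them out of our country",
]
labels = [0, 0, 1, 1]  # 0 = benign, 1 = hateful (toy labels)

# Bag-of-words features feeding a linear classifier: the simplest
# workable design, and one that must be rebuilt per language.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["they are vermin"]))  # likely flagged as hateful
```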

In 2018, Facebook made several adjustments to the News Feed. Previously, the feed showed recently posted content from the accounts we follow; after the change, it began to favour the more contentious posts (those with more likes and comments). Furthermore, the algorithm ranks the posts we engage with most at the top, so we mostly see opinions on the issues we already prefer, which can shape our minds. When the algorithm displayed recent posts at the top, we frequently received opposing viewpoints on a variety of topics. Now that everything is matched to our preferences, a social media bubble is generated, as portrayed in the Netflix documentary "The Social Dilemma." A rough sketch of this kind of engagement-based ranking is shown below.
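
To make the difference concrete, here is a minimal sketch in Python of the two ranking strategies described above. The Post fields and scoring weights are my own illustrative assumptions, not Facebook's actual formula.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    likes: int
    comments: int
    age_hours: float  # how long ago the post was published

def rank_by_recency(posts):
    """Pre-2018 style: newest posts first, regardless of engagement."""
    return sorted(posts, key=lambda p: p.age_hours)

def rank_by_engagement(posts):
    """Post-2018 style: posts that provoke the most reactions rise to the top.
    The weights are illustrative guesses, not Facebook's real formula."""
    def score(p):
        # Comments weighted above likes: they signal stronger,
        # often more heated, engagement.
        return p.likes + 3 * p.comments
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("friend", "Holiday photos", likes=12, comments=1, age_hours=1),
    Post("page", "Outrage-bait headline", likes=250, comments=400, age_hours=30),
    Post("group", "Calm policy discussion", likes=40, comments=8, age_hours=5),
]

print([p.text for p in rank_by_recency(feed)])
print([p.text for p in rank_by_engagement(feed)])
# The contentious post jumps to the top under engagement ranking,
# even though it is the oldest item in the feed.
```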

As we can see, Facebook prioritises posts that are more disputed and popular, which can have a harmful impact on young people and the general public. This behaviour is not unique to Facebook: it is evident across numerous social media platforms, including Twitter, YouTube, and Reddit, not least because it plays on the human tendency to gravitate toward more controversial and contested topics.

We may now believe that building such algorithms was a poor idea in the first place. But algorithms were built on the basis of human nature to fulfil a specific goal: you see what you want to see. That leads to polarisation, and polarisation leads us deeper into conspiracy theories. Inside the bubble there is no opportunity for a counter-opinion, since there is no one left to dispute it. The toy simulation below shows how quickly such a loop can close.
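
Here is that loop as a toy simulation, entirely my own construction: a recommender nudges the feed toward whatever the user clicked last, and a mild initial preference hardens into a one-sided feed.

```python
import random

random.seed(42)

p_show_a = 0.5        # share of the feed showing topic A; start balanced
learning_rate = 0.05  # how aggressively the recommender adapts (assumed)
user_pref_a = 0.6     # a user with only a mild preference for topic A

for step in range(200):
    shown_a = random.random() < p_show_a
    # The user is slightly more likely to click content matching their view.
    clicked = random.random() < (user_pref_a if shown_a else 1 - user_pref_a)
    if clicked:
        # Reinforce whatever earned the click.
        p_show_a += learning_rate * (1 if shown_a else -1)
        p_show_a = min(max(p_show_a, 0.0), 1.0)

print(f"Share of feed showing topic A after 200 steps: {p_show_a:.2f}")
# A mild 60/40 preference typically drifts toward a nearly one-sided feed:
# the bubble closes without the user ever asking for it.
```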

The papers also suggest that Facebook has contributed to girls' body-image difficulties. According to the company's internal research, one in three teenage girls developed anxieties about her body shape and appearance after joining the platform.

So, what are the remedies for all this?

The most important solution is for Facebook to act on these algorithms and on its work plans as soon as feasible, because lives are on the line. What we can do on our own is stay conscious and aware while scrolling through social media: what we see is based on our likes and comments, and day by day we can fall into the rabbit hole of a conspiracy theory, which can disrupt our mental health and breed anger toward a specific group of people or viewpoints. Always be aware, because without realising it, we can end up as puppets of these algorithms.

--

Siddhesh Bangar

Computer Science Student @Trinity College Dublin | Exploring Jobs in Tech | Queries 👉 siddheshb008@gmail.com