CONFIRMED — Facebook is a Cancer to Society

Srinthan Hampi · Published in Kubo · 5 min read · Oct 6, 2021

The world was shocked last year by the release of ‘The Social Dilemma’, a Netflix documentary that delved deep into the false realities created and enforced by social media and its algorithms. These algorithms bombarded users with content that exploited their human vulnerabilities to fear and hate. According to the slew of damning allegations made by the documentary, social media giants found users more engaged, and therefore more profitable, when exposed to content, news and media that enraged or frightened them. A wave of fresh scrutiny was unleashed on the social media giants, who largely denied this portrayal of large, evil corporations. Because, hey, no company in its right mind would ever leverage human hatred and fear for money, right?

Wrong.

Frances Haugen, a former employee at Facebook, recently came out with shocking revelations in her 60 Minutes interview about Facebook’s role in enabling hate around the planet.

Haugen exposed Facebook’s practices via a 60 Minutes Interview, followed by a senate deposition.

Haugen alleges that Facebook’s algorithms were designed to surface the kind of content its users would resonate with most — usually extremely divisive and hateful media. This content is also the most engaging, as users are consistently more enthralled by media and narratives that cause them to hate and fear sections of society.

These allegations are backed up by a trove of confidential documents and internal Facebook correspondence that reinforces the idea that Facebook prioritized profits over social harmony.

Anybody who has studied the effects of social media on individual behavior knows that Facebook was designed this way. Almost all social media platforms base their revenue on whatever model guarantees the greatest number of hits, clicks, and engagement. Even though this cavalier attitude is ever-present in the tech space, Facebook appears responsible for far more disinformation than any other social media platform.

Haugen made the same allegations at a Senate hearing that took place earlier today.

Take, for example, an average user who has signed up for Facebook and uses the platform regularly to keep in touch with his social life, as well as with current affairs and news. If the user’s activity suggests he is conservative, Facebook’s algorithm will feed him a steady stream of content designed to make him anxious, fearful, and coming back for more. The validity of the claims in that content is irrelevant here: Facebook bears no burden of factual accuracy, only the purpose of exploiting your attention for money.

What this achieves is an artificially fabricated social reality — one backed up by a steady and malicious stream of disinformation, one that builds a complex narrative of oppression, fear and hate. And the worst part: Facebook knew about it all along.

It is easy to see how entire communities of people form around lies. Flat Earthers, anti-vaxxers, QAnon conspiracy theorists and the like all exist because their beliefs are fed on a platform that actively drowns the user in information confirming their biases. If a user approaches the platform with an existing opinion that is factually incorrect, Facebook’s algorithms do their best to reaffirm those inaccurate beliefs, since this is what guarantees the user keeps coming back for more disinformation and engagement. It is therefore easy to draw a link between the worst aspects of a democratized internet and Facebook’s own actions.

According to Haugen, social media platforms are designed to keep users engaged with content that makes them fear and hate.

Haugen states in her interview that this was a conscious decision by Facebook executives — to prioritize engagement over moral righteousness and the safety of its users. Previous reports of malicious actors in Myanmar using Facebook to incite genocide had also surfaced in 2019 and 2020, enabled in no small part by the algorithms and policies Facebook itself maintained.

Furthermore, Facebook’s choices cause even more harm when you consider that many nations across the world treat ‘the internet’ and ‘Facebook’ as the same thing. In these regions, populations are helplessly and deliberately misinformed and divided by a corporation that profits off of their communal fear and hatred.

It cannot be overstated how much flak Facebook itself deserves for this. Internal memos, employee forums, and undoubtedly countless other confidential documents have revealed that Facebook knew about the effects of its algorithms. Facebook knew that entire communities were forming around false narratives built largely by its own algorithm. Facebook also knew that these communities were centered on fear- and hate-mongering, and that some of these groups were violent.

Facebook’s algorithms may have been instrumental in inciting the January 6th insurrection attempt.

To be completely honest, we can also attribute the January 6 insurrection attempt partially to Facebook’s algorithms, which likely reinforced the false beliefs many of the insurrectionists held when they stormed the Capitol Building in Washington, DC.

Facebook will be the subject of another US Senate hearing soon, which will hopefully address the leaks and allegations made by Frances Haugen. If the US Senate does its job and holds Facebook accountable for its actions, we may see the downfall of one of the largest social media giants on the planet. However, given the Senate’s history and poor grasp of anything modern, we may not see this…yet.

Project Tinker is a Bangalore-based startup aimed at giving ideators the tools they need to build amazing ideas. To learn more about our services and philosophy, visit project-tinker.com