Why I Quit All Facebook-Owned Apps — Part Two
This is Part 2 of a three-part story; if you haven't read the first one, I invite you to check it out here.
I am listing the reasons why I quit all Facebook-owned apps last July. The previous article covered privacy issues with Facebook products; this one focuses on Facebook's political involvement.
It’s a political game
I am sure I'm not the only one who has become more and more aware of the global rise of extremism on social media. It really is an issue, and if you don't understand how a company like Facebook could be to blame here, let's take a walk.
Mark Whitehead says in his article Why people leave Facebook — and what it tells us about the future of social media:
The number of active users of Facebook (those people who have logged onto the site in the previous month) has reached a historic high of 2.45 billion. To put this in some context, approximately 32% of the global population now use the social media platform, and the trend line of participation is still going up.
This alone should help you realize the influence such a platform can have on the population. Just think about the power of its reach and the amount of data it collects!
“It means that companies can very accurately predict our behaviour or predict how we can be persuaded to make one decision or another. That means that we’re actually losing a little bit of control of our daily freedom.” — Brittany Kaiser
Facebook is aware of its influence but chooses to play dumb. In 2018, Facebook ignored warnings from an internal presentation about research on its own algorithm, which was found to "exploit the human brain's attraction to divisiveness"; "if left unchecked," Facebook would serve users "more and more divisive content in an effort to gain user attention & increase time on the platform".
When content moderation and fact-checking are not taken seriously, we end up with a platform that becomes a nest for extreme discourse and a catalyst for circulating lies, one that allows (and even promotes) everything from abusive language to gun violence.
Keep this in mind as we go along.
“Facebook is in the awkward position of having to explain why they think they drive purchase decisions, but not voting decisions.” — Casey Newton
The Brexit situation
I remember vividly how the Brexit referendum went down. I had been living in London for the past 4 years and working as a carer for a year.
I got to see how confused people were about the whole campaign. As an outsider allowed into a variety of Brits' homes and daily lives, it was very interesting to witness such a pivotal political moment from the inside. Nobody knew what was going on. There was something terrifying about it: I could feel that what was happening was very wrong but couldn't really understand why.
It turns out Facebook had a role in this general confusion, as it spread disinformation and fuelled hatred. Ads were broadcast to Facebook users to convince them to vote Leave, such as the now-infamous lie "We send £350 Million to the EU every week".
Now, if you know the definition of the word propaganda, you’ve already identified what was going on. If you do not, let me remind you quickly:
Propaganda: information, especially of a biased or misleading nature, used to promote a political cause or point of view.
Misinformation perpetuated by Facebook has severe political and social consequences: it is, in fact, worrying for democracy and the world as we know it. Journalist Carole Cadwalladr has been talking about it for a while now (If you're not terrified about Facebook, you haven't been paying attention), and I invite you to watch her TED Talk about Facebook's role in Brexit:
“And maybe you think, “Well, it was just a few ads. And people are smarter than that, right?” To which I would say, “Good luck with that.” Because what the Brexit vote demonstrates is that liberal democracy is broken. And you broke it. This is not democracy — spreading lies in darkness, paid for with illegal cash, from God knows where. It’s subversion, and you are accessories to it.” — Carole Cadwalladr
Other instances of Facebook’s political influence in the world
Brexit is just one case that I chose to present in this article to briefly explain how Facebook interferes with political matters. Unfortunately, there have been consequences all around the world, and I'm going to list a few below.
How Facebook influenced the Egyptian revolution
Watch Wael Ghonim’s TED Talk where he explains how Facebook was instrumental in the Egyptian revolution of 2011 and asks “What can we do about online behavior now? How can we use the Internet and social media to create civility and reasoned argument?”
Wael Ghonim brings up great points about social media being designed to spread and confirm people's biases; muting, un-following, and blocking can be used to reject any opinion we disagree with, locking us into our own personal echo chambers. This is why online discussions so often "quickly descend into angry mobs": polarization means we, as a society, are no longer on the same page about what is true or real. Social psychologist Arie Kruglanski warns that this fracture in our shared understanding of reality is dangerous and a symptom of societal psychosis.
“Today, our social media experiences are designed in a way that favors broadcasting over engagements, posts over discussions, shallow comments over deep conversations. It’s as if we agreed that we are here to talk at each other instead of talking with each other.” — Wael Ghonim
Violence in Myanmar
In 2018, the UN accused Facebook of being "slow and ineffective" in its response to the spread of hatred online. Facebook admitted that it had failed to moderate hate speech against the Rohingya and anti-Muslim content, effectively admitting that the platform helped spread misinformation that ultimately contributed to real-world consequences: state violence against the Rohingya and the displacement of 700,000 people.
Alex Warofka, Facebook's product policy manager, said: "We agree that we can and should do more" and promised to "get it right" before the 2020 elections.
Update: Following Myanmar's military coup on February 1, 2021, Rafael Frankel, Facebook's director of public policy for the Asia-Pacific region, announced measures to put "the safety of people in Myanmar first and [remove] content that breaks rules on violence, hate speech and harmful misinformation".
WhatsApp and the 2019 Indian Election
Just as fake news and propaganda are spread on Facebook, they are also spread through WhatsApp. In India, rumors circulating on WhatsApp and social media fuelled mob attacks that killed at least 31 people in 2017 and 2018.
As WhatsApp is the most popular messaging platform in India, both the Bharatiya Janata Party (BJP) of Prime Minister Narendra Modi and the opposition Congress party have seen its potential to influence 900 million voters. Both parties have been accused of spreading false or misleading information online.
What makes this stream of misinformation more dangerous is the privacy of the platform: false information and propaganda shared behind closed doors are less likely to be identified and reported.
Facebook's influence on Donald Trump's election and the attempted coup
Of course, U.S. politics has not been protected from political events being swayed by social media influence. Fake news and pro-Trump ads ran on Facebook in 2016, and the most surprising aspect of this story is that more than 100 of these pro-Trump websites were created and maintained by Macedonian teenagers who just wanted to make money.
Zuckerberg denied Facebook's involvement in the election, claiming that people made their choice based on "lived experience". He maintained that Facebook's algorithm is intended to reflect "what people want" and what is "meaningful and interesting to them", which is exactly the issue.
Since Donald Trump's election, Facebook has often been criticized for not taking responsibility for enabling hate and violence. In November 2020, groups like "Joe Biden Is Not My President" and "Red State Secession" started showing up, and on January 6, 2021, Trump supporters stormed the US Capitol in an attempted coup. Once again, this is the result of the divide in our shared perception of reality and the failure to moderate hate speech and misinformation.
Facebook and Twitter have since suspended Donald Trump's accounts to ensure a peaceful transition of power after the event sparked outrage.
The bottom line
Why is Facebook fueling discord? Because people are more likely to spend time on the platform if the content is hyper-personalized to reinforce their own opinions. The global trend toward division is a symptom of Facebook's policies, not the other way around. In the past, Facebook has argued that it is a tech company, not a media company, meaning it doesn't feel it has a responsibility to edit content.
In Part One, I asked if you thought Facebook was sustainable. After reading Part Two, I’d like you to ask yourself again if you believe Facebook can go on the way it has.
What do you think?
These are only a few examples, a mere glimpse into the influence of social media on our societies. But I'm hoping they give you some things to think about, and maybe we can all connect through the will to change the systems in place. What steps are you taking to hold Facebook, or other social media platforms, accountable? How are you planning on making a change?
Tell me your story in the comments; find me on Twitter @D_Brunetiere or connect with me on LinkedIn. Let’s talk!
“Ethics cannot be a side hustle.”― Mike Monteiro, Ruined by Design: How Designers Destroyed the World, and What We Can Do to Fix It
Don’t take my word for it
As a TL;DR I encourage you to watch Hasan Minhaj’s Patriot Act episode on Moderation and Free Speech:
Below are a few other resources if you want to go further:
Ledger of Harms