Facebook’s business strategy: design, delay, deflect

Jessica Herbert
Reset Australia
Nov 4, 2021


From coordinated foreign disinformation campaigns to Covid-19 conspiracies, misinformation on Facebook is a persistent problem rooted in the platform’s business model, which prioritises engagement above truth.

A timeline of Facebook’s increasing influence over contemporary politics reveals that incidents of serious societal harm, such as election interference and the stoking of ethnic divisions, are not one-off events, but rather part of a repeated pattern of behaviour by Facebook.

First, the platform’s own design creates and amplifies harms, then leaders delay action to address problems despite awareness of the threat, and finally the company deflects blame in the aftermath of a crisis.

These three stages — design, delay, deflect — have become the hallmark of Facebook’s business strategy over the past decade.

Facebook’s algorithm is marketed as a tool to strengthen the connection between users by prioritising content from friends and family. However, the heavy weighting of re-shared material amplifies misinformation and hate speech while limiting the reach of authoritative news.

Internal reports uncovered by the Wall Street Journal show that in 2017, Facebook’s own researchers raised concerns about the long-term effects on democracy of publishers and political parties shifting towards divisive and misleading posts to generate reactions. Data scientists proposed solutions, which Zuckerberg ultimately rejected out of concern they would reduce engagement, leaving the platform geared towards lies and hate speech.

An internal experiment run by Facebook in 2020 concluded that content considered “bad for the world” was more likely to be seen by users. Negative content was demoted in an attempt to minimise this, but the team observed reduced engagement as a result. Facebook then implemented a second version of the algorithm change that demoted harmful content less aggressively so that time spent on the platform did not decrease, demonstrating the company’s pursuit of profit regardless of the harm users are exposed to.

An internal program known as XCheck exempts millions of high-profile users from ordinary moderation, enabling prominent political accounts and celebrities to spread misinformation without consequence. These accounts pose the greatest risks yet are the least policed, in stark contradiction to the democratic principles Facebook claims to defend.

Facebook is aware of the harms caused by its products and systems, and yet these issues face delayed and often ineffective action.

Motivated by profit and power, the company only addresses issues in the face of a PR crisis or potential commercial loss. Documents have shown that Facebook took only limited action to prevent the platform being used for human trafficking until Apple threatened to remove Facebook’s products from the App Store.

Despite political ads being a disruptive and decisive force in the 2016 US election and the Brexit referendum, Facebook kept political ads exempt from fact checking, disregarding calls from civil society and politicians. Hundreds of employees wrote a letter to Zuckerberg warning of the potential harm in upcoming elections around the world and proposing policy improvements, which were again ignored.

As misinformation spread rapidly in the lead up to the 2020 US election, Facebook chose not to proactively adjust its algorithm. Only in the aftermath did Zuckerberg approve emergency changes to prioritise authoritative news outlets, which whistleblower Frances Haugen claims were prematurely reversed in December.

The fact that Facebook has the capacity to reduce the reach of misinformation, but chooses not to, raises important questions about the company’s business model and incentives. The Capitol Hill insurrection is a stark reminder of the platform’s ability to incite real world violence and undermine democratic processes.

Fixated on growth, Facebook works to mask the extent of its problems, often denying responsibility or claiming issues have been resolved. With public messaging that contradicts its internal research, Facebook has been accused of misleading both the public and investors.

In a Congressional hearing, Zuckerberg downplayed Facebook’s role in the January riots, calling on other news providers to take responsibility and shifting blame onto the individuals who broke the law. However, the problem was not only the calls to storm the Capitol, but the broader environment of division and misinformation that Facebook has fostered for years.

Acutely aware of these issues, Facebook has tried to deceive us with its mission to connect the world, hiding its insidious impact on democracy.

Australia is not immune to the manipulation of public sentiment through mis- and disinformation, foreign interference, and targeted political ads. As Australia heads towards its next Federal election, the risks posed by Facebook to Australia’s democracy require urgent consideration.

Jessica is an intern at Reset Australia. She is currently studying a Bachelor of Arts and Bachelor of Advanced Studies (Politics and International Relations) at the University of Sydney.
