Advertisers Drool as Hoaxers Rule

Understanding Facebook’s ‘fake news’ epidemic

Aceel
Dec 12, 2016

In traditional broadcast and print journalism, human gatekeepers controlled the flow of news and information, and they were heavily criticized for failing to serve objective news. Then came the Journalism Code of Ethics, which served to ensure the accuracy and fairness of information that was published or broadcast (SPJ, 2014).

When the internet came along, data-driven journalism arrived with the promise of facilitating the flow of unbiased and accurate information (Lotan, 2013, p. 105). The editorial torch was then passed from human gatekeepers to algorithmic ones when social media startups, such as Facebook, began giving people reasons to consume media by being online all the time. Over the course of eight years, Facebook went from zero to a billion users. The most significant thing it built was the News Feed, which quickly changed from a simple way to read posts from your friends into a stream curated by a much more complicated algorithm that optimizes for ‘engagement.’

Recent studies have shown that 66% of Facebook users get news from the site (Gottfried & Shearer, 2016), making the News Feed the primary driver of traffic to news sites. Facebook’s ecosystem is constructed to thrive on a positive feedback loop of ‘Likes’ and ‘Shares.’ The more a headline is liked and/or shared, the more it goes viral, spreading to more users’ feeds and earning more impulsive Likes. This ‘virality’ is converted into ad dollars that go straight into the media outlets’ pockets. This economic model has attracted several money-hungry minds to create their own media outlets (Facebook pages, news websites…) and spread misinformation under sensationalist headlines that would inevitably be favored.
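
To make this feedback loop concrete, here is a purely illustrative Python sketch. The growth rule and every number in it are assumptions chosen for demonstration, not Facebook’s actual ranking model; the point is only that a small edge in per-view engagement compounds into a large gap in reach.

```python
# Toy model of an engagement-driven feedback loop (illustrative only).
# The linear "engagement buys more reach" rule and all numbers are assumptions.

def simulate_virality(initial_reach, engagement_rate, rounds):
    """Each round, a share of viewers engage, and that engagement buys more reach."""
    reach = initial_reach
    total_engagement = 0
    for _ in range(rounds):
        engagement = reach * engagement_rate   # Likes/Shares earned this round
        total_engagement += engagement
        reach += engagement * 5                # assumed: each interaction exposes ~5 new feeds
    return round(reach), round(total_engagement)

# A sensationalist headline (10% engagement) vs. a sober one (2%), same starting audience.
print(simulate_virality(1000, 0.10, 5))   # clickbait-style post
print(simulate_virality(1000, 0.02, 5))   # measured, accurate post
```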

Of course, Facebook was accused of having a flawed algorithmic system that allows ‘fake news’ to outperform real or mainstream news stories on its platform. However, the ‘trending’ of these stories is not a glitch in the system (Silverman, 2016), but rather the Like Economy working at peak efficiency.

How harmful could that be? Does Facebook’s Like Economy pose any ethical challenges?

Technically, Facebook is “not a media company”, as Zuckerberg said when asked whether Facebook intends to become a news editor (Segreti, 2016). But Facebook makes editorial decisions all the time. Besides curating users’ News Feeds according to their interests, or at least according to what the algorithm classifies as their interests, these decisions also steer people toward one-sided, sensationalist, and inaccurate information. The fact that these decisions are made by algorithms rather than human editors doesn’t make Facebook any less responsible for harming its users by infringing on the key principles of balance, objectivity and accuracy.

Understanding the algorithmic machine

Algorithms are formulas that transfer and re-order data (Gillespie, 2012, p. 1) based on ‘relevance.’ Even though algorithmic media comes with the promise of strengthening impartiality in news gathering, it fails to do so because there is no “independent metric for what actually are the most relevant search results for any given query” (Gillespie, 2012, p. 9). Zuckerberg attempts to simplify things by saying, “A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.” The phrases ‘may be’ and ‘right now’ reveal that Facebook evaluates relevance, at least in part, on probability and immediacy.
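
As a thought experiment, that ‘probability plus immediacy’ reading of relevance can be sketched in a few lines of Python. The formula, the decay constant, and the predicted-interest input are assumptions made for illustration; they are not Facebook’s real scoring function.

```python
# Illustrative relevance score: probability a user cares, discounted by the post's age.
# The exponential decay and the six-hour constant are assumptions, not Facebook's model.
import math
import time

def relevance_score(predicted_interest, posted_at, now=None, decay_hours=6.0):
    """Combine the estimated probability of interest (0..1) with how recent the post is."""
    now = now or time.time()
    age_hours = (now - posted_at) / 3600.0
    recency = math.exp(-age_hours / decay_hours)   # older posts count for less
    return predicted_interest * recency

# The squirrel on your street an hour ago can outrank weightier news from yesterday.
print(relevance_score(0.9, time.time() - 3600))    # local, recent, probably interesting
print(relevance_score(0.6, time.time() - 86400))   # important, but a day old
```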

The Journalism Code of Ethics states the following: “Remember that neither speed nor format excuses inaccuracy” (SPJ, 2014). Moreover, in The Handbook of Mass Media Ethics, Ward says that un-editorialized, impartial and factual news is what constitutes journalistic objectivity (2009, pp. 73–74). But what does Facebook have to do with any of that? Again, Facebook is not a media company.

Granted, Facebook should not be held accountable for a news editor’s decision to write up a fake story with a CAPITALIZED ATTENTION-GRABBING TITLE and post it on the feed. What does the damage, however, is the combination of Facebook’s Like Economy, which implicitly encourages the spread of inaccurate news, and its algorithmic system, which places these stories on people’s News Feeds precisely because they are likely to generate immediate interaction and the quantifiable engagement that is sold to advertisers.

“The new media value immediacy, interactivity, sharing and networking, limited editorial checks and gatekeeping, and the expression of bias or opinion in an often ‘edgy’ manner” (Ward, 2009, p. 76).

This topic was brought to light especially after the results of the 2016 U.S. election, when several news sites, bloggers and users pointed their fingers at Facebook for affecting the outcome of the presidential race. A recent analysis shows that Facebook has done little to fight the spread of misinformation on its platform; its algorithms have actually amplified engagement with this content among users (Silverman, 2016). Enough people liked, shared and clicked through the posts, which in turn generated significant profits for their creators.

The Washington Post recently covered the people behind the Liberty Writers news website (McCoy, 2016). Paris Wade and Ben Goldman run the site together, writing inaccurate news stories and sharing them on their increasingly popular Facebook page. Both were minimum-wage workers before they launched the site, and in a matter of months Liberty Writers went from nothing to influencing millions of people, making big profits in the process.

After keen observation, Wade and Goldman realized that Trump-related news received the most attention, so they decided to give the audience what it wants: more Trump, under headlines framed to attract as many clicks as possible (McCoy, 2016). Some of their articles do get banned, Goldman says, not because they are false, but because Facebook’s spam filter blocks articles that are shared too rapidly, on the assumption that they are spam. As the American political situation grew more dramatic, Liberty Writers’ audience expanded tremendously. The website became a money-making machine by “running advertisements that, among other things, promised acne solutions, Viagra alternatives, ways to remove lip lines, cracked feet, ‘deep fat’, and ‘the 13 sexiest and most naked celebrity selfies’” (McCoy, 2016).
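
The filter Goldman describes is, in effect, a rate limit that never considers accuracy. A minimal sketch of that kind of heuristic might look like the following; the sliding window and the share threshold are invented for illustration and are not Facebook’s actual values.

```python
# Toy rate-based spam heuristic: flag a link if it is shared "too fast,"
# regardless of whether the story is true. Window and threshold are assumptions.

def is_flagged_as_spam(share_timestamps, window_seconds=3600, max_shares=10000):
    """Return True if more than max_shares shares fall inside any rolling window."""
    times = sorted(share_timestamps)
    start = 0
    for end, t in enumerate(times):
        while t - times[start] > window_seconds:
            start += 1                      # slide the window forward
        if end - start + 1 > max_shares:
            return True                     # shared too quickly: treated as spam
    return False
```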

One can’t help but question whether Goldman and Wade, or other online hoaxers, are influencing their audiences beyond the web purely out of profit-seeking motives. The problem here, though, is not their motives, but the structure of Facebook itself. Millions of people were bombarded with emotionally charged news about the candidates they supported because that’s what the algorithms understood they were seeking. As these people engaged with those stories, they were served even more of them, spinning the feedback loop ever faster and allowing the stories to reach the top of Facebook’s ‘Trending’ list on several occasions (Herrman, 2016). Facebook’s structure has therefore affected how we engage with content. But has it changed how we make decisions, or even how we think?

In light of the protests and hate crimes erupting across the United States, and knowing that Facebook is an accomplice in the perpetuation of fake news, the dangers these stories pose must be addressed. Americans use Facebook to inform their view of America and the world, and because this service has been poisoned, their view has been poisoned as well. It is easy to succumb to an algorithmic equation that exploits people’s natural tendency to read news that confirms their beliefs, regardless of the evidence. Because they feel uncomfortable when exposed to media that pushes back on their perspectives, they usually end up avoiding it. The unbalanced media diet they are being served is not properly feeding them; it is skewing their view towards a one-sided, biased and downright illegitimate partisan perspective, which in turn affects their actions.

Media outlets, news editors and advertisers are not the only ones benefiting from this system that learns from its users; Facebook benefits as well, with a growing market value of over 350 billion U.S. dollars (La Monica, 2016). The distinction between ‘tech’ and ‘media’ companies is unhelpful here because there has simply never been a company like Facebook before, able to singlehandedly distribute and filter information to over a billion people every single day. That being said, Facebook has the social and ethical responsibility to minimize harm to its customers, just like any other company. It must take its self-proclaimed title as a ‘tech company’ seriously (Segreti, 2016) and recognize that it “hasn’t [only] led to empowerment of the average citizen, but empowerment of professional propagandists, fringe elements and conspiracy theorists” around the globe, who have affected the outcomes of presidential elections and political campaigns and fueled populist movements worldwide (Mozur & Scott, 2016).

A Community of Many vs. a Community of One

Is this economically structured News Feed all that bad?

Sure, the ‘fake news’ problem is a big deal, but it’s only one problem. The core motivation behind the News Feed is to keep users engaged by giving them more freedom and control over the platform and over the information that is relevant to them (Gillespie, 2014). The goal is to get more people spending time reading and interacting with their News Feeds, which in turn begets more opportunities for marketers to reach and engage with their audiences, as previously stated. Even assuming these algorithms succeed at being keyed to relevance, the problem that remains is that they are not showing users what could be considered important or challenging to them. One of the key elements of a functioning democracy is a balanced flow of information (Ward, 2011, p. 89). The fact that users aren’t challenged with news and posts that oppose their perspectives and beliefs defeats the purpose of Facebook: connecting people and creating a web-like community. Facebook cannot do that if it isolates its users in their own ideological bubbles.

No one deserves to live in this ignorant, virtually augmented reality, where users can’t tell the difference between what is real and what is fake, and where they have no authorship or control over what they see on their News Feeds. First, we, as users, need to be more cognizant of the ways this software influences our behavior. As for Facebook, it should do more to prioritize posts that come from verified sources and to de-prioritize or flag sites that peddle fake news; this editorial process should look beyond engagement metrics in order to be neutral. Further, since Facebook knows exactly how much time we spend consuming media on its platform, it also knows how partisan we are likely to be. It can and must allow space for sources with opposing views to enter our News Feeds; these could be identified by studying the demographics and interests of our Friends lists. Finally, Facebook should be more open about how its algorithm editorializes the type of content we see. Being transparent about this methodology would reduce claims of partisanship and bias, offer users more control over the information they consume, and (hopefully) put an end to the toxicity caused by fake news and hoaxers.
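
For illustration, the kind of re-ranking the previous paragraph proposes might look something like the sketch below. The field names, weights, hoax-domain list, and the idea of reserving a couple of top slots for outlets outside the user’s usual leaning are all hypothetical; none of this is an existing Facebook mechanism.

```python
# Hypothetical re-ranking pass: boost verified sources, demote known hoax domains,
# and reserve a few top slots for outlets that lean away from the user.
# All field names, weights, and lists are assumptions made for illustration.

VERIFIED_BONUS = 0.3
HOAX_PENALTY = 0.6

def rerank_feed(posts, hoax_domains, user_leaning, opposing_slots=2):
    def adjusted(post):
        score = post["engagement_score"]          # what the feed optimizes for today
        if post["source_verified"]:
            score += VERIFIED_BONUS               # reward verified outlets
        if post["domain"] in hoax_domains:
            score -= HOAX_PENALTY                 # demote known hoax peddlers
        return score

    ranked = sorted(posts, key=adjusted, reverse=True)
    # Guarantee a little exposure to sources the user would not normally see.
    opposing = [p for p in ranked if p["leaning"] != user_leaning][:opposing_slots]
    rest = [p for p in ranked if p not in opposing]
    return opposing + rest
```

Placing the opposing-leaning items first is a deliberately blunt choice; a real system would more likely interleave them, but the sketch keeps the idea of guaranteed exposure visible.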

“A less-toxic Facebook is doable. A less-toxic Facebook is crucial. A less-toxic Facebook is the absolute least you should demand from the people it’s made rich, because, with no great exaggeration, the ability to deliberately confuse tens of millions of American voters in exchange for banner ad revenues is a crisis.” — Sam Biddle.

Bibliography:

Society of Professional Journalists. (2014). SPJ Code of Ethics. Retrieved from http://www.spj.org/pdf/spj-code-of-ethics.pdf

Lotan, G. (2013). Networked audiences. In McBride, K. & Rosenstiel, T. (Eds.), The New Ethics of Journalism (pp. 105–199). London, U.K.

Gottfried, J. & Shearer, E. (2016, May 26). News use across social media platforms 2016. Pew Research Center Journalism & Media. Retrieved from http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016/

Silverman, C. (2016, Nov. 17). This analysis shows how fake election news stories outperformed real news on Facebook. BuzzFeed News. Retrieved from https://www.buzzfeed.com/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook?utm_term=.vyY9EpmObw#.pkvWxlrYNk

Segreti, G. (2016, Aug. 29). Facebook CEO says group will not become a media company. Reuters. Retrieved from http://www.reuters.com/article/us-facebook-zuckerberg-idUSKCN1141WN

Gillespie, T. (2012). The relevance of algorithms. In Boczkowski, P. & Foot, K. (eds.) Media Technologies (pp. 1–32). Cambridge, MA: MIT Press

Ward, S. J. A. (2009). Truth and objectivity. In Wilkins L. & Christians C. G. (eds.) The Handbook of Mass Media Ethics (pp. 71–83). New York, NY: Routledge

McCoy, T. (2016, Nov. 20). For the ‘new yellow journalists,’ opportunity comes in clicks and bucks. The Washington Post. Retrieved from https://www.washingtonpost.com/national/for-the-new-yellow-journalists-opportunity-comes-in-clicks-and-bucks/2016/11/20/d58d036c-adbf-11e6-8b45-f8e493f06fcd_story.html?utm_term=.f1cfe96fbf28

Herrman, J. (2016, Aug. 24). Inside Facebook’s (totally insane, unintentionally gigantic, hyperpartisan) political-media machine. The New York Times Magazine. Retrieved from http://www.nytimes.com/2016/08/28/magazine/inside-facebooks-totally-insane-unintentionally-gigantic-hyperpartisan-political-media-machine.html?_r=0

La Monica, P. R. (2016, Apr. 28). Why Facebook could one day be worth $1 trillion. CNN Money. Retrieved from http://money.cnn.com/2016/04/28/investing/facebook-trillion-dollar-market-value/

Mozur, P. & Scott, M. (2016, Nov. 17). Fake news in U.S. election? Elsewhere, that’s nothing new. The New York Times. Retrieved from http://www.nytimes.com/2016/11/18/technology/fake-news-on-facebook-in-foreign-elections-thats-not-new.html

Gillespie, T. (2014). Facebook’s algorithm — why our assumptions are wrong, and our concerns are right. Culture Digitally. Retrieved from http://culturedigitally.org/2014/07/facebooks-algorithm-why-our-assumptions-are-wrong-and-our-concerns-are-right/

Ward, S. (2011). Freedom of speech and deliberative democracy. Ethics and the Media (pp. 88–117). Cambridge, United Kingdom: Cambridge University Press
