“Fake News” is a Feeling

Facebook says it wants to ensure the news its users see is “high quality.” That may be just another term for “feel-good”

Luke Stark
Berkman Klein Center Collection
3 min read · Feb 9, 2018

--

Last week Facebook announced it would change the composition of its News Feed to combat misinformation and so-called “fake news.” Alongside prioritizing posts from friends and family, the social networking platform also announced it would ask users to identify their level of trust in news sites, in order to incorporate this collective judgment into the ranking algorithms shaping what personalized content a user sees. Facebook’s ends are potentially laudable, but the company’s means are suspect: in relying on its users’ judgments about the trustworthiness of news sources, the site risks deepening a growing confusion between material that’s factually informative and material that resonates with users’ positive and negative feelings — without necessarily containing “high quality” facts.

Facebook’s users are people like you and me, and we’re not always good at distinguishing feelings from facts. Psychological studies have shown that in certain circumstances, strong emotions directed towards others, like gratitude or anger, lead people to be more trusting even when the emotion has nothing to do with the trust relationship. Deliberately activating strong emotions is also a longstanding hallmark of propaganda campaigns, and much of the misinformation and disinformation now termed “fake news” does just that. In a recent report, Caroline Jack of the Data & Society Research Institute notes that “whether a persuasive campaign is publicity or propaganda… is largely a matter of perspective.” Emotional and visceral appeals are rife in today’s digital media ecosystem — techniques like catchy click-bait headlines, direct “hard-sell” personal appeals, and inaccurate yet emotionally arousing claims. News sites, advertising firms, and domestic political groups like the NRA deploy the same methods used by online propagandists working for state actors such as Russia.

In this environment, it would be hard under the best of circumstances for a platform like Facebook to sort out factually high-quality news sources without making editorial decisions about content — a step the company is loath to take. Yet Facebook’s business model is itself as reliant on emotional appeals as any of the third parties whose content it hosts. Facebook benefits from the spread of “fake news” on the site because “fake news” is actually about the real feelings of its users: when riled up by a political advertisement or tickled by a cute cat video, users are more inclined to share, like, or comment on a post, spend more time on the site, and interact more with those in their social network. Facebook’s engineers are constantly working to optimize the design of the site to nudge us towards more of these social and emotional interactions — without “fake news,” Facebook’s business model, like that of other social media platforms such as YouTube and Twitter, would look much less rosy.

The survey Facebook will deploy to determine the trustworthiness of news sources consists of just two questions: whether a user has heard of a news site, and how much the user trusts it, on a scale from “entirely” down to “not at all.” In the context of Facebook’s emotion-saturated ecosystem, this survey does nothing to distinguish between sources trusted because of their content and sources that align with a user’s strong feelings, positive or negative. Facebook has taken a number of steps over the years to gather granular data about its users’ feelings: among the biggest was the introduction of Reactions icons in early 2016, which supplemented the Like button. By combining Reactions data with other information the site collects about posting frequency and other behaviors, Facebook can determine what — and who — gets its users excited or angry, and when. This information is invaluable for the platform’s advertising business and for the publishers that use the site to reach its global audience — and thus is invaluable to Facebook as well.
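To make the problem concrete, here is a minimal sketch, in Python, of how two-question trust responses might be folded into a feed-ranking score. It is purely illustrative: Facebook has not published its ranking code, and every name, scale mapping, and weight below is an assumption.

```python
from statistics import mean

# Hypothetical mapping of trust-survey answers onto numbers;
# the real scale values and weights are not public.
TRUST_SCALE = {
    "entirely": 1.0,
    "a lot": 0.75,
    "somewhat": 0.5,
    "barely": 0.25,
    "not at all": 0.0,
}

def source_trust_score(responses):
    """Average trust among users who have heard of the source.
    Users who haven't heard of it contribute nothing, so the score
    reflects the feelings of the source's existing audience."""
    rated = [TRUST_SCALE[answer] for heard, answer in responses if heard]
    return mean(rated) if rated else 0.0

def rank_score(engagement, trust, trust_weight=0.3):
    """Blend an engagement signal (shares, Reactions, comments) with
    the survey-derived trust score. Neither input distinguishes
    'factually reliable' from 'emotionally resonant'."""
    return (1 - trust_weight) * engagement + trust_weight * trust

# A source that reliably riles up a loyal audience scores well on
# both axes, even if its content is factually thin.
partisan_site = [(True, "entirely"), (True, "entirely"), (False, "not at all")]
print(rank_score(engagement=0.9, trust=source_trust_score(partisan_site)))
# -> 0.93
```

The sketch makes the confusion visible: averaging self-reported trust rewards sources whose existing audiences feel strongly about them, which is exactly the distinction the survey is supposed to help draw.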

Facebook should be applauded for making the effort, but the site needs to do better. Until the platform and others like it recognize “fake news” is as much about the feelings of their users as it is about news — and that they profit from emotionally driven misinformation and propaganda — our “fake news” problem is only going to get worse.
