We Need to Start Holding Facebook Accountable for the Current State of Political Discourse

Owen Wade
Published in The Graph
Aug 16, 2017

Facebook frequently tinkers with its newsfeed algorithm. Unfortunately, these updates are typically opaque, under-the-hood changes that can significantly affect the kind of content users are exposed to on a daily basis. Recently, I have noticed an influx of inflammatory comments on news articles while using Facebook’s iOS mobile app (version 136.0). One quick note: Facebook in the browser handles newsfeed comments differently.

It is also difficult to find documentation of algorithmic changes to the newsfeed. Facebook’s developer site offers a changelog, but it is mostly dedicated to bug fixes and technical updates. Currently, Facebook’s mobile app displays only one featured comment per shared story; additional comments appear only when you expand the comment section. Below are some screenshots of news articles paired with their featured comments.

Screenshots taken 08/12/2017

It’s no secret that Facebook primarily caters to advertisers, and it can be difficult for average users to find information about how the newsfeed algorithm works. A MacRumors.com user who identifies as a Release Engineer at Facebook (unverified) suggests:

“Release notes are useful for small applications with a few changes each release but are useless for large, complex applications with hundreds of developers. We’re not trying to keep secrets from you. There are just simply better ways of telling you what’s interesting when those features are ready for you.”

When trying to find up-to-date information about Facebook’s newsfeed algorithm, your best bet is to check the Facebook Business site’s “Help Community.” Here I found a post from 2016 about negative reviews on a Facebook business page:

“When I visit my business’s Facebook Page on a mobile phone, the first two reviews to pop up are negative reviews despite my page receiving an overwhelming majority of positive views. They aren’t the most recent, and they don’t have the most likes. Can someone please explain this to me? Obviously no one likes bad reviews, but I feel that in this instance it is particularly unfair for two of the few negative reviews to appear first.”

A member of the Facebook help team responded to the inquiry explaining:

“The reviews that receive the highest amount of engagement will be displayed first. The higher the response, the higher it will be displayed on the list.”

This comment chain appears to explain the phenomenon I have been encountering. Facebook seems to be surfacing comments on news articles the same way it surfaces business reviews. By using this same (or a similar) algorithm, it creates an environment in which the most inflammatory comments rise to the top. In effect, Facebook has amplified some of the site’s worst comments and coupled them with solid reporting, so that they are essentially unavoidable when using the mobile app’s newsfeed. This systemic issue appears to embolden internet trolls and bigots: many of the featured comments seem intentionally incendiary, exploiting the concept of “engagement” in order to maximize visibility. I have observed that most (if not all) comments paired with news articles have more replies than any other comment on the story, which suggests that a comment’s “likes” are a complete non-factor.
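The ranking behavior described above can be sketched in a few lines. This is purely a hypothetical illustration of the inference made in this article, not Facebook’s actual code: the field names, the data, and the assumption that replies alone drive the ranking (with likes ignored) are all my own.

```python
# Hypothetical sketch of an engagement-first featured-comment picker.
# ASSUMPTION (inferred from observation, not documented by Facebook):
# reply count is the engagement signal, and likes are a non-factor.
from dataclasses import dataclass


@dataclass
class Comment:
    text: str
    likes: int
    replies: int


def featured_comment(comments):
    """Return the comment such a feed would surface first:
    the one with the most replies, regardless of likes."""
    return max(comments, key=lambda c: c.replies)


comments = [
    Comment("Thoughtful analysis of the policy.", likes=120, replies=2),
    Comment("Inflammatory hot take.", likes=5, replies=48),
]

# The reply-magnet wins, even though it has far fewer likes.
print(featured_comment(comments).text)
```

Under this model, the incentive is obvious: a comment inflammatory enough to provoke dozens of angry replies will always beat a well-liked but uncontroversial one.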

John Naughton, a reporter for The Guardian writes:

“There are about 3.4bn users of the internet worldwide. Facebook has now nearly 2bn users, which comes to around 58% of all the people in the world who use the network. It was inevitable therefore that it too would become a mirror for human nature — and that people would use it not just for good purposes, but also for bad.”

screenshot taken 08/14/2017

For better or worse, Facebook is an important space for contemporary political discourse. Unfortunately, the site’s underlying design creates an incendiary environment for both users and journalists. Whatever the intention, the current algorithm elevates reactionary clickbait in the form of featured article comments, and it degrades the newsfeed’s ability to facilitate thoughtful discussion and deliver insightful journalism. Right now, Facebook’s advertising objectives are at odds with hosting a hospitable forum for thoughtful discussion: design decisions that elevate the worst comments degrade conversations as a whole, and good journalism is automatically paired with toxic, inane nonsense. It’s about time Facebook acknowledged its complicity in providing a space for this hateful rhetoric.
