Is Facebook Suppressing Conservative News? Some Context and Questions Amid All the Innuendo

Will Rinehart
May 12, 2016 · 6 min read

On Monday, Gizmodo continued its reporting on Facebook’s Trending Topics section with allegations that the section was biased against conservative voices. The charges would be scandalous if true, but after reading the entire story and looking at the complete picture, one cannot help but take a more tempered view of Facebook’s Trending Topics section. Instead of trying to bludgeon online companies into conforming to some opaque standard of objectivity, we need to shift toward more fruitful endeavors; more education and more tools are the only way forward.

Last week, Gizmodo reported on the internal machinations behind the Trending News section on Facebook, again setting off the political bias debate. Even though stories are first suggested by algorithm, Gizmodo found that the Trending Topics list is actually curated by humans, which should hardly have come as a surprise considering the topics carry pithy headlines and short explainers. This week, the site reported further details from interviews with curators. In particular, it seems that conservative views were given short shrift on the final list:

Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential “trending” news section, according to a former journalist who worked on the project. This individual says that workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics from appearing in the highly-influential section, even though they were organically trending among the site’s users.

And yet, when you actually read what the curator said, it seems far less damning:

“I’d come on shift and I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz.”

Other curators interviewed for the piece denied “consciously suppressing conservative news” and were “unable to determine if left-wing news topics or sources were similarly suppressed.” Official sources within the company quickly came out against the story. Tom Stocky, the leader of the Trending Topics team, wrote that “Facebook does not allow or advise our reviewers to systematically discriminate against sources of any ideological origin and we’ve designed our tools to make that technically not feasible.”

Herein lies a tension. There is a massive gulf between active suppression and mere blindness to conservative topics, and what separates the two is malicious intent. Hiring young content curators will likely yield a leftward bent, as both the age cohort and the profession tilt that way. But blindness to certain kinds of stories is not the same as actively blocking stories from the right from being placed in the Trending Topics section. Conservative content isn’t being filtered out of the News Feed, which is still how the vast majority of users consume their Facebook content. Rather, certain stories didn’t get a boost from whatever attention the Trending Topics section receives.

So what unites all of the sources that were singled out in these reports? Well, for one, none of the outlets mentioned by name, including The Blaze, the Drudge Report, and Breitbart, are particularly well-known news institutions, as Pew found when surveying news consumption last year. Only the Drudge Report mustered more than 35 percent name recognition among survey participants, which hardly compares to Fox News, the New York Times, or NBC News, each of which tops 80 percent across various age groups.

Facebook also faces an uphill battle on trust with its own users. The company clearly ranks behind every other major social media platform in how much users trust the news content they see on the site. And even though users are wary of what they see on social media, the reputation of the originating news organization is critical for nearly 7 in 10 of those who get news on Facebook. If there is something here, the underlying bias might be rooted not in institutional outlook but in an internal pressure to cite sources with name recognition.

The report also revealed that some stories were knowingly boosted in the rankings. As I explained before, Facebook has a different kind of network structure than Twitter. Only about 22 percent of user pairs on Twitter have a reciprocal relationship, so the site is littered with power users that can quickly spread information. Facebook, on the other hand, is built on one-to-one relationships with slower rates of information diffusion. One study of the lag in breaking news found that Twitter takes about 2.36 hours to pick up an event, compared to nearly 10 hours for Facebook.
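The intuition behind that diffusion gap can be made concrete with a toy model. The sketch below is purely illustrative, not Facebook’s or Twitter’s actual graph: it compares how many gossip “rounds” a story needs to saturate a hub-dominated network (a star, standing in for Twitter’s power users) versus a network of reciprocal one-to-one ties (a ring, standing in for Facebook-style friendships). The graph shapes and node counts are assumptions chosen for clarity.

```python
from collections import deque

def rounds_to_saturate(adj, seed):
    """Breadth-first 'gossip': each round, every informed node tells its
    neighbors. Returns how many rounds until all reachable nodes know."""
    seen = {seed}
    frontier = deque([(seed, 0)])
    depth = 0
    while frontier:
        node, d = frontier.popleft()
        depth = max(depth, d)
        for nbr in adj[node]:
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, d + 1))
    return depth

n = 100

# Hub-dominated graph: one "power user" connected to everyone else.
star = {0: list(range(1, n))}
for i in range(1, n):
    star[i] = [0]

# Reciprocal-pairs graph: one-to-one ties arranged in a ring.
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

print(rounds_to_saturate(star, 0))  # → 1  (the hub reaches everyone at once)
print(rounds_to_saturate(ring, 0))  # → 50 (the story crawls pair by pair)
```

The exact numbers depend entirely on the made-up topology, but the qualitative point matches the paragraph above: networks with broadcast-style hubs spread a story in far fewer steps than networks built from reciprocal one-to-one links.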

In the past, Facebook has been chided loudly for not boosting stories of social and political importance. In one widely shared piece, Zeynep Tufekci lamented that Facebook didn’t have immediate coverage of Ferguson as it first happened in August of 2014:

This morning, though, my Facebook feed is also very heavily dominated by discussion of Ferguson. Many of those posts seem to have been written last night, but I didn’t see them then. Overnight, “edgerank” –or whatever Facebook’s filtering algorithm is called now — seems to have bubbled them up, probably as people engaged them more…

But I wonder: what if Ferguson had started to bubble, but there was no Twitter to catch on nationally? Would it ever make it through the algorithmic filtering on Facebook? Maybe, but with no transparency to the decisions, I cannot be sure.

The report confirms what network theory would predict: important stories would sometimes fail to reach critical mass on Facebook quickly enough to be deemed “trending” by the algorithm. This isn’t necessarily a bug in the trending algorithm, but an outgrowth of the structure of the social network. However, facing criticism, the company seems to have taken a more proactive role in injecting important stories into the trending section.

All of this brings to mind a couple of comments and a few outstanding questions.

To begin, let’s not mince words. Regardless of one’s political beliefs, some sort of bias was bound to creep in, since a concerted human effort shapes the outcomes of the social networking site. This is exactly what the story highlights. Facebook has faced pressure for not breaking news in the past: criticism abounded after the start of Ferguson, as explained earlier, even though information propagates more slowly on Facebook than on Twitter. That criticism is just one part of a much larger campaign to open up the Facebook box and understand how the site is affecting news consumption. Of course, this research should continue, but in all of the fretting over Facebook’s news content, I still haven’t seen an explication of a better world. What should Facebook look like?

Second, it is still unclear what the relationship is between the Trending Topics module and the News Feed. How much does appearing in Trending Topics actually drive conversation and recognition for a site? I cannot remember a single time I clicked on a news article via Trending Topics, and I don’t seem to be alone. In all of the discussion about political bias that has been floating around, I haven’t heard much about the decades of research in psychology and political science on these subjects, as I detailed extensively here. Countless studies have found that exposure to disparate viewpoints tends to make you more sure of your own. A related tendency is known as the hostile media perception: partisans tend to judge media coverage as unfavorable to their own point of view. In other words, in the grand scheme of things, I have to wonder, how much does this matter?

Third and most importantly, what this story highlights is that we need more. We need more tools of information discovery. While countless people know about Google, Twitter, and Facebook, far fewer know about Nuzzel, Zite, Flipboard, and Nextdraft. We need more research on these tools, as I have called for in the past; research that combines the best of econometric and psychological analysis of media consumption is seriously lacking. We need more media literacy: educators, parents, and users all need to work on becoming more knowledgeable about the digital world. But most especially, we all need to be more sane.