Why hyperpartisan news outlets still thrive on Facebook

For most of 2012 and 2013, I ran the Facebook page for a national magazine, and it didn’t take long for me to realize two things: the posts with the highest engagement were shown to more users, and Facebook weighted certain types of engagement more heavily than others. Most notably, it seemed clear that comments carried much more weight in the Facebook algorithm than likes. This makes sense; it takes far more effort to type out a comment than to simply tap a ‘like’ icon as you scroll through your feed.
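
By way of illustration, here’s a minimal sketch of what engagement-weighted ranking might look like. To be clear, the field names, weights, and scoring function are invented for this example; Facebook’s actual ranking model is far more complex and unpublished.

```python
# Hypothetical sketch of engagement-weighted ranking. The weights below are
# illustrative assumptions, not Facebook's actual values.
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post,
                     like_weight: float = 1.0,
                     comment_weight: float = 5.0,
                     share_weight: float = 3.0) -> float:
    """Weight high-effort signals (comments) more heavily than
    low-effort ones (likes)."""
    return (post.likes * like_weight
            + post.comments * comment_weight
            + post.shares * share_weight)

# A post with a contentious comment thread outranks a quietly liked one:
flame_war = Post(likes=120, comments=400, shares=30)   # score: 2210.0
feel_good = Post(likes=900, comments=15, shares=40)    # score: 1095.0
assert engagement_score(flame_war) > engagement_score(feel_good)
```

Under any scheme like this, a post that provokes hundreds of argumentative comments will beat one that collects far more passive likes, which is exactly the dynamic I was exploiting.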

As the person responsible for growing the magazine’s Facebook presence, I began to experiment with ways to generate more comments on each post. I quickly realized that the best way to do this was to place an open-ended question at the top of political news stories. For instance, when linking to a story about Obamacare, I might ask, “Do you think Obamacare is making healthcare more affordable for Americans?” This would immediately spark a flame war as conservatives and liberals argued in the post’s comments section. I didn’t care about the quality of the comments or whether anyone meaningfully attempted to answer my question; the important thing was that people were commenting, signaling to Facebook that the post was “important” and worth showing to more of our followers.

I still think about this experience often, especially when I read about Facebook’s ongoing efforts to improve the quality of news shared on its platform.

These efforts started almost immediately after the conclusion of the 2016 election. Horrified by reports of how easily Facebook’s algorithm had been manipulated by purveyors of false news, the company embarked on a number of initiatives aimed at cleaning up its users’ Newsfeeds.

It formed fact-checking partnerships with well-respected news organizations, paying them to comb through articles making contentious claims and flag those peddling fake news. Once a story was flagged as false, a pop-up warning would appear for any user who then tried to click on or share it.

And then, in January 2018, Mark Zuckerberg made a series of announcements that sent shockwaves through the media industry. First, he revealed that Facebook would show less content from pages in the Newsfeed, favoring instead content posted by friends and family. Then, a few days later, Zuckerberg published another post, this one elaborating on what these changes meant for news.

After first predicting that the amount of news in the Newsfeed would drop from 5 percent of posts to 4 percent, Zuckerberg shared the company’s goal “to make sure the news you see, while less overall, is high quality.” What did this mean? “I’ve asked our product teams to make sure we prioritize news that is trustworthy, informative, and local,” he wrote. “There’s too much sensationalism, misinformation, and polarization in the world today.”

In other words, Zuckerberg didn’t just want to eliminate false news on Facebook; he wanted to improve the overall quality of the news we’re exposed to by surfacing posts that are more objective and less hyperpartisan.

To accomplish this, Facebook launched a number of initiatives. It prompted users to take surveys that measured media outlets’ trustworthiness. “We surveyed a diverse and representative sample of people using Facebook across the US to gauge their familiarity with, and trust in, various different sources of news,” wrote Adam Mosseri, Head of News Feed. “Publications deemed trustworthy by people using Facebook may see an increase in their distribution. Publications that do not score highly as trusted by the community may see a decrease.”

Facebook also gave a boost to local news by rolling out a feature called Today In. After opting into the service, a user would be exposed to local stories produced by local outlets, even in cases where that user didn’t actually subscribe to the outlet’s Facebook page. “Our goal is to show more information that connects people and publishers to their local communities,” Facebook wrote of the initiative.

Most of the above-mentioned features debuted by March of 2018. So how effective have they been over the last year or so?

To be clear, there are genuine signs of improvement to the Newsfeed. Over 1.1 million users have opted in to Today In, and the founder of a network of 81 local news sites recently told me that his sites saw measurable lift from the program. There’s also significantly less false news shared on Facebook. A study conducted by researchers at the University of Michigan, Princeton University, the University of Exeter, and Washington University in St. Louis found that the proportion of Americans who visited sites publishing false news in the runup to the 2018 election dropped by 75 percent compared to the 2016 election. What’s more, those same researchers found that these sites’ referrals from Facebook dropped substantially.

That being said, plenty of false news still proliferates on Facebook. In February, Snopes chose not to renew its fact-checking partnership, citing a lack of bandwidth to deal with the constant flood of fake news stories. Other partners found the job to be a Sisyphean task, telling the BBC that they “feel underutilised, uninformed, and often ineffective.”

But Facebook’s greatest struggle has been its attempt to elevate objective, quality outlets over their hyperpartisan counterparts. As the Columbia Journalism Review recently put it, “high-quality news sources are getting less engagement on Facebook and lower quality sites are getting a lot more.”

Highly biased news sites, particularly of the right-wing variety, consistently generate the most engagement on Facebook. Check the analytics data on CrowdTangle on any given day and you’ll find that the majority of the most-engaged news stories originated on right-wing sites like The Daily Wire and Fox News. In March of 2019, The Daily Mail, a newspaper of such low quality that a majority of Wikipedia’s top editors voted to no longer consider it a reliable source for the encyclopedia, took the top spot, followed by Fox News. Of the top 25 most-engaged pages that month, at least four represented news outlets that were explicitly conservative, including Breitbart, The Daily Wire, and Western Journal. Only one explicitly liberal site, HuffPost, made the list.

If anything, the above stats downplay how much hyperpartisan, right-wing news thrives on Facebook. “Fox News accounts for 438 of the top 10,000 English-language stories on Facebook through early March 2019, more than any other publisher,” wrote TheWrap’s Sean Burch, citing data from NewsWhip. “The Daily Wire, conservative commentator Ben Shapiro’s digital publication … [took] second place with 347 of the top 10,000 stories.” Think about that. A far-right opinion site with a small staff and very little original reporting had more highly engaged stories than major media outlets that publish huge investigations and employ Pulitzer-winning reporters.

And Facebook’s effort to boost local news? That, too, ran into some snags. In March, Facebook admitted that one in three of its users lived in a “news desert,” meaning that their local region didn’t produce enough local news to be featured in Today In. Further digging from Nieman Lab’s Christine Schmidt found that, even in regions that produced enough local news to be featured in Today In, the news being shared on Facebook wasn’t of particularly high quality.

So why is Facebook, a company with billions of dollars and Silicon Valley’s smartest engineers, struggling in its efforts to promote quality news?

Well, any algorithm that’s predicated on promoting content that generates the highest engagement is going to reward hyperpartisan content. As I’d discovered back in 2012, nothing draws out Facebook comments better than tribal politics. Recently, Vice published a feature on how effective Fox News has been at dominating other, more objective news sites on Facebook. Fox did this by exploiting partisan rancor. “We would intentionally post content that would be divisive and elicit a lot of comments,” a former Fox staffer told Vice’s David Uberti. A Facebook spokesperson essentially confirmed to Uberti that this approach accounted for Fox’s success. “If you’re going to constantly engage with Fox, commenting on their posts or sharing their content, clearly that’s a signal for us,” they said.

As for why conservative sites seem to be so much more successful than their liberal counterparts, this might be a result of Facebook’s changing demographics. A recent report from eMarketer found that older Americans, aged 55 and up, are the fastest-growing age group on Facebook. This is a group that tends to be more conservative. Meanwhile, younger users, who tend to be more liberal, are abandoning Facebook in droves, flocking to platforms like Instagram, Snapchat, and TikTok. In 2018, only 51 percent of teens said they used Facebook, down from 71 percent in 2015.

And what kind of content do older users like to share on Facebook? Nicole Hickman James, described by BuzzFeed News as someone who “spent years working for a publisher that ran both liberal and conservative hyperpartisan Facebook pages and associated websites,” told BuzzFeed that “she tailored her articles to older readers because they were the most engaged audience.”

Older users aren’t just more engaged; they’re also more likely to share false news. A study conducted by Princeton University political scientist Andrew Guess and other researchers found that “the oldest Americans, especially those over 65, were more likely to share fake news with their Facebook friends.” The gap between older and younger users on this front was significant. “On average, users over 65 shared nearly seven times as many articles from fake-news domains as the youngest age group.” And of those who did share fake news on Facebook, they were much more likely to be Republican than Democrat (“18.1% of Republicans versus 3.5% of Democrats in our sample”).

But what about Facebook’s user surveys meant to gauge news brand trustworthiness? Several polls have shown that Americans ascribe a high level of trust to partisan news brands like Fox News and Sinclair Broadcast Group, and millions of those same Americans have been conditioned to believe that mainstream outlets like The Washington Post and The New York Times (aka the “fake news media”) aren’t to be trusted. It’s unlikely that such a subjective user survey would accurately measure a news outlet’s quality.

At this point you might be thinking: so what? Just because a page is receiving more visibility and engagement on Facebook doesn’t mean that people are actually clicking on those links and reading the articles. The New York Times could still be drawing far more Facebook traffic than The Daily Wire.

But research from political scientist Nick Anspach found that whether someone clicks on a link hardly matters, at least when it pertains to information retention. His study found that, even when a person doesn’t click on and read an article, they still maintain a high level of recall of the information shared in both the headline and status update of a Facebook post.

As for Facebook’s struggles to promote local news, that one’s not too hard to explain. Since 2004, roughly 1,800 local newspapers have closed their doors. Hedge funds have been buying chains of local newspapers, conducting layoffs, squeezing them for profits, and then shutting them down once they no longer bore fruit. And local business advertising, the lifeblood of newspapers, has been siphoned away to places like Google, Craigslist, and, yes, Facebook.

So what’s to be done? To Facebook’s credit, the company hasn’t given up on its mission. In fact, it seems to finally recognize that these aren’t problems that can be fixed with a simple tweak to the algorithm, and that promoting quality news will require it to roll up its sleeves and do some hands-on work.

Recently, Zuckerberg revealed that Facebook was considering launching a dedicated news tab curated by actual humans, an action many considered unthinkable ever since the company fired its trending news team after it was accused of liberal bias. Facebook has also pledged $300 million to support local news programs.

And then this month, after years spent applying a laissez-faire attitude to extremist content, Facebook announced a ban on several news outlets and figures that regularly post hateful content, including Alex Jones, Infowars, Milo Yiannopoulos, Paul Joseph Watson, Laura Loomer, Paul Nehlen, and Louis Farrakhan. This came not long after the company rolled out updates that would punish groups that “repeatedly share misinformation.”

Facebook hasn’t completely given up on algorithmic fixes. It recently announced that, when ranking content in the Newsfeed, it would consider how often a news source is linked to outside of Facebook. Many have noted that this resembles how Google’s search engine establishes a website’s authority, and the approach could help Facebook weed out low-quality news sites that have grown exceptionally good at gaming the platform’s algorithm.
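
To make the idea concrete, here’s a rough sketch of the PageRank-style authority calculation Google popularized, applied to a graph of news domains. The domains, link graph, and parameters below are invented for illustration; Facebook hasn’t published how its own signal actually works.

```python
# Toy PageRank over a graph of news domains. A domain earns authority when
# other authoritative domains link to it. All domains here are invented.
def pagerank(links: dict[str, list[str]],
             damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    nodes = list(links)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        # Every node keeps a small baseline, then receives a share of the
        # rank of each node that links to it.
        new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for source, targets in links.items():
            if not targets:
                continue
            share = damping * rank[source] / len(targets)
            for target in targets:
                if target in new_rank:
                    new_rank[target] += share
        rank = new_rank
    return rank

# An outlet that is widely cited accumulates authority; a site that games
# on-platform engagement but is rarely linked to elsewhere does not.
graph = {
    "wire-service.example": ["local-paper.example", "magazine.example"],
    "local-paper.example": ["wire-service.example"],
    "magazine.example": ["wire-service.example", "local-paper.example"],
    "clickbait.example": ["wire-service.example"],  # no inbound links
}
for domain, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{domain}: {score:.3f}")
```

In this toy graph, the rarely cited clickbait.example settles at the minimum baseline score no matter how much engagement it drives on the platform itself, which is precisely why an off-platform signal is harder to game.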

This is all a work in progress, of course, and it doesn’t take much imagination to see that Facebook executives are eyeing the 2020 election as the next great test of the platform’s ability to police misleading, divisive content. The stakes, as they say, are high. When you command a userbase of 2.2 billion human beings, your responsibility to uphold and support democratic institutions is substantial, and your ability to undermine them is equally so.

Simon Owens is a tech and media journalist living in Washington, DC. Follow him on Twitter, Facebook, or LinkedIn. Email him at simonowens@gmail.com. For a full bio, go here.