Facebook’s Failure to Enforce Its Own Rules

Pages are fabricating metrics and overriding bans

Jonathan Albright
Nov 6, 2018 · 15 min read

This is the last installment of The Micro-Propaganda Machine, a three-part analysis critically examining the issues at the interface of platforms, propaganda, and politics.

The third part of my analysis of Facebook prior to the midterm election looks at granular enforcement—Facebook’s challenges in enforcing its community standards and terms of service. This post highlights the long-term gaming of the platform’s engagement numbers and interaction metrics by several recently removed pages, and it presents a case of the company’s failure to identify and remove content from InfoWars, a removed—or “banned”—presence on the platform.

At first glance, Facebook’s efforts to identify “inauthentic” accounts, find and ban actors who have violated its terms of service and platform rules, and flag “false news” might appear to be moderately successful. Through my investigation of the platform, however, there appears to be a longstanding pattern of ineffective rules paired with inconsistent enforcement. This has opened up many loopholes and workarounds for certain pages and actors and facilitated the misuse and exploitation of Facebook’s platform.

A number of high-profile and previously unreleased data findings shine a light on Facebook’s granular enforcement problem, which seems to be expanding at a rate that is outpacing the company’s ability to dedicate enough resources to contain it.

While recent platform integrity initiatives at Facebook have worked in some ways, the most important functions—including the proactive removal of inauthentic and banned accounts—do not appear to be working nearly as well as they should.

To the company’s credit, there has been a spate of publicly announced removals and takedowns. Mark Zuckerberg, the company’s CEO and arguably the final decision-maker, finally recognized the long-term problem, going so far as to suggest that the company needs help from researchers and journalists. Facebook has been at work identifying and removing inauthentic accounts and spammy pages that have gamed its system and/or violated its terms of service. One of the most publicized instances of this was the company’s October 11, 2018 removal of nearly 800 U.S.-based pages and accounts.


Enlightening historical data on a few of the most prolific of the removed pages provides strong evidence to suggest that many of the pages removed in October were not taken down because of the nature of the content posted on their pages or inaccuracies in the claims made in their stories or the large number of links they shared. Rather, they appear to have long been exploiting the platform’s measurement features and post-engagement metrics—behaviors that seem to have resulted in the inflation of the numbers of likes, shares, and video views reported by these pages.

These activities appear to have taken place as far back as 2010 and as recently as the middle of October this year. Based on the scale of the apparent metrics gaming effort, which Facebook referred to as “spam,” and the approaching midterm elections, it’s a surprise these pages weren’t removed until October.

The first set of data below shows nearly a decade of historic Facebook page analytics. This includes some of the pages removed by Facebook in its domestic political spam sweep in October. The metrics include pages for the Western Journal and Conservative Tribune, two popular pages that were used at the time of the query as points of reference. I’m not suggesting these pages were involved in engagement metric gaming. From what I can tell, neither appears to have abused the platform’s engagement metrics.

When Facebook took down all the pages in October, the reasons for the takedown were not made clear. This prompted immediate claims of censorship from the pages’ founders and contributors, who argued the removals were motivated by the pages’ conservative political stances or by Facebook’s desire to stop their kind of “information” from being published and shared.

After looking into the historical page data—nearly a decade’s worth of analytics—I collected back in September and again at the end of October, I’d argue that for some of the most prolific of the removed pages, “censorship” or the fact they might have posted inaccurate and misleading articles and links were not the reason for their abrupt removal.

The likely reason several of the largest of these mostly conservative and political fringe news pages were taken down is because they appear to have been gaming the platform’s metrics for years. The pages put up Facebook interaction numbers in the billions, and many of the videos consistently showed engagement in the tens of millions of views. I found that at least three of the pages reported near-astronomical engagement numbers over the past five years. These are the kind of numbers that would be difficult to justify in almost any scenario—even in the case of a very large and sustained advertising spend on Facebook.

The kind of numbers posted by these pages frequently involved the sharing of nondescript “statement memes” and other seemingly mundane content, including short videos about ideological values. But when I compared them to the largest news pages on Facebook, their numbers were right up there with the highest engagement Fox News’ Facebook page has ever seen.

The first of the pages removed on October 11 that drew my attention in the data was Right Wing News.

In terms of historical engagement on Facebook, Right Wing News, a page representing a small news outlet run by a political blogger, basically had no peers in the news industry on Facebook until the final stretch of the 2016 election.

The two red spikes shown to the right of the graph below are the interactions of Fox News’ page. The two spikes were the result of the network’s coverage of the 2016 election results and the presidential inauguration that followed in January. This means that the kind of numbers posted by Right Wing News two years earlier, at a time when the platform wasn’t as large (see the series of green peaks to the left of the graph), aren’t just good—they’re unbelievable.

The above graph also shows that for all its liberal-hating fame, Breitbart’s Facebook page has 250 million more interactions to go before it even gets close to the kind of numbers that Right Wing News page had before it was taken down.

From the time of Obama’s 2012 re-election well into the 2016 election of Donald Trump, the pages of up-and-coming, arguably more popular right-wing news media—including Breitbart, Daily Caller, and the Western Journal—did not have anywhere near the kind of engagement that Right Wing News received on Facebook.

Between 2013 and 2016, Right Wing News and its affiliate pages consistently saw more interaction on Facebook than the pages of many of the largest news media in the country: the New York Times, the Washington Post, and Fox News. Combined.

Right Wing News’ post, shown below, pulled in close to 23 million views—with just 5,000 shares. I watched it, and it’s more or less what you’d expect from any given lesson-type video clip found on Facebook.
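To illustrate how lopsided that views-to-shares ratio is, here’s a minimal sketch in Python. The 23 million/5,000 figures come from the post above; the reference row and the 1,000 views-per-share threshold are my own illustrative assumptions, not Facebook data or any official heuristic.

```python
# Hypothetical sanity check on view-to-share ratios. The Right Wing News
# figures are from the post discussed above; the "typical viral post" row
# and the 1,000 threshold are illustrative assumptions.
posts = [
    {"page": "Right Wing News post", "views": 23_000_000, "shares": 5_000},
    {"page": "typical viral post", "views": 1_000_000, "shares": 50_000},
]

for post in posts:
    # Guard against division by zero for posts with no shares recorded.
    views_per_share = post["views"] / max(post["shares"], 1)
    label = "suspicious" if views_per_share > 1_000 else "plausible"
    print(f"{post['page']}: {views_per_share:,.0f} views/share -> {label}")
```

The Right Wing News post works out to 4,600 views for every share—orders of magnitude beyond what ordinary organic sharing tends to produce.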

Week after week, month after month, the recently removed pages of Right Wing News, Daily Vine, and Silence Is Consent posted engagement numbers—to borrow a quote from the epic sci-fi novel Dune—“the likes of which God has never seen.”

In the graph below, you can see Right Wing News had close to 700 million interactions on Facebook between October 2, 2013, and September 13, 2018. The better part of a billion interactions. This raises the question: Were these numbers even possible back then? Perhaps, but I’d argue it’s unlikely that humans were the only ones sharing the posts and watching the videos.
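To put that total in perspective, a quick back-of-the-envelope calculation (Python; the dates and the roughly 700 million figure are taken from the graph described above, and everything else is simple arithmetic) gives the implied daily average:

```python
# Rough daily average implied by the cited totals. The date range and the
# ~700 million interactions figure come from the analytics discussed above.
from datetime import date

interactions = 700_000_000
days = (date(2018, 9, 13) - date(2013, 10, 2)).days
print(days)                 # length of the window in days
print(interactions / days)  # implied average interactions per day
```

That works out to roughly 387,000 interactions every single day, sustained for nearly five years—a pace that is hard to attribute to organic human activity alone.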

At the end of the 2016 election, engagements on the Right Wing News page appear to have taken a steep decline. One explanation might be Facebook’s post-election scrutiny. Or perhaps Facebook simply stopped counting engagements generated through some of the loopholes that appear to have been exploited starting in late 2013 (see the left of the graph below).

This makes these historical numbers even more remarkable. If you were to zoom in on the period between 2013 and 2016, Right Wing News is so far ahead of other political news pages in terms of its engagements—mostly likes and shares—that it was making virtual laps around them, including much larger conservative news outlets like Fox News.

Right Wing News, a page that had 3,170,372 followers before it was removed by Facebook, was just one of almost 600 domestic political news pages removed in October for reasons that included “inauthentic behavior” and “spam.” Coincidentally, their removal came at the same time as a renewed focus on—and a new lawsuit over—allegations that Facebook had misreported video engagement measurements on the platform.

Inflating engagements, such as likes and shares, is one thing. But another finding from my September analysis involved suspiciously high video views among some of the worst offenders in the cohort of removed domestic political pages. The historical video view data suggests some of these pages may have been gaming yet another metric. What the Right Wing News page was for high interaction numbers, the now-removed Silence Is Consent and Daily Vine pages were for video views.

The video view counts from Daily Vine and Silence Is Consent shown in the above graph seem high, to say the least. One December 2015 post (see below) reported more than 53 million views and 1 million shares before the page was taken down.

A significant number of the videos posted on the recently removed Silence Is Consent’s page reported zero post views at the time the analytics were pulled in September. I’m not sure if this was simply a glitch or if Facebook was already in the process of removing the page from its measurement and engagement database.

The historical engagement data for the removed pages of OfficialRightWingNews, retainyourfreedom (Silence is Consent), and DailyVineNow can be downloaded here.

For the second example of granular enforcement, I focused on the removed pages of Alex Jones and the Alex Jones Show, InfoWars’ daily broadcast. Following the highly publicized ban in early August, Jones’ show and much of the removed InfoWars news content appear to have moved swiftly back onto the Facebook platform.

Here’s the deal: I was not tracking the InfoWars accounts that were inevitably going to reappear after the official accounts were banned on Facebook. In fact, when I encountered the Alex Jones livestream, I wasn’t looking for InfoWars. I was looking for Soros conspiracies.

And what did I get? The live high-definition stream of Jones’ show on Facebook—broadcast on one of the many InfoWars-branded pages that are inconspicuously named “News Wars.”

Alex Jones’ program found me. To add more context, a couple weeks earlier, I was looking for posts on Facebook related to the Soros-funded caravan rumor. For one of my searches, Jones’ live-stream titled “A New Caravan of Invaders” was one of the top 20 results returned on Facebook.

This unfortunate stroke of luck showed that Jones’ show has been broadcast nearly every day for the past three months on at least two Infowars-branded Facebook pages.

So, the Facebook “ban” doesn’t seem to have worked.

News Wars, and a page called “Infowars Stream,” were being promoted by Facebook via its search and video recommendation algorithms for searches about conspiracies and politics, such as my query for “Soros caravan.”

Since the first day of August—the same week Jones’ and the largest of the InfoWars pages were taken down—Jones’ InfoWars broadcasts (primarily the streams of Alex Jones’ daily “censored” talk show on InfoWars) have been viewed at least five million times. And over the same time period, these two pages, with fewer than 30,000 followers combined, have reported almost 700,000 interactions.


Given that Jones’ official (and removed) Facebook page had 1.64 million followers, and the removed official InfoWars page had 916,000 followers at the time of removal, these kinds of engagement numbers from two relatively unknown pages—with barely 30,000 followers between them—don’t seem that far off from what the blue-checkmarked Jones and InfoWars pages reported over the last three months before they were removed by Facebook in August.
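A quick back-of-the-envelope comparison in Python shows why those numbers stand out. The follower counts and the 700,000-interaction figure come from this piece; reusing the same interaction total for the official pages is an assumption based on the “not that far off” comparison above, not a reported figure.

```python
# Interactions per follower over roughly the same three-month window.
# Follower counts are from the article; using ~700,000 interactions for the
# official pages is an assumption (the article says the totals were roughly
# comparable), not a reported number.
pages = {
    "News Wars + Infowars Stream": {"interactions": 700_000, "followers": 30_000},
    "official Jones + InfoWars": {"interactions": 700_000, "followers": 1_640_000 + 916_000},
}

for name, p in pages.items():
    rate = p["interactions"] / p["followers"]
    print(f"{name}: {rate:.2f} interactions per follower")
```

On a per-follower basis, the shadow pages come out nearly two orders of magnitude ahead of the official pages—an anomaly the raw totals obscure.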

While they aren’t the kind of sizeable views that Jones’ and InfoWars pages had on Facebook at their peak more than a year before the ban took place, it still looks like Jones and company are back in business on Facebook.

According to InfoWars Stream and News Wars admins, however, Facebook knows all about this. In fact, on the News Wars and Infowars Stream page posts, the phrase frequently used when the sentinels come for Jones and disrupt his live high-definition video feeds (all of which appear to be available for replay on these new pages) is “getting Zucc’d.”

And, it turns out, this is just one part of the Jones versus Zuckerberg Facebook video saga. As of October 21, Infowars.com appears to be using Facebook’s video hosting and embed capabilities to serve video content on its own website.

This is even more odd because Jones’ feeds are being promoted on Facebook as a sort of underground pirate radio broadcast. In actuality, InfoWars appears to be using Facebook’s platform for its free video-hosting capabilities, mirroring this content from Facebook to InfoWars’ own website through streaming pages conspicuously connected to the banned InfoWars ones—pages to which Jones may or may not be legally connected.

“Get Zucc’d”? Are they not aware that Facebook staff can just look at the videos on these public pages—videos now getting millions of views per month (as of November 2)?

Jones’ videos on the Infowars Stream page show daily engagement rates of up to 74 percent. Those kinds of numbers seem pretty good, don’t you think?

These kinds of shenanigans make Jones even more of a legend in the world of alternative “anti-globalist cabal” fringe media. But unlike the other actors who have stepped out of line in their conspiracy-mongering, Jones has gone right back to Facebook for its free offerings, including the platform’s industry-leading live video streaming capabilities. And, apparently, all while growing his audience.

This InfoWars “censorship” case is clearly a gross enforcement failure by Facebook. Where were the people working in Facebook’s highly publicized “Election War Room”? What were they doing while these InfoWars shadow pages started sharing live InfoWars broadcasts?

Where were they when these pages posted hot-button election-themed conspiracy memes?

Millions of dollars of technology, training time, and human resources were supposedly spent by Facebook to better monitor, catch, and remove things exactly like this.

The videos are now landing at the top of searches on Facebook for election-related news and conspiracy topics.

Charts show posting activity trends on “alt InfoWars” (with a focus on “News Wars”) Facebook Pages

Transparency. Truth. Credibility. These aren’t just journalistic buzzwords. They are part of a multilateral, cross-continental call to arms in the war to “save democracy,” ensure election integrity, and preserve American media institutions. Curbing the rise of disinformation, politically motivated hate speech and violence—and stopping the bots and other coordinated propaganda—has tended to revolve around three criteria: 1) the accuracy of the claims, 2) the authenticity of the sources, and 3) the appeal of the messages.

Finding truth in the surrounding contexts of shared information often involves less quantifiable methods—especially on platforms like Facebook. We need more common-sense approaches to monitoring bad behaviors and manipulation efforts. We should always take into account the loopholes being used by these bad actors, however simple they might seem. We need to consider the conditions of the information’s reception by its target audiences. If conspiracy videos are seen as censored and as part of a pirate radio operation run by a banned celebrity actor, they’re likely to become even more popular.

Granular enforcement isn’t just reactive takedowns. It’s about proactive measures. It involves considering all the factors that play into how things like banned InfoWars live-streams get further propagated.


From what I’ve seen in my extensive look into Facebook’s platform, especially with regard to the company’s capacity to deal with the misuse of its platform, I’d argue that common-sense approaches to platform integrity and manipulation still appear to be less of a priority for Facebook than the publicity around automated detection and removal.

Shady foreign adversaries present less of a liability to the long-term integrity of the information sphere than the sort of ongoing cases my research has uncovered. The infinite gray area of information-sharing poses the real challenge: It’s the slippery soft conspiracy questions, the repetition of messages seen on shocking memes and statements, and the emotional clickbait that’s regularly seen on Jones’ InfoWars video stills. Without granular enforcement, the non-foreign bad actors will only get better and refine their tactics to increase exposure to these kinds of messages.

Information integrity is more than the scrutiny of provable statements or the linking of data to shared content with an “i” (information) button. Transparency involves more than verifying a page administrator, putting a name alongside a date and a voluntary disclosure for a paid ad campaign, and adding this data to a political ad archive.

The real-world efficacy of information integrity initiatives—especially the ones launched by Facebook—is predicated on getting help from the public. This help is conditional on two things: being able to find the problems on the platform and being able to trust the reported information, metrics, and measurement data that are provided.

This is where the notion of accountability is more useful than transparency. Accountability requires transparency. It’s not just about showing us some of the content, posts, and ads on Facebook. It’s about enabling access to this content and being sure that the information we’re getting from Facebook is accurate and that the techniques used to collect it are reliable. The seeds from which disinformation, conspiracies, and propaganda bloom are increasingly activities that succeed not because of ads, but because of inauthentic participation hidden inside opaque products like Facebook’s groups. Accountability necessitates access and timely, accurate reporting. If we can’t find the problems or trust the data we’re getting, then we can’t begin to understand what’s really going on—or help fix it.

Written by

Professor and researcher in news, journalism, and #hashtags. Award-nominated data journalist. Media, communication, and technology.
