The 2018 Facebook Midterms, Part III: Granular Enforcement
The third part of my midterm election-time study looks at granular enforcement and focuses on Facebook’s challenges in enforcing its community standards and terms of service. This post 1) highlights a case of the long-term gaming of the platform’s engagement numbers and interaction metrics by several recently removed Pages and 2) presents a case of the company’s failure to identify and remove content from InfoWars, a removed, or “banned,” presence on the platform.
At first glance, Facebook’s efforts to identify “inauthentic” accounts, find and ban actors that have violated its terms of service and platform rules, and flag “false news” might appear to be moderately successful. Through my investigation of the platform, however, there appears to be a longstanding pattern of ineffective rules paired with inconsistent enforcement. This has opened up many of the loopholes and workarounds for the Pages and actors shown in this post, and facilitated the misuse and exploitation of Facebook’s platform.
Like the earlier posts, I use a number of high-profile and previously unreleased data findings to shine a light on Facebook’s granular enforcement problem, which seems to be expanding at a rate that is outpacing the company’s ability to dedicate enough resources — including technological capabilities as well as skilled talent — to contain it.
While recent platform integrity initiatives at Facebook have worked in some ways, the most important functions — including the proactive removal of inauthentic and “banned” accounts — do not appear to be working nearly as well as they should. To the company’s credit, there has been a spate of publicly announced removals and takedowns. Mark Zuckerberg, the company’s CEO and arguably its final decision-maker, may have finally come to terms with the long-term problem, going so far as to suggest that the company needs “help” from researchers and journalists.
My main focus here is on domestic election influence, and the organized manipulation of political news and election-time narratives, since I feel it’s an area that’s been overlooked over the past couple years. As I’ve mentioned, Facebook has recently been at work identifying and removing inauthentic accounts and spammy Pages that have gamed its system and/or have violated its terms of service. One of the most publicized instances of this was the company’s October 11, 2018 removal of nearly 800 U.S.-based Pages and accounts.
We're removing 559 Pages and 251 accounts that have consistently broken our rules against spam and coordinated… — newsroom.fb.com
I share some enlightening historical data on a few of the most prolific of these removed Pages below. Most of this data is in the form of analytics. It’s substantial, and it provides strong evidence that many of the Pages removed last month were not taken down because of the nature of the content they posted, the inaccuracy of the claims in their stories, or the large number of links they shared. Rather, they appear to have long been exploiting the platform’s measurement features and post engagement metrics — behaviors that may have inflated the numbers of likes, shares, and video views reported by these Pages.
These activities appear to have taken place as far back as 2010 and as recently as the middle of October this year. Based on the scale of the apparent metrics gaming effort, which Facebook referred to as “spam,” and the approaching midterm elections, it’s a surprise these Pages weren’t removed by the company until three weeks ago.
The first set of data I present below shows nearly a decade of historical Facebook Page analytics, including some of the Pages removed in Facebook’s domestic political “spam” sweep in October. The metrics also include the Pages for the Western Journal and Conservative Tribune, two popular Pages that were used at the time of the query as points of reference. I’m not suggesting these two Pages were involved in the same kinds of engagement metric “gaming” as the ones in the following examples. From what I can tell, neither appears to have abused the platform’s engagement metrics.
Removing “Additional” Inauthentic Activity
When Facebook took down 559 Pages in October, the reasons why each Page was removed were not made clear. This, of course, became a cause célèbre for the founders of and contributors to these Pages, who immediately claimed “censorship” — alleging they were removed for their conservative political stances, or because Facebook wanted to stop the kind of “information” they were publishing and sharing on its platform.
At the end of October, I took a second look at the historical Page data I had collected back in September — nearly a decade’s worth of analytics for several of these Pages (shown below). Based on that data, I’d argue that for some of the most prolific of the removed Pages, “censorship” — or the fact that they might have posted inaccurate and/or misleading news articles and links at some point in the past — was not the reason for their abrupt removal in October.
The likely reason why several of the largest of these mostly conservative and politically fringe news Pages were taken down is because they appear to have been gaming the platform’s metrics for years. The pages I show in the following section had put up Facebook interaction numbers in the billions. Similarly, many of the videos consistently showed engagement in the tens of millions of views.
I found that at least three of the Pages — removed less than a month ago — reported near-astronomical engagement numbers over the past five years. These are the kind of numbers that would be difficult to justify in almost any scenario — even in the case of a very large and sustained advertising spend on Facebook. Around the time of the collected data, none of the Pages I will discuss appeared to be running advertisements on the platform.
The kind of numbers posted by these Pages frequently involved the sharing of nondescript “statement memes” and other seemingly mundane content, including short videos about ideological values. Yet when I compared them to the largest news Pages on Facebook, their numbers were right up there with the highest engagement Fox News’ Facebook Page has ever seen.
📈Right Wing News
The first of the Pages removed on October 11 that drew my attention as I looked back through the data I collected in September 2018 was “Right Wing News.”
In terms of historical engagement on Facebook, Right Wing News, a Page representing a small news outlet run by a “political blogger,” basically had no peers in the news industry on Facebook until the final stretch of the 2016 election.
The two red spikes to the right of the graph (see below) are the interactions of Fox News’ Page, the result of the network’s coverage of the 2016 election result and the presidential inauguration that followed in January. This means that the kind of numbers posted by Right Wing News starting two years earlier, at a time when the platform wasn’t as large (see the series of green peaks to the left of the graph), aren’t just good — they’re astronomical.
The above graph data also shows that for all its liberal-hating fame, Breitbart’s Facebook Page has 250 million more interactions to go before it even gets close to the kind of numbers that Right Wing News Page had put up before the Page was taken down.
From the time of Obama’s 2012 re-election well into the 2016 election of Donald Trump, the Pages of up-and-coming, arguably more popular right-wing news media — including Breitbart, Daily Caller, and the Western Journal — did not have anywhere near the kind of engagement that Right Wing News received on Facebook.
Between 2013 and 2016, Right Wing News and its affiliate Pages consistently saw more interaction on Facebook than the Pages of many of the largest news media in the country — including properties such as the New York Times, The Washington Post, and Fox News. Combined.
Right Wing News’ post, shown below, pulled in close to 23 million views — with just 5,000 shares. I watched it, and it’s more or less what you’d expect from any given lesson-type video clip that is found on Facebook.
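That views-to-shares imbalance can be made concrete with a quick sketch. The Python snippet below is my own illustration, not anything Facebook runs: it flags posts whose views-per-share ratio sits far above a hypothetical organic baseline. The first record uses the figures from the post above; the two baseline records and the threshold are made up for comparison.

```python
# Hypothetical per-post records. The first entry uses the figures from the
# Right Wing News video discussed above; the baselines are invented.
posts = [
    {"page": "Right Wing News", "views": 23_000_000, "shares": 5_000},
    {"page": "baseline A", "views": 1_200_000, "shares": 40_000},
    {"page": "baseline B", "views": 300_000, "shares": 9_000},
]

def views_per_share(post):
    # Guard against divide-by-zero for posts with no shares at all.
    return post["views"] / max(post["shares"], 1)

# Assumption: organically shared videos rarely exceed a few hundred views
# per share, so a ratio in the thousands is worth a second look.
THRESHOLD = 500
flagged = [p["page"] for p in posts if views_per_share(p) > THRESHOLD]
print(flagged)  # ['Right Wing News']
```

A ratio of 4,600 views per share alone proves nothing, but as a screening heuristic it shows how cheaply this kind of anomaly could be surfaced.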
Week after week, month after month, the recently removed Right Wing News Page, along with the Pages for Daily Vine and Silence is Consent, posted engagement numbers — to borrow a quote from the epic sci-fi novel Dune, “the likes of which God has never seen.”
In the graph below, you can see Right Wing News had close to 700 million interactions on Facebook between October 2, 2013 and September 13, 2018. The better part of a billion interactions.
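As a sanity check on the scale, here is the back-of-the-envelope arithmetic in Python; the total and the date range come from the graph described above.

```python
from datetime import date

# ~700 million interactions reported between Oct 2, 2013 and Sep 13, 2018
total_interactions = 700_000_000
days = (date(2018, 9, 13) - date(2013, 10, 2)).days

per_day = total_interactions / days
print(f"{days:,} days, ~{per_day:,.0f} interactions per day")  # 1,807 days, ~387,382 per day
```

That averages out to roughly 16,000 interactions every hour, around the clock, for five years.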
This raises the question: Were these kinds of numbers even possible back then? Perhaps, but I’d argue probably not — at least not if humans were the only ones sharing the posts and watching the videos.
At the end of the 2016 election, engagements on the Right Wing News Page appear to have taken a steep decline. One explanation might be Facebook’s post-election scrutiny. Or perhaps Facebook simply stopped counting engagements from some of the loopholes that might have been exploited starting in late 2013 (see the left of the graph below).
This makes these historical numbers even more remarkable: If you were to zoom in on the period between 2013 and 2016, Right Wing News is so far ahead of other political news Pages in terms of its engagements — mostly likes and shares — that it was making virtual laps around them, including much larger conservative news outlets on the platform such as Fox News.
Historical Engagement (see Right Wing News)
Right Wing News, a Page with 3,170,372 followers before it was removed by Facebook, was just one of almost 600 domestic political news Pages removed less than thirty days ago for reasons that included “inauthentic behavior” and “spam.” Coincidentally, their removal in October came at the same time as a renewed focus on — and a new lawsuit over — allegations that the company had misreported video engagement measurements on the platform.
Less than a month before the 2018 midterm elections, when the Page was removed, Right Wing News had reported more engagement on Facebook over the past five years than the New York Times, The Washington Post, and Breitbart…combined.
🕹Gaming Video Views
Inflating engagements such as likes and shares is one thing. But another finding from my September analysis, involving some of the worst offenders in the cohort of removed domestic political Pages, concerned suspiciously high video views. Looking at the historical video views below, some of these Pages might have had another metrics gaming problem.
What the Right Wing News Page was for high interaction numbers, the now-removed Silence is Consent and Daily Vine Pages were for video views.
The video view counts from Daily Vine and Silence is Consent, shown in the graph above, seem high to say the least. One December 2015 post (see below) reported more than 53 million views and 1 million shares before the Page was taken down last month.
A significant number of the videos posted on the recently removed Silence is Consent Page, such as the one shown directly below, reported zero (0) views at the time the analytics were pulled in September. I’m not sure if this was simply a glitch, or if Facebook was already in the process of removing the Page from its measurement and engagement database.
🛑Enforcing and Maintaining Bans
Moving on to the second example in this granular enforcement post, I focus on the removed Pages of Alex Jones and the Alex Jones Show, InfoWars’ daily broadcast. Following the highly publicized “ban” in early August, Jones’ show and much of the removed InfoWars news content appear to have moved swiftly back onto the Facebook platform.
Here’s the deal: I was not tracking the InfoWars accounts that were inevitably going to reappear after the official accounts were banned on Facebook. In fact, when I encountered the Alex Jones’ livestream shown in the image below, I wasn’t looking for InfoWars. I was looking for Soros conspiracies.
And what did I get? The live high-definition stream of Jones’ show on Facebook — broadcast on one of the many InfoWars-branded Pages that is inconspicuously named “News Wars.”
Alex Jones’ program found me. To add more context, a couple weeks ago, I was looking for posts on Facebook related to the Soros-funded “caravan” rumor. For one of my searches, Jones’ live stream above, titled “A New Caravan of Invaders,” was one of the top twenty results returned on Facebook from the search.
What this unfortunate stroke of luck meant was that I found out Jones’ show has been broadcast nearly every day for the past three months on at least two Infowars-branded Facebook Pages. Nice ban.
News Wars, and a Page called “Infowars Stream” were being promoted by Facebook via its search and video recommendation algorithms for searches about conspiracies and politics — such as my query for “Soros caravan.”
Since the first day of August — the same week that Jones’ Page and the largest of the InfoWars Pages were taken down — Jones’ InfoWars broadcasts, primarily streams of his daily “censored” talk show, have been viewed at least five million times. Over the same period, these two Pages, with fewer than 30,000 followers combined, have reported almost 700,000 interactions.
Given that Jones’ official (and removed) Facebook Page had 1.64 million followers and the official InfoWars Page had 916,000 at the time of their removal, these kinds of engagement numbers from two relatively unknown Pages — with barely 30,000 followers between them — aren’t that far off from what the blue check-marked Jones and InfoWars Pages reported over the three months before they were removed by Facebook in August:
While they aren’t the kind of crazy views that Jones’ and InfoWars Pages’ had on Facebook at their peak, more than a year before the ban took place, I’d say it means that Jones and company are back in business on Facebook.
According to the InfoWars Stream and News Wars admins, however — the people getting Jones’ “censored” broadcasts out to the rest of America — Facebook knows all about this plan. In fact, on the News Wars and InfoWars Stream Page posts, the phrase that’s frequently used for when the Sentinels come for Jones — and disrupt his live high-definition video feeds (all of which appear to be available for replay on these new Pages) — is: “getting Zucc’d.”
And it turns out, this is just *one* part of the Jones vs. Zuckerberg Facebook video saga: As of October 21, infowars.com appears to be using Facebook’s video hosting and embed capabilities to serve video content on its own website:
This is all the more odd given that Jones’ feeds are being promoted on Facebook as a sort of “underground” “pirate radio” broadcast. In actuality, InfoWars appears to be using Facebook’s platform for its free video hosting capabilities — mirroring this content across Facebook to InfoWars’ own website through “streaming” Pages that are conspicuously connected to the “banned” InfoWars ones, Pages to which Jones may or may not be legally connected.
“Get Zucc’d”? Are they not aware that Facebook staff can just look at the videos on these public Pages — videos that are now getting millions of views per month (as of November 2)?
Jones’ videos on the Infowars Stream Page show daily engagement rates of up to 74%. Those kind of numbers seem pretty good, don’t you think?
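For context, “engagement rate” is usually some ratio of interactions to audience size, though analytics vendors define the denominator differently. Here is a minimal sketch of one common definition, with made-up numbers chosen only to reproduce the 74% figure mentioned above; it is not a claim about how any particular tool computed it.

```python
def engagement_rate(interactions, audience):
    """One common definition: interactions divided by audience size.
    The exact denominator (followers, reach, views) varies by analytics
    vendor; treating it as follower count here is an assumption."""
    return interactions / audience

# Illustrative, invented numbers: a Page with ~30,000 followers whose
# daily interactions approach the 74% rate discussed above.
rate = engagement_rate(interactions=22_200, audience=30_000)
print(f"{rate:.0%}")  # 74%
```

For comparison, industry benchmarks for organic Page engagement tend to sit in the low single digits, which is what makes a sustained 74% stand out.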
I have to admit, these kind of shenanigans make Jones even more of a legend in the world of alternative “anti-globalist cabal” fringe media. But unlike the other actors who have stepped out of line in their conspiracy-mongering on Facebook, Jones has gone right back onto the platform for its free offerings, including the platform’s industry-leading live video streaming capabilities. And, apparently, all while growing his audience:
I don’t really view Jones as the responsible party here. Instead, I see this InfoWars “censorship” case as a gross enforcement failure by Facebook.
Where were the people working in Facebook’s highly publicized “Election War Room”? What were they doing while these InfoWars shadow Pages shared live InfoWars broadcasts with headlines like the one seen below — a video that’s had more than 576,000 views?
Or when these Pages posted hot-button election-themed conspiracy memes like the News Wars Facebook video stills shown below?
Millions of dollars of technology, training time, and human resources were supposedly spent by Facebook to better monitor, catch, and remove…exactly things like this.
Jones was famously banned from the Facebook platform in August — in what he casts as a move by the so-called “globalist cabal” and elites who conspire to stop his broadcasts in their war to stamp out alternative views. Yet, after the censorship spectacle and publicity stunts of this past summer, Jones and the latest cohort of InfoWars Pages are now serving up millions of video views on not-so-secret channels, including Facebook Pages named “News Wars,” “InfoWars Stream,” and “InfoWars LIVE.”
The videos are now landing at the top of Facebook searches for election-related news and conspiracy topics. Jones’ official Facebook profile and the majority of large InfoWars Pages were removed the first week of August — the same week the News Wars and InfoWars Stream Pages suddenly resumed their video-sharing activities (see below).
The (loop)hole Truth: Transparency Relies on Accuracy in Reported Metrics and Measurement.
Transparency. Truth. Credibility. These aren’t just journalistic buzzwords: they are part of a multilateral, cross-continental call to arms in the war to “save democracy,” ensure election integrity, and preserve American media institutions. Curbing the rise of disinformation, politically-motivated hate speech and violence — and stopping the “bots” and other coordinated propaganda — has tended to revolve around three criteria:
- The accuracy of the claims
- The authenticity of the sources
- The appeal of the messages
Finding truth in the surrounding contexts of shared information, however, especially on platforms like Facebook, often involves less quantifiable methods. We need more common sense approaches to monitor bad behaviors and manipulation efforts. We should always take into account the loopholes that are being used by these bad actors, however simple or primitive they might seem, and consider the conditions of the information’s reception by its target audiences. If conspiracy videos are framed as “censored” and part of a “pirate radio” operation run by a celebrity “banned” actor, that framing is likely to make them all the more popular.
Granular enforcement isn’t just about reactive takedowns; it’s about proactive measures. It involves considering the factors — even the simple guerrilla marketing tactics — that play into how things like banned InfoWars live streams get further propagated.
From what I’ve seen in this extensive look into Facebook’s platform — especially the company’s capacity to deal with the misuse shown in the cases above, exactly two years after the end of the last election — I argue that common sense approaches to platform integrity and manipulation still appear to be less of a priority for Facebook than the publicity surrounding automated detection and removal.
One more example of this: Adding an “i” on algorithmically tagged news, often based solely on the associated domain, or adding “paid for by” labels on sponsored posts and political “ads” was a decent start. But it’s almost the end of 2018, and we’re still ending up with things like this:
Even for objectively false news like the Soros-YouTube-Facebook post seen above, there are crucial parts of Facebook’s initiative that aren’t really working. I’m also going to make the unpopular argument that shady foreign adversaries present less of a liability to the long-term integrity of the information sphere than the sort of ongoing cases I’ve shown in my last three posts looking at Facebook before the midterms.
The infinite gray area of information-sharing poses the real challenge: it’s the slippery soft conspiracy questions, the repetition of messages seen on shocking memes and statements like the “Soros Beto” caption above, and the emotional clickbait that’s regularly seen on Jones’ InfoWars video stills. Without granular enforcement, the non-foreign bad actors will only get better, and refine their tactics to increase Americans’ exposure to messages like this:
Information integrity is more than the scrutiny of provable statements or the linking of data to shared content with an “i.” Transparency involves more than verifying a Page administrator, putting the name alongside a date and voluntary disclosure for a paid ad campaign, and adding this data to a political ad archive.
As this third and final part of my look into Facebook is meant to drive home in a serious way, the real-world efficacy of information integrity initiatives — especially the ones launched by Facebook — is predicated on getting help from the American public in order to succeed. This help is conditional on two things:
- Being able to find the problem(s) on the platform; and
- Being able to trust the reported information, metrics and measurement data that we are provided with
This is where the notion of “accountability” is more useful than “transparency.” Accountability requires transparency; it’s not just about showing us some of the content, posts, and ads on Facebook — it’s about enabling access to this content, and about being sure that the information we’re getting from Facebook is accurate, and the techniques that were used to collect it are reliable.
The company removed 32 pages and accounts from Facebook and Instagram for "coordinated inauthentic behavior." — www.wired.com
Disinformation, conspiracies, and propaganda increasingly bloom from activities that succeed not because of “ads,” but because of inauthentic participation hidden inside opaque products like Facebook’s Groups.
Accountability necessitates access, and timely, accurate reporting. As I’ve emphasized in this study of Facebook at the end of the 2018 midterm elections, transparency means little if there’s not adequate reporting, platform access, and proactive granular enforcement to go along with it.
I’ve clearly singled out Facebook in this study. But if we can’t find the problems or trust the data we’re getting, then we can’t begin to understand what’s really going on — and help to fix it.