This is the first installment of “The Micro-Propaganda Machine,” a three-part analysis critically examining the issues at the interface of platforms, propaganda, and politics. Read Part 2: “The Shadow Organizing of Facebook Groups” and Part 3: “Facebook’s Failure to Enforce Its Own Rules.”


Before the 2018 U.S. midterm elections, I took an extensive look into the state of Facebook’s platform, and what I found was interesting—and terrifying. Three months and 1,000 screenshots later, I had collected more than 250,000 posts, 5,000 political ads, and historical engagement metrics for hundreds of Facebook pages and groups, using a diverse set of tools and data resources. Some of my findings were anticipated. Others were not.
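For readers who want a sense of the mechanics, below is a minimal sketch of the kind of ad archive query that fed this collection. It is illustrative rather than my exact pipeline: it assumes programmatic access to Facebook’s political ad archive through a Graph API endpoint along the lines of ads_archive, and the parameter and field names shown (ad_type, ad_reached_countries, funding_entity, and so on) are assumptions that may differ by API version and access level.

```python
import requests

# Hypothetical settings: the token, API version, and exact field names are
# assumptions and will vary by account and Graph API version.
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
AD_ARCHIVE_URL = "https://graph.facebook.com/v3.2/ads_archive"

def fetch_political_ads(page_id, country="US", limit=250):
    """Pull archived political ads run by a single page, following pagination."""
    params = {
        "access_token": ACCESS_TOKEN,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": f"['{country}']",
        "search_page_ids": page_id,
        "fields": "page_id,page_name,funding_entity,ad_creative_body,"
                  "ad_delivery_start_time,ad_delivery_stop_time",
        "limit": limit,
    }
    ads = []
    url = AD_ARCHIVE_URL
    while url:
        resp = requests.get(url, params=params)
        resp.raise_for_status()
        payload = resp.json()
        ads.extend(payload.get("data", []))
        # Follow the "next" cursor if more result pages exist.
        url = payload.get("paging", {}).get("next")
        params = {}  # the "next" URL already carries the full query string
    return ads
```

Rate limits are a real constraint on this kind of querying; the “too many requests” error shown later in this piece is what the archive returns when it has had enough.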

The takeaway? It’s not good. Many of the dangers pointed out years ago seem to have grown exponentially on Facebook. And the problem isn’t isolated to Facebook.

As I told NBC’s David Ingram and The Hill’s Ali Breland, the visibility of extreme content and hate speech on Instagram was possibly the worst I’d ever seen. To make matters worse, there’s been a disturbing pattern of online polarization and radicalization followed by acts of ethnically and ideologically directed violence.

This study, however, is specifically about what was happening on Facebook in the days before the 2018 midterm elections. I won’t attempt to draw other platforms into the analysis or theorize about the greater “why?” My work usually involves exploring cross-platform information flows and system-wide trends. I chose a narrower focus here because I had more than enough data to do it—and what I found was important.

There are no rules stipulating that page manager accounts must be connected to the actual purchaser of the ads.

After this undertaking, one startling impression is that it’s the scale of the problems, not the sum of the problems, that represents the greatest threat. The issues I’ve found on Facebook over the past few months—through large-scale analytics, content analysis, extensive political ad archive querying, and close inspection of thousands of posts and information-sharing activities—involve patterns that have been on the radar of the company’s leadership and American politicians since the 2016 election. They’ve been revisited in scores of hearings, broadcast on television, and recited around the country.

This project was extensive, so I’ve partitioned my findings into three parts, each focusing on a distinct set of challenges that face the platform. I see Facebook’s current challenges—in addressing the rise of extreme content, group conflict, and the coordinated manipulation of information—as best grouped into three categories: 1) recursive accountability, 2) shadow organizing, and 3) granular enforcement.

The findings presented in this first part speak directly to whether the company’s political transparency efforts and public accountability initiatives can succeed. The second part, focusing on shadow organizing, is meant to offer insights into coordinated efforts to manipulate users and the flow of information on Facebook. The third part looks at granular enforcement, focusing on the company’s challenges in enforcing its rules and terms of service.

Recursive ‘Ad-ccountability’

On a number of occasions, I found influential Facebook pages, including the verified pages of publishers and political funding groups, being managed by accounts based outside the United States. To clarify, these aren’t just politically themed pages with foreign managers, something BuzzFeed found and reported on in May. This is a much more concerning scenario: influential pages with foreign “manager” accounts that have been running extensive political ad campaigns on Facebook, targeting users in the United States over the past six months.

Some of these pages reported significant changes to the number of manager accounts and the locations of those accounts at the same time they ran targeted domestic political ad campaigns. There are known issues with Facebook’s political ad verification process, such as the fact that only one administrator per page needs to get “verified” for the page to be approved to run campaigns. And, of course, there are no rules stipulating that page manager accounts must be connected to the actual purchaser of the ads.

I found instances of influential pages with foreign manager accounts that have run targeted political campaigns without a “paid for” label.

The examples I’m sharing exhibit a pattern that reveals the structural “loopholes” in Facebook’s political ad disclosure system. The larger problem, however, which I’m calling “recursive ad-ccountability,” is that Facebook does not appear to have a rigorous protocol in place to regularly monitor pages running political campaigns after the initial verification.

A secondary theme in the ad campaigns I found with foreign page managers was the use of information-seeking “polls”—aka sponsored posts asking their target audiences, in this case U.S. Facebook users, to respond to questions about their ideologies and moral outlooks. Last but not least, I also found instances of influential pages with foreign manager accounts that have run targeted political campaigns without a “paid for” label.
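To make the “paid for” gap concrete: once ad records are collected, flagging the ones that ran without a disclaimer is a simple filter. The sketch below assumes records shaped like the output of the earlier collection sketch, with any disclaimer text stored under a hypothetical funding_entity key; that field name is an assumption, not a confirmed part of Facebook’s schema.

```python
def ads_missing_paid_for_label(ads):
    """Return archived ad records that carry no 'paid for' disclaimer.

    Assumes each record is a dict shaped like the results of the earlier
    fetch_political_ads() sketch, with the disclaimer text (if any) stored
    under a hypothetical 'funding_entity' key.
    """
    return [ad for ad in ads if not (ad.get("funding_entity") or "").strip()]

# Hypothetical usage, building on the earlier sketch:
# unlabeled = ads_missing_paid_for_label(fetch_political_ads(page_id="123"))
# print(f"{len(unlabeled)} ads appear to have run without a 'paid for' label")
```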

While the findings below might not represent “peak threats” to the stability of electoral processes on Facebook, they are serious and have long-lasting implications for the company’s accountability to the U.S. public.

After looking into Facebook’s platform, I found an alarming number of verified pages, including pages running large political ad campaigns, being managed by foreign accounts. I understand that Facebook’s verification process requires only one manager account to be verified, but I find this problematic.

Here’s why: On September 12, I observed the Facebook page for the publishing conglomerate Liftable—a company that is now the owner of the right-wing Western Journal and has a long record of media acquisitions—being managed by 100 accounts representing at least seven different countries.

All screenshots: the author

Barely one month later, on October 13, Liftable’s verified page reported being managed by only 44 accounts, all of which were based in the United States. Over the same period, Liftable ran political ads, including targeted socially themed “polls” that presented users with moral and ethical questions (see above). The company’s political ads, available through Facebook’s ad archive, appear to date all the way back to March 1, 2018.

I understand that larger digital publishers are likely to have a diverse range of needs in running their business operations, especially after a series of acquisitions. So while the example here is Liftable’s page, I see it as a prime illustration of the lack of recursive accountability on Facebook’s part.

Liftable Facebook page — Information and Ads, October 11, 2018.

Similar to the case of Liftable, on September 12, the right-wing Conservative Tribune page showed 46 managers, with one based in Malaysia. BuzzFeed reported on this before, but this study found that the verified page has also been running a long campaign of domestically targeted political Facebook “ads.”

Conservative Tribune Facebook page, September 12, 2018

Like Liftable’s, this campaign involved targeted posts with questions and polls. One of Conservative Tribune’s sponsored posts asked targeted users to respond with their opinion on a complex question around gun control. One post included a suggestive image of a banned AR-style assault weapon and referenced one of the country’s worst, and possibly most politically controversial, mass school shootings.

The Conservative Tribune page’s managers also changed: on November 2, it showed 42 page managers, all of whom were based in the United States.

Conservative Tribune Facebook page, November 2, 2018

Again, I understand publishers’ need for overseas contractors to assist with editing, design, and coding projects. But if only one account on a Facebook page gets verified to run political ads before an election, and we can’t even be sure who’s buying the ads beyond what they choose to type in a form field, then how are we supposed to interpret these kinds of results?

Facebook’s Political Ad Archive [too many requests] error message, November 2, 2018

The third example is a Facebook page that, like the others, has run ads with a foreign manager. This time, however, it’s the verified page of a political organizing group.

Below is the page for Reclaim New York, a nonprofit charitable organization that functions as a sort of statewide quasi-PAC. The group’s leadership is linked to a number of influential actors and funders, including the billionaire Mercer family, and Reclaim’s board members have included Steve Bannon and Kellyanne Conway—figures associated with both the 2016 Trump campaign and the now-defunct American branch of the Cambridge Analytica operation. The group also operates using similarly named pages, such as Reclaim NY, Reclaim NY Now, and The RNY Initiative.

On November 2, four days before the 2018 midterm elections, Reclaim New York’s Facebook page (seen above) still reported a manager from the United Kingdom. While this page was not running political ads at the time, it had been as recently as September.

Looking at Facebook’s ad archive, Reclaim New York had previously run a number of political ad campaigns, two of which ran without “paid for” labels. Both of those campaigns appear to have run over a period of nearly four months.

The examples below show two of the page’s unlabeled ads—targeted posts about struggles with income, tax burdens, and the cost of living in New York. Both posts refer Facebook users to Reclaim’s website to enter their income and living expenses into an “affordability calculator.”

While the funder of these ads is almost surely Reclaim New York’s corporate trust, the duration of the campaign, its reach to 25- to 44-year-old women in New York, and the purpose of the “affordability calculator” are noteworthy. This is yet another example of the many gaps in Facebook’s political ad system.

Example of a Reclaim New York Facebook page targeted “affordability calculator” post that ran without a “paid for” label.

The last example from my findings is LifeZette, a conservative lifestyle publisher catering to a millennial, female, family-oriented demographic. At the time of my first observation in September, as well as in a follow-up on November 2 (see below), the page was being managed by accounts based in Spain and the U.K. Like the previous examples, this verified page was also running an extensive political ad campaign at the time. One of its more than 37,000 political “ads” was a post targeting female American Facebook users with stories about addiction and rehabilitation.

LifeZette Facebook page, November 2, 2018

After encountering these huge discrepancies, I found it difficult to trust any of Facebook’s reporting tools or historical page information. Given the sweeping changes observed in less than a month for two of the pages, I knew the information reported in the follow-ups was likely to be inaccurate.

What does it actually mean that pages are managed by accounts in different countries one day and only one country (the United States) a few days later? What do changes in numbers of managers over short periods of time mean?
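Because the transparency panel reports only a current snapshot, with no history of when managers were added or removed, the only way I could track these changes at all was by comparing my own dated screenshots. Here is a minimal sketch of that comparison; the country-by-country numbers are placeholders for illustration, not the actual breakdowns observed for the pages discussed above.

```python
def diff_manager_snapshots(before, after):
    """Compare two dated snapshots of a page's reported manager locations.

    Each snapshot is a plain mapping of country -> number of manager
    accounts, transcribed by hand from screenshots of the page's
    transparency panel. Returns the per-country change between the two
    observations (positive = managers added, negative = removed).
    """
    changes = {}
    for country in sorted(set(before) | set(after)):
        delta = after.get(country, 0) - before.get(country, 0)
        if delta:
            changes[country] = delta
    return changes

# Placeholder numbers for illustration only; not the actual breakdowns
# observed for the pages in this article.
snapshot_sept = {"United States": 90, "Country A": 6, "Country B": 4}
snapshot_nov = {"United States": 44}
print(diff_manager_snapshots(snapshot_sept, snapshot_nov))
# -> {'Country A': -6, 'Country B': -4, 'United States': -46}
```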

Facebook’s political ad transparency tools—all of them—offer no real basis for evaluation. There is also no way to know the functions and differential privileges of these page “managers,” or to see the dates when managers are added to or removed from the pages.

Are managers outside the United States involved at any step in the targeted campaigns? That would include having access to campaign orders, Americans’ personal information, the Facebook Pixels and user ID trackers handled elsewhere, and any campaign response data obtained through poll responses and referrals to the page.

While the findings here are interesting cases, they shouldn’t be taken out of context.

First, I’m not suggesting that these pages are guilty of foreign election interference simply because they had foreign accounts managing them at the same time they ran targeted political campaigns on Facebook. I chose these example pages because the organizations they represent are influential, and the actors behind the pages are actively involved in American politics. All were observed being managed by a number of foreign accounts; all have run targeted political campaigns on Facebook within the past three months; and all but one page, Reclaim New York, were still running ads on November 2, less than a week before the midterm elections.

Three out of four campaigns shown in this study used targeted posts seeking additional information from users in the form of poll responses and self-reported income.

I also realize all these example pages happen to be associated with conservative media, funders, and organizations. I’ll be honest: I looked at several hundred political pages on Facebook and did not find any left-leaning or “liberal” pages that had the same kinds of issues. Of course, this does not mean they don’t exist; there are almost certainly similar cases to be found with left-leaning Facebook pages. I just didn’t come across them.

Liftable, Conservative Tribune, Reclaim New York, and LifeZette were used in this study because they are verified pages that illustrate many of the problems with Facebook’s political ad transparency initiative. Together, they show a consistent and troubling pattern of opaque reporting on the accounts that are managing the pages of influential actors who are spending money to actively shape U.S. politics and election results.

Three out of four campaigns shown in this study used targeted posts seeking additional information from users in the form of poll responses and self-reported income. In the cases of Liftable and Conservative Tribune, there were unreported changes to both the number of accounts and the originating countries of the page managers at the same time the pages were running political ad campaigns. LifeZette’s page has run campaigns since at least September with two foreign managers. As of November 2, it appears to have run 37,000 different versions of targeted political ads.

Changes to page names on Facebook can be viewed, but the most crucial information for political transparency purposes is the location and number of accounts actively managing these pages, and which of those accounts have access to campaign-related information, including page referrals, likes and follows, and the personal data of Americans frequently used for refined campaign targeting. Unfortunately, none of this information is provided.

For the pages in this study, the information reported by Facebook’s tools—including its Info and Ads feature and the data from its political ad archive—was not only inaccurate, it was misleading. The historical inconsistencies in reporting and the lack of rigor in the verification process are gaps that arguably render the company’s political ad transparency effort ineffective. That’s a problem that needs to be resolved.