This is the second installment of The Micro-Propaganda Machine, a three-part analysis critically examining the issues at the interface of platforms, propaganda, and politics.


In 2016, discussions about Facebook and the election tended to focus mostly on pages and paid ads. Well, it’s 2018, and this time around, we have another problem to talk about: Facebook groups.

In my extensive look into the platform, I’ve found that groups have become the preferred base for coordinated influence operations on Facebook. This shift reflects the product’s most important advantage: The posts and activities of the actors who join a group are hidden within it. Until they choose to share them, that is.

Inside these political groups—numbering anywhere from tens of thousands to hundreds of thousands of members—activities are effectively obscured. However, the effects and consequences of these activities can be significant. The individual posts, photos, events, and files shared within these groups are generally not discoverable through Facebook’s standard search feature or through the APIs that allow content to be retrieved from public pages. Yet once posts leave these groups, they can gain traction and drive large-scale information seeding and political influence without being easily traced.
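To make that asymmetry concrete, here is a minimal sketch (Python, using the requests library) of how content retrieval through Facebook’s Graph API differs for public pages versus groups. The API version, page ID, group ID, and access token are hypothetical placeholders, and this is an illustration of the general pattern rather than a complete research workflow.

```python
import requests

# Hypothetical placeholders -- a real app, token, and IDs would be needed.
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
PAGE_ID = "SomePublicPage"   # public pages expose their posts to any approved app
GROUP_ID = "1234567890"      # groups generally do not

GRAPH = "https://graph.facebook.com/v3.2"

def fetch_feed(object_id):
    """Request the /feed edge for a page or group and return the JSON response."""
    resp = requests.get(
        f"{GRAPH}/{object_id}/feed",
        params={
            "access_token": ACCESS_TOKEN,
            "fields": "message,created_time,permalink_url",
        },
    )
    return resp.json()

# Works for a public page: returns the posts on the page's timeline.
print(fetch_feed(PAGE_ID))

# Returns an error for a group unless the app has been installed and approved
# by a group admin with the relevant Groups API permissions -- which is why
# group-seeded posts stay out of reach for outside researchers.
print(fetch_feed(GROUP_ID))
```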

As a result, actors who used to operate on pages have now colonized groups, and they’re using them more than ever. My analysis found disinformation and conspiracies being seeded across hundreds of different groups—most falling into what would best be described as political “astroturfing.” (Astroturfing is when an organization gives the appearance of being a grassroots effort started by everyday citizens when, in reality, it’s funded by motivated sponsors.)

Members of groups often reappear across related political groups and, in some cases, play roles as curators and “translators,” reminding others to screenshot and cut-and-paste shared content into their own posts to circumvent Facebook’s automated detection mechanisms, to avoid censorship, and to prevent future attribution of their posts.

All screenshots: the author

This complex platform manipulation problem, or “shadow organization,” is the second of three categories of challenges I identified while looking at Facebook’s platform before the midterm elections. The examples that follow illustrate an emerging trend in how political organizing efforts have shifted on Facebook since 2016.

To be blunt, seeding political ideas and conspiracies on Facebook has never been difficult. But those efforts used to be easier to trace. Facebook groups change this dynamic, leveling the playing field for those who seek to peddle unreliable information, hyperpartisan news, rumors, and conspiracy theories. As of 2018, groups play a major role in manipulation, helping to push ideas in the right place at the right time across the Facebook platform.

The best example of this type of group operation succeeding is the “Soros-funded caravan” rumor. On Twitter, it was a fairly straightforward process to trace the original posts. On Facebook, however, most of the posts and replies that appeared after the initial reports of the first migrant “caravan” (at the end of March) were only found within groups.

Searching with the regular search feature, or monitoring all the pages that might have reshared these posts and pulling them through Facebook’s API, does not lead you to them. The earliest of the publicly shared seeds of the Soros caravan rumor can only be found within Facebook groups.

Group posts suggesting the “Soros-migrant caravan” funding link, March 31, 2018

Groups represent a huge tactical shift for influence operations on the platform. They allow bad actors to reap all the benefits of Facebook—including its free unlimited photo and meme image hosting; its group-based content and file sharing; its audio, text, and video messenger service; mobile phone and app notifications; and all the other powerful free organizing and content-promotion tools—with few, if any, of the consequences that might come from doing this on a regular page or by sharing things out in the open.

Furthermore, these influence operations are being indirectly sponsored by Facebook’s advertisers. This means bad actors have less need to spend large amounts of money on digital organizing infrastructure, file-sharing space and servers, or traditional marketing campaigns on the platform to reach and influence American voters.

Facebook’s groups offer all the benefits with none of the downsides. Posts shared to the group are essentially private until the time comes when users take strategic action to make them public.


As Facebook’s policing of its open platform began to clamp down on the most obvious actors and fake pages following the 2016 election, it was only a matter of time until the bad actors moved into groups and started using them to coordinate their political influence operations and manipulation campaigns.

It’s happening now.

An accountability consequence of the move into Facebook groups is that researchers who aren’t sponsored or Facebook-sanctioned and privileged with special inside access are relegated to tracing the spread of disinformation, hate speech, and political propaganda operations with a stack of mostly third-party tools. To locate the earliest Facebook posts about the Soros caravan rumor mentioned above, I had to use a large stack of different tools and data-sifting techniques. Only after going through thousands of posts across dozens of Facebook groups did I find some (but not all) of the early seeder accounts responsible for the Soros-funded caravan conspiracy on the platform—a controversy that ended up being a pivotal election issue and conveniently reopened the immigration and “open borders” mega-controversy right ahead of the 2018 midterm elections.
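To give a sense of what that data sifting looks like in practice, here is a hypothetical sketch in Python. It assumes the group posts have already been exported (by whatever combination of collection tools) into simple records with a message, an author, a group name, and a timestamp; the field names, sample records, and keywords are illustrative assumptions, not Facebook’s schema. Surfacing candidate “seeder” posts then reduces to keyword filtering and sorting by time.

```python
from datetime import datetime

# Hypothetical export: each record is one post collected from a group.
# Field names and sample content are for illustration only.
posts = [
    {"group": "Example Group A", "author": "user_1",
     "message": "Who is funding this caravan? Soros?",
     "created_time": "2018-03-31T14:02:00"},
    {"group": "Example Group B", "author": "user_2",
     "message": "Caravan heading north -- share everywhere",
     "created_time": "2018-04-02T09:45:00"},
    # ...thousands more records from dozens of groups...
]

# Rumor-related terms to match (illustrative).
KEYWORDS = ("soros", "caravan", "open borders")

def is_rumor_related(post):
    text = post["message"].lower()
    return any(keyword in text for keyword in KEYWORDS)

# Filter to rumor-related posts, then sort oldest-first to surface the earliest seeds.
candidates = sorted(
    (p for p in posts if is_rumor_related(p)),
    key=lambda p: datetime.fromisoformat(p["created_time"]),
)

for p in candidates[:10]:
    print(p["created_time"], p["group"], p["author"], p["message"][:60])
```

Even with this kind of filtering, attribution is only as good as the collection step that precedes it, and that collection step is exactly what groups make difficult.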

My findings show that groups have taken Facebook’s information battle from out in the open (on pages) into hidden rooms (groups) in the basement. Manipulation activities are buried inside thousands of groups, and a large number of groups are operating as opaque subpages of regular—sometimes banned—Facebook pages. Many of these groups are closed, and many show the exact same managers and administrators as their sponsor pages.

So the sources of misinformation and the origins of conspiracy-seeding efforts on Facebook are becoming invisible to the public—meaning anyone working outside of Facebook. Yet the American public is still left with the consequences of how the platform is used. The actors behind these groups, whose inconspicuous astroturfing operations play a part in seeding discord and sowing chaos in American electoral processes, are surely aware of this.

I’ve also seen a pattern of groups without any admins or moderators. These “no admin” groups are a wonderful asset for shadow political organizing on Facebook. They are an increasingly popular way to push conspiracies and disinformation without consequence. And unmoderated groups—often with tens of thousands of users interacting, sharing, and posting with one another without a single active administrator—are allowed. They are even documented in Facebook’s FAQs.

Example of large political Facebook Group with no admins or moderators, November 1, 2018

As you might expect, the posts and conversations in these Facebook groups appear to be even more polarized and extreme than what you’d typically find out on the open platform. And a fair portion of the activity appears to be organized. After going through several hundred Facebook groups that have been successful in seeding rumors and pushing hyperpartisan messages and political hate memes, I repeatedly encountered examples of extreme content and hate speech that easily violate Facebook’s terms of service and community standards.

The move to these unmoderated and closed groups means that bad actors have had a great awakening, a wonderful new opportunity to influence others using Facebook. Even if you can find the real profiles that have been seeding disinformation—or one of the many duplicate profiles they’ve managed to buy or otherwise control—and even if all of their posts are public, posts made to fully public groups can easily be set so they do not show up in those profiles’ regular timelines.

Example of meme and commentary found in an influential political Facebook group, early November

It’s the perfect storm: Users’ activities and posts to groups are kept private, but the potential for their messages and memes to reach everyone else remains. This means that, unlike on other social platforms such as Twitter, Reddit, WhatsApp, and Instagram, groups on Facebook have all the advantages of selective access to the world’s largest online public forum.

We can talk about how scary WhatsApp is in other countries and how Twitter might play a role in U.S. elections, but it’s Facebook’s groups—right here, right now—that represent the greatest short-term threat to election news and information integrity. Groups like the ones shown below, without a single admin or moderator, are popping up everywhere.

It’s like the worst-case scenario from a hybrid of 2016-era Facebook and an unmoderated Reddit.

Example of Facebook groups that haven’t replaced the default header image. Most have no moderator or admin accounts.

There has obviously been a large-scale effort to push messages out from these Facebook groups onto the rest of the platform. An alarming number of influential groups, most of which list their membership in the tens of thousands, seek to pollute information flows using suspiciously inauthentic but clearly human-operated accounts. They don’t spam messages like what you’d see with bots; instead, they engage in stealth tactics such as replying to other group members’ profiles with “information.”

A variety of elementary cryptography tricks and activity-obscuring schemes are also happening in these groups. Simple stuff. Smoke and info-mirrors get people engaged and give them the impression that it’s secret, that they’re part of some kind of clandestine operation for the greater good. It’s an ingenious scheme: a political marketing campaign for getting the ideas you want out there at exactly the right time.

You don’t need to go digging in Reddit or 4Chan or 8Chan for these things anymore. They’re everywhere in political Facebook groups.

Facebook groups have always been there, but since the 2016 election they’ve become much larger. The shift into groups presents a multifaceted problem because the posts and initial coordination that seed the conspiracies have become less discoverable and the actors working behind the scenes are less visible. What happens inside these groups is less likely to be found by independent researchers and journalists, so it presents less of a liability for Facebook’s communications, legal, and PR teams.

Thus, groups present yet another challenge for solving the underlying platform problem: They function as an anti-transparency feature, inhibiting much of the help that Facebook might get in its long-term battle against bad actors and in finding those responsible for polluting and disrupting the environment. This help includes assistance from regular citizens, government and state agencies, professional journalists, and independent researchers.

Facebook group-originated disinformation and hate meme campaigns mean there’s no longer a clear path to identifying the root causes—including the sources, tactical mechanisms, accounts, replies, posts, and shares—that contribute to these conspiracies or facilitate the polarization that has led to tragic consequences, such as we’ve seen recently in Florida and Pittsburgh. Sure, the trails can be found, but the origins of the campaigns are increasingly shrouded inside thousands of active Facebook groups. What’s more, most of these groups have been populated with a perfect mix of real and inauthentic user accounts.


The takeaway is this: Two years have passed since the 2016 election, and Facebook doesn’t seem to be any better equipped to handle these problems. Given the size of its user base and the sheer complexity of its problems, products, and platform design, it has a long and winding road ahead.

In the run-up to the 2018 midterm elections, Facebook’s platform was flooded with hate-mongering political rumors and seeded conspiracies. Facebook’s groups have been populated by political astroturfing “rally” armies. These faux-grassroots efforts push out massive quantities of suspiciously pro-American, anti-migrant, “Soros-funded caravan,” “defend our border,” and “anyone-but-the-globalists” memes, which frequently end up spreading throughout Facebook.

It remains unclear whether the sheer quantity is worse, or whether the proportion of hate content, extremist memes, and political propaganda in groups is higher or lower than what we saw on pages. After all, we have no idea how much content or how many accounts Facebook has flagged and taken down, or how posts are being de-promoted and down-ranked internally. But it seems that groups are the new problem—enabling a new form of shadow organizing that facilitates the spread of hate content, outrageous news clips, and fearmongering political memes. Once posts leave these groups, they are easily encountered and algorithmically promoted, surfacing in news feeds faster than ever before.

While automation surely plays a role in the amplification of ideas and shared content on Facebook, the manipulation that’s happening right now isn’t because of “bots.” It’s because of humans who know exactly how to game Facebook’s platform. And this time around, we saw it coming. We can’t just shift the blame over to foreign interference. We need to look closely and press for more transparency and accountability for what’s been happening in Facebook groups.