
The 2018 Facebook Midterms, Part II: Shadow Organizing

🗳🗳

In 2016, our discussions about Facebook and the election tended to focus mostly on Pages. And paid “ads.” Well, it’s 2018, and this time around we have another problem to talk about: Facebook Groups. In my extensive look into Facebook, introduced in the previous post, I’ve found that Groups have become the preferred base for coordinated information influence activities on the platform. The shift reflects the product feature that matters most to these actors: the posts and activities of those who join a Group are hidden within it. Well, at least until they choose to share them.

Inside these political Groups, which range from tens of thousands to hundreds of thousands of members, activities are effectively obscured. However, as I will show, the effects of these activities can be significant. The individual posts, photos, events, and files shared within these Groups are generally not discoverable through the platform’s standard search feature, or through the APIs that allow content to be retrieved from public Facebook Pages. Yet once the posts leave these Groups, they can gain traction and initiate large-scale information-seeding and political influence campaigns.

As a result, the actors who used to operate on Pages have now colonized Groups and use them more than ever. This analysis found disinformation and conspiracies being seeded across hundreds of different groups, most falling into what would best be described as political “astroturfing.”

Members often reappear across related political Groups, and in some cases play roles as curators and “translators,” reminding others to screenshot and cut-and-paste shared content into their “own posts” to circumvent Facebook’s automated detection mechanisms, to avoid censorship, and to prevent future attribution of their posts. An example is shown below.
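To see why that screenshot-and-repaste step works, consider how naive re-share detection might operate. The sketch below is purely illustrative (Facebook’s actual detection systems are not public, and the file names are hypothetical): an exact byte-level hash breaks the moment an image is screenshotted or re-encoded, while a perceptual hash would still link the copies.

```python
# Illustrative only: why screenshot-and-repaste defeats exact-match
# detection. Facebook's real systems are not public; the file names
# below are hypothetical. Requires the Pillow and imagehash packages.
import hashlib

from PIL import Image
import imagehash

def exact_fingerprint(path: str) -> str:
    """Byte-level hash: changes completely after a screenshot or re-encode."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def perceptual_fingerprint(path: str) -> imagehash.ImageHash:
    """Difference hash: stays nearly identical after re-encoding or resizing."""
    return imagehash.dhash(Image.open(path))

original, repost = "meme.png", "meme_screenshot.png"

# A naive exact-match filter sees two unrelated files...
print(exact_fingerprint(original) == exact_fingerprint(repost))   # False

# ...while the perceptual hashes remain a small Hamming distance apart.
print(perceptual_fingerprint(original) - perceptual_fingerprint(repost))
```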

🃏Part II: Shadow Organizing

This complex platform manipulation problem, which I’m calling “shadow organizing,” is the second of three categories of challenges I’m focusing on in my look at Facebook’s platform before the midterm elections. In this post, I walk through a number of examples of how political organizing efforts have shifted on Facebook since 2016.

I’ll start by being blunt: seeding political ideas and conspiracies on Facebook has never been that difficult. But those seeds used to be easier to find. Groups change this dynamic, leveling the playing field for those who seek to peddle unreliable information, hyper-partisan news, rumors, and conspiracy theories. As of 2018, Groups play a major role in manipulation, helping to push ideas in the right place and at the right time across the Facebook platform at large.

The best example of this type of Group operation succeeding in campaign form is the “Soros-funded caravan” rumor. On Twitter, it was a fairly straightforward process to trace the original posts. And if it got too difficult, you could always buy all the data. On Facebook, however, despite a great deal of cross-platform sharing and screenshot-dropping, most of the posts and replies that appeared after the initial reports of the first migrant “caravan” (at the end of March, as shown below) were found only within Groups.

Searching with the regular search feature, or monitoring all the Pages that might have re-shared these posts and pulling them through Facebook’s API, does not lead you to them. I’ll say it again: the earliest of the publicly shared seeds of the Soros “caravan” rumor are found only within Facebook Groups.

Group posts suggesting the “Soros-migrant caravan” funding link on 31 March 2018
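To make the discoverability gap concrete, here is a minimal sketch of what an outside researcher can and cannot pull through the Graph API. The IDs and token are placeholders, and the behavior assumed is that of the v3.x Graph API as documented in 2018: public Page feeds come back with just an access token, while a closed Group’s feed returns a permissions error unless the app has member-level access.

```python
# Sketch: what an outside researcher can pull from the Graph API.
# PAGE_ID, GROUP_ID, and ACCESS_TOKEN are placeholders. Public Page
# feeds are retrievable with a token; Group feeds generally require
# member-level permissions and app review, so that call errors out.
import requests

GRAPH = "https://graph.facebook.com/v3.2"
ACCESS_TOKEN = "REPLACE_WITH_TOKEN"

def fetch_feed(object_id: str) -> dict:
    """Request the /feed edge for a Page or Group ID."""
    resp = requests.get(
        f"{GRAPH}/{object_id}/feed",
        params={"access_token": ACCESS_TOKEN,
                "fields": "message,created_time"},
    )
    return resp.json()

print(fetch_feed("SomePublicPageID"))   # posts: message + created_time
print(fetch_feed("SomeClosedGroupID"))  # OAuth/permissions error payload
```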

Groups represent a huge tactical shift for influence operations on the platform. They allow bad actors to reap all of the benefits of Facebook, including its free unlimited photo and meme-image hosting, its Group-based content and file sharing, its audio, text, and video “Messenger” service, its mobile phone and app notifications, and all the other powerful free organizing and content-promotion tools, with few, if any, of the consequences that might come from doing this on a regular Page, or from sharing things out in the open.

Furthermore, these Group influence operations are being indirectly sponsored by Facebook’s advertisers. It also means that bad actors have less need to spend large amounts of money on digital organizing infrastructure, file-sharing space and servers, or traditional marketing campaigns on the platform to reach and influence American voters.

Facebook’s Groups offer all of the benefits with none of the downsides. Posts shared to a Group are essentially private until users take strategic action to make them public. For an example of this, see the enlightening Group post below:


As Facebook’s policing of its open platform began to clamp down on the most obvious actors and fake Pages following the last election, it was only a matter of time until the bad actors moved into Groups and started using them to coordinate their political influence operations and information-manipulation campaigns. We’re there now.

👓Research and Facebook Groups

An accountability consequence of the move by bad actors into Facebook Groups is that researchers who aren’t sponsored, Facebook-sanctioned, or privileged with special inside access are relegated to tracing the spread of disinformation, hate speech, and political propaganda operations with a stack of mostly third-party tools. As mentioned earlier, to locate the earliest posts about the Soros “caravan” rumor on Facebook, I had to use a large stack of different tools and data-sifting techniques.

It was only after going through thousands of posts across dozens of Facebook Groups that I found some (but not all) of the early seeder accounts responsible for the Soros-funded “caravan” conspiracy on the platform, a controversy that has become one of the pivotal election issues and has conveniently reopened the immigration and “open borders” mega-controversy right at the end of the 2018 midterm campaign.
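As a rough illustration of the sifting step (this is not my exact pipeline; the CSV file and its column names are hypothetical stand-ins for what a third-party collection tool might export), the core operation is just keyword filtering and sorting by timestamp to surface the earliest candidate “seed” posts:

```python
# Sketch of the data-sifting step: surface the earliest posts that
# mention the rumor across all collected Groups. The CSV and its
# column names are hypothetical stand-ins for a collection tool's export.
import pandas as pd

posts = pd.read_csv("collected_group_posts.csv",
                    parse_dates=["created_time"])

# Case-insensitive keyword filter for rumor-related language.
pattern = "soros|caravan|open borders"
matches = posts[posts["message"].str.contains(pattern, case=False, na=False)]

# The earliest matching posts are the candidate "seeds" worth tracing.
seeds = matches.sort_values("created_time").head(20)
print(seeds[["created_time", "group_name", "message"]])
```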

To recap the second set of challenges, shadow organizing, my findings show that Groups have taken Facebook’s information battle from out in the open — and on Pages — into rooms in the basement. Manipulation activities are buried inside thousands of Groups, and a large number of “Groups” are operating as opaque subpages of regular — sometimes banned — Facebook Pages. Many of these groups are “closed,” and many show the exact same managers and administrators as their sponsor Pages.

This means that in 2018, the sources of misinformation and the origins of conspiracy-seeding efforts on Facebook are becoming invisible to the public (meaning anyone working outside of Facebook). Yet the American public is still left reeling from the consequences and is tasked with dealing with the effects. The actors behind these Groups, whose inconspicuous astroturfing operations play a part in seeding discord and sowing chaos in American electoral processes, are surely aware of this fact.

🎉Unmoderated Groups

Over the past couple of weeks, I’ve seen a pattern of Groups without any admins or moderators. These “no admin” Groups are a wonderful asset for shadow political organizing on Facebook, and an increasingly popular way to push conspiracies and disinformation. Unmoderated Groups, often with tens of thousands of users interacting, sharing, and posting with one another without a single active administrator, are allowed; I confirmed it. They are even documented in Facebook’s FAQs. See the example below:

Example of large political Facebook Group with no admins or moderators. 1 Nov 2018.

🆘Groups and Extreme Content

As you might expect, the posts and conversations in these Facebook Groups appear to be even more polarized and extreme than what you’d typically find out on the “open” platform. And a fair portion of the activity appears to be organized. After going through several hundred Facebook Groups that have been successful in seeding rumors and pushing hyper-partisan messages and political hate memes, I repeatedly encountered extreme content and hate speech that clearly violates Facebook’s terms of service and community standards. Some of the examples I found are shown below:

📝Shadow Organizing: Discussion

The move to these unmoderated and closed Groups means that bad actors have indeed had a “great awakening”: a wonderful new opportunity to influence others using Facebook. Even if you can find the real profiles that have been seeding disinformation (or one of the many duplicate profiles they’ve managed to buy or otherwise control), and even if all of their posts are public, posts to fully public Groups can easily be set so they do not show up in the users’ regular timelines.

Example of meme and commentary found in an influential political Facebook Group in early November.

It’s the perfect storm: users’ activities and posts within Groups are kept private, but the potential for their messages and memes to reach everyone else is still there. Unlike on other social platforms such as Twitter, Reddit, WhatsApp, and Instagram, Groups on Facebook offer all of the advantages of selective access to the world’s largest online public forum. So, we can talk about how scary WhatsApp is in other countries, and how Twitter might play a leading role in United States elections, but it is Facebook’s Groups, right here, right now, that I feel represent the greatest short-term threat to election news and information integrity. Groups like the ones shown below, without a single admin or moderator, are popping up everywhere.

It’s like the worst-case scenario from a hybrid of 2016-era Facebook and an unmoderated Reddit.

Example of Facebook Groups that haven’t replaced the default header image. Most have no moderator or admin accounts.

It’s obvious to me that there has been a large-scale effort to push messages out from these Facebook Groups into the rest of the platform. I’ve seen an alarming number of influential Groups, most listing memberships in the tens of thousands of users, that seek to pollute information flows using suspiciously inauthentic but clearly human-operated accounts. They don’t spam messages the way “bots” do; instead, they engage in stealth tactics such as “replying” to other group members’ posts and profiles with “information.”

A variety of elementary cryptography tricks and activity-obscuring schemes are also at work within these Groups. Simple stuff. Smoke and info-mirrors that get people engaged and give them the impression that it’s secret, and that they are part of some kind of clandestine operation for the greater good. It’s an ingenious scheme: a political marketing campaign for getting the ideas you want out there at exactly the right time.
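To give a sense of how elementary “simple stuff” means here, the snippet below shows a ROT13 letter shift. This is a purely hypothetical example of the sophistication level involved, not a claim that any particular Group used ROT13 specifically.

```python
# Purely hypothetical example of an "elementary cryptography trick":
# a ROT13 letter shift. Illustrates the level of sophistication only;
# no claim that any particular Group used ROT13 specifically.
import codecs

plaintext = "share before they delete it"
encoded = codecs.encode(plaintext, "rot13")

print(encoded)                          # funer orsber gurl qryrgr vg
print(codecs.decode(encoded, "rot13"))  # round-trips to the original text
```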

You don’t need to go digging through Reddit, 4chan, 8chan, or crypto chat apps for these things anymore. You’ll see them everywhere in political Facebook Groups:

❓What’s the Problem with Groups?

How has the use of Groups changed since the last election? Well, Group-based organizing has always been there; it has just become much larger. The recent shift into Groups presents a multifaceted problem. The posts and initial coordination that seed conspiracies have become less discoverable, and the actors working behind the scenes to push these things are less visible. These things are also less likely to be found by independent researchers and journalists, so what happens within Groups presents less of a liability for Facebook’s communications, legal, and PR teams. On the surface, what happens inside Groups is less “bad.”

Groups present yet another challenge for solving the underlying platform problem: they function as an anti-transparency feature, inhibiting much of the help that Facebook might get in its long-term battle against bad actors, especially for finding those responsible for polluting and disrupting the environment around shared information and politics on the platform. This help includes assistance from regular citizens, government and state agencies, professional journalists, and independent researchers.

Facebook Group-originated disinformation and “hate meme”-seeding campaigns mean there’s no longer a clear path to identifying the root causes, including the sources, tactical mechanisms, accounts, replies, posts, and shares, that contribute to these conspiracies or facilitate the polarization that has ended in some of the tragic consequences we’ve seen recently in places such as Florida and Pittsburgh. Sure, the trails can be found, but the origins of the campaigns are increasingly shrouded inside thousands of active Facebook Groups. What’s more, most of these Groups have been populated with a perfect mix of real and inauthentic user accounts.

🎤Conclusion

The takeaway from the second part of this study: two days before the 2018 midterm elections, Facebook doesn’t seem to be any “better.” Due to the size of its user base and the sheer complexity of its problems, products, and platform design, it has a long and winding road ahead.

As of late, Facebook’s platform has been flooded with hate-mongering political rumors and seeded conspiracies. Facebook’s Groups have been populated by political astroturfing “rally” armies. These faux-grassroots efforts are pushing out massive quantities of suspiciously pro-American, anti-migrant, Soros-funds-the-caravan, “Defend our Border,” and “anyone-but-the-Globalists” memes, which frequently end up spreading far beyond the Groups where they originate.

Now, I can’t speak to whether it’s worse in quantity, or whether the percentage of hate content, extremist memes, and political propaganda in Groups is proportionally higher or lower than what we saw on Pages the last time around. I can’t make these types of claims in the first place, since we have no idea how much content, or how many accounts, Facebook has already flagged and taken down, or how posts are being de-promoted and down-ranked internally by Facebook’s own teams.

But it seems to me that Groups are the new problem, enabling a form of shadow organizing that facilitates the spread of hate content, outrageous news clips, and fear-mongering political memes. Once posts leave these Groups, they are easily encountered and, dare I say it, algorithmically promoted by users’ “friends,” who are often fellow group members, resulting in the content surfacing in news feeds faster than ever before. And unlike on Instagram and Twitter, this type of fringe, even obscene, sensationalist political commentary and conspiracy-theory seeding is far less discoverable at its source.

While automation surely plays a role in the amplification of ideas and shared content on Facebook, the manipulation happening right now isn’t because of “bots.” It’s because of humans who know exactly how to game Facebook’s platform. And this time around, we saw it coming, so we can’t simply shift the blame to foreign interference. After the midterm elections, we need to look closely and press for more transparency and accountability around what has happened as bad actors moved into Facebook Groups.