Throughout the past month, peaceful democratic gatherings have been co-opted, online and offline, by extremists leveraging violence, destruction and disinformation in efforts to amplify polarization and hate. These manipulative, non-monolithic groups — including white supremacists posing as Antifa and anarchists pushing for wanton destruction — are seeking to divide and destroy our ability to peacefully protest from within.
The ongoing unrest across U.S. cities stems from legitimate protest, but extremist groups are working to co-opt tensions and inflame polarization through online manipulation. Meme-based (or memetic) radicalization tactics are fueling efforts to co-opt largely peaceful demonstrations through incendiary viral content and corresponding offline organization.
A recent tweet from an account with the handle “Antifa America” (see image below) describes an imminent plan to spread riots and looting into white U.S. suburbs. This message quickly spread through group chats and comment threads. The problem? The account was exposed as a hoax fabricated by the white supremacist group Identity Evropa. Twitter has since banned the account, and NBC quickly posted a debunk. The falsehood was uncovered — but the screenshots live on.
A variety of social media platforms are implementing search-and-destroy takedown efforts against anonymous political groups working to exacerbate division in both online and offline communities during the current protests. Memetic distribution, in which users take a quick screenshot and upload the image rather than sharing the original post, allows content to diffuse rapidly peer-to-peer without any link to the initial source. By hiding the chain of attribution, digital manipulators can trick unsuspecting users into spreading inauthentic content, evading content moderators and fact checkers with ease. Disinformation becomes misinformation.
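One reason screenshot diffusion is so hard to police — and one way researchers attempt to re-link scattered copies of the same image — can be sketched with a perceptual "difference hash" (dHash): near-identical screenshots of the same post map to nearby fingerprints even after recompression shifts pixel values. The toy implementation and tiny sample grids below are purely illustrative, not any platform's actual tooling.

```python
# Illustrative sketch: a "difference hash" fingerprints an image so that
# re-uploaded screenshots of the same post produce nearly identical hashes,
# letting analysts cluster copies whose metadata has been severed.
# All names and sample pixel data here are hypothetical.

def dhash(pixels):
    """pixels: 2D list of grayscale values. Emits one bit per horizontal
    pixel pair: '1' where a pixel is brighter than its right neighbor."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append('1' if left > right else '0')
    return ''.join(bits)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x5 "original post" and a lightly recompressed screenshot of it.
original   = [[10, 20, 15, 40, 35],
              [90, 80, 85, 60, 65],
              [30, 30, 50, 45, 70],
              [12, 18, 22, 21, 25]]
screenshot = [[11, 21, 14, 41, 34],   # pixel values shifted slightly
              [89, 81, 84, 61, 66],
              [31, 29, 51, 44, 71],
              [13, 17, 23, 20, 26]]

h1, h2 = dhash(original), dhash(screenshot)
print(hamming(h1, h2))  # small distance: likely the same underlying image
```

Because the hash encodes only relative brightness gradients, uniform compression noise barely perturbs it — which is exactly the property takedown teams need when the original post, and its metadata, are long gone.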
When most people think of digital manipulation they envision Russian-created troll accounts amplified by shadowy networks of bots and foreign operatives. However, social media-fueled unrest surrounding the protests has exposed the other side of disinformation: memetic radicalization, spread by home-grown groups, not just foreign actors. Rather than relying on basic automated accounts — political bots relatively easy to identify and remove from mainstream social media platforms — today’s actors have increasingly turned to memetic abstraction, methods for propagating messages without leaving a paper trail.
Here, a Facebook meme page alleges that police have placed pallets of bricks around protest sites in an effort to incite violence and delegitimize the demonstrations. In the comments, another screenshot from a separate Facebook post claims to offer evidence of the tactic in action — further reinforcing the allegation without actually providing a link, context or substantive proof.
Explicit instructions are embedded in the main image itself, spurring users to spread the image while severing any metadata and disguising the pattern of propagation. This call to action is designed to travel with the meme, boosting its effectiveness and longevity.
Researchers at organizations including Data & Society’s Media Manipulation Project, Harvard’s Technology and Social Change Project, and the social network analysis firm Graphika have pioneered this type of socio-technical analysis aimed at exploring how memes and other digital media are leveraged in efforts to co-opt, control, and coerce digital communication.
The posts we’ve extracted here exemplify the ways in which extremist actors are currently using memes to subvert legitimate movements, exacerbate stereotypes, and — perhaps most concerningly — instruct one another on how to spread untraceable false narratives to mass audiences.
Three key lenses for analyzing memetic internet content:
Access — What groups are circulating which content? Was the media distributed via a link, by sharing the actual image, or by other means? Was there a paywall for access?
Attribution — Who originally created the meme? Who posted the meme? Who took credit for the meme?
Ownership — What platform is the meme present on, who has monetized the distribution, who has control of the deletion of the post?
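As a hypothetical sketch of how the three lenses above might be applied in practice, the fields below record one meme under Access, Attribution, and Ownership. The schema, field names, and example values are illustrative, not drawn from any research group's actual methodology.

```python
# Hypothetical cataloguing schema for the three analytical lenses;
# every field name and example value here is illustrative.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MemeRecord:
    # Access: who circulated it, and in what form?
    circulating_groups: List[str]
    distribution_form: str            # "link", "screenshot", "re-upload", ...
    paywalled: bool
    # Attribution: creator, poster, and credit-taker may all differ.
    creator: Optional[str]            # None when the attribution chain is severed
    poster: str
    credited_to: Optional[str]
    # Ownership: where it lives, who profits, who can delete it.
    platform: str
    monetized_by: Optional[str]
    can_delete: List[str] = field(default_factory=list)

record = MemeRecord(
    circulating_groups=["example-meme-page"],
    distribution_form="screenshot",   # severs metadata, as described above
    paywalled=False,
    creator=None,                     # original source unknown
    poster="anonymous-account",
    credited_to=None,
    platform="facebook",
    monetized_by=None,
    can_delete=["platform", "poster"],
)
print(record.distribution_form, record.creator)
```

Structuring observations this way makes the telltale pattern of memetic laundering visible at a glance: `distribution_form="screenshot"` paired with `creator=None`.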
Extremists are employing these tactics on a large scale, disrupting and discrediting legitimate protest communications across Twitter, Reddit, YouTube and other platforms. Strategically deployed bot networks amplify decontextualized footage of violent actions by looters and police alike, co-opting isolated moments stripped of context. Other accounts make community-specific calls for bloodshed. One meme we’ve tracked calls for U.S. truckers to mobilize and plow down protestors who are blocking roads.
This video has spread through numerous distinct devices and channels, as shown by the multiple layered Snapchat overlays in the top right. This creates a virtually impossible-to-track chain of propagation around an incendiary claim about an attack on an elderly woman. Without context, content like this can exacerbate pre-existing stereotypes about protestors and foster further division between digital tribes.
There is a long and sordid history of the ‘outside agitator’ label being employed by those seeking to discredit protestors. Because of this, there are manifold reasons to be wary of that narrative. But it is true that extremist groups online are always working to perfect tactics of sowing polarization and hate. The fact remains that the lion’s share of today’s protests — on and offline — are wholly legitimate and seek to voice long-standing grievances about racial injustice. But those on the fringes prey on the quick online glance and share; they seek out susceptible individuals in efforts to gain a disproportionate amount of attention for disinformation and propaganda.
These malicious organizations — which include clear-cut white supremacist groups as well as armed militias that claim a host of seemingly contradictory beliefs — take advantage of political crises. They are particularly focused on recruiting young, often male, members. The tactics they employ are easy to recognize from our experience countering ISIS recruitment: wherever pent-up, frustrated youth with limited future prospects begin focusing their frustration on an invisible oppressor, malicious agents move in quickly to incite hate amongst them.
At the height of their influence in 2014 and beyond, Islamic State operatives using ‘honeypot’ accounts on social media platforms saw Western Muslim youth as ideal targets for digital radicalization because they face discrimination, unemployment and a scarcity of hope for the future. Since 2016, with a crescendo reached over the last week, similar cult-like tactics have been deployed by white nationalists, anarchists and foreign agents stoking division and extremism inside our communities.
When COVID-19 disrupted employment, education and dreams for the foreseeable future, it created a new pool of targets for radicalization in the United States. Many of the young people recently ‘recruited’ to these groups face spiking unemployment, interrupted education and — due to necessary quarantines — increasingly digital social lives. With fewer productive arenas where they can direct their ideas and energy, combined with increased screen time, new segments of America’s youth are being exposed to the soup of extremist memes online, including the ones captured here. The challenges they face certainly do not excuse their actions, but they highlight a larger trend: a need to belong accompanied by — and often concretizing — racist, homophobic, and xenophobic socio-political beliefs.
The tactics fueling the current digital firestorm are extremely difficult to detect, and seem to be profoundly effective in stoking hate and division. Preprocessed packages of content, purposefully tailored for memetic distribution and originating from unknown sources, are quickly picked up and spread by unwitting everyday users. The extremists and others coordinating these actions time and plan their posts to hijack social trends, with the ultimate goal of having these gamed trends picked up as fodder for news reports. It’s a cycle that moves from the fringes into the mainstream: what scholar Whitney Phillips has described as granting undue ‘oxygen’ and ‘amplification’ to antagonistic political groups.
It’s difficult to keep up with this kind of complex, memetic radicalization, but researchers and strategists around the world are actively developing new methods for countering such tactics. What resources are out there and where can we turn for help? Dr. Joan Donovan and her team at Harvard have produced excellent work on the weaponization of memes (including Meme War Weekly), and organizations including First Draft and the Columbia Journalism Review have built invaluable empirically-informed resources useful in disrupting the next wave of disinformation.
Co-author Ben Cook has designed a suite of strategic services and behavioral approaches to combatting disinformation and radicalization at a human level. This research is supported by the Propaganda Research Team at UT Austin’s Center for Media Engagement, which has recently launched new research projects investigating the manipulative political use of encrypted messaging applications and geo-location.