How Tech and Media Enabled a White Supremacist Coup

Jessie Daniels, PhD
Data & Society: Points
5 min read · Jan 13, 2021
Photograph of the Capitol building. Image via @Josenothose — Unsplash

On January 6, an armed mob of Trump supporters, Proud Boys, QAnon conspiracy theorists, and assorted white supremacists entered the Capitol building, overrunning the modest security forces.

While many people expressed shock and named the rampage “unprecedented,” historian and writer Jelani Cobb cautioned against that characterization when he tweeted, “Before anyone calls this unprecedented please don’t. White supremacists have attempted to (and succeeded in) overthrowing elections previously. Especially when they believe Black people have too much political power. Start with the Wilmington coup in 1898.” The coup attempts we are witnessing now, the one on January 6, 2021, and the ongoing attacks on state capitols around the U.S., are part of this history. They differ only because of the algorithmic internet, which acts as an accelerant for white supremacy.


Cobb is right, of course. The mob that erected a noose and gallows outside the Capitol last week, chanting “Hang Mike Pence” through its halls, is the latest, and certainly not the last, iteration in a long history of white attacks on government. And more often than not, white people have violently attacked Black, Indigenous, and other people of color who they believe have risen above their station in life. During the Red Summer of 1919, hundreds of Black people were lynched, and thousands of Black-owned homes and businesses were destroyed. In places such as Washington, D.C.; Tulsa, Oklahoma; and Rosewood, Florida, white people burned whole neighborhoods to the ground rather than let Black people prosper.

What’s different this time is that we have the algorithmic internet. The ability to be connected all the time through social media and mobile phones has enabled not only the wisdom of crowds and smart mobs; these technologies have also enabled tech-augmented lynch mobs. And, unlike the mobs of a previous era, today’s white supremacy is globally networked, a fact I pointed out in my 2009 book, Cyber Racism.

These present-day lynch mobs have been incited to violence over the last four years through a network of right-wing platforms, such as Parler, and media outlets, such as Fox News, America’s most-watched cable news channel, which are funded by billionaires like Rebekah Mercer and Rupert Murdoch. The current tech/media ecosystem, combined with violent white supremacy, is what’s truly unprecedented about the current moment.


In the early days of the popular internet, experts said we were moving from a “one-to-many” model of broadcast communication (like the big three television networks, ABC, CBS, NBC) to a “many-to-many” paradigm where we’re all sharing peer-to-peer (anyone remember Limewire?). But the ecosystem we live in now combines these and weaponizes them. The more outrageous a tweet from a powerful politician, the wider the coverage it receives from traditional media outlets like the New York Times and the cable news networks, creating a symbiotic relationship that has come to define the moment we’re in. Within that ecosystem, conspiracists toil 24/7 to decode mainstream news for confirmation of their wild-eyed theories about what “they don’t want you to know.”

Long before the events at the Capitol last week, President Trump was adept at using both white supremacy and the tech/media ecosystem. We’ve even seen this adeptness result in incitement to violence against members of Congress before.

In April 2019, President Trump tweeted an edited video of a speech by Rep. Ilhan Omar, a Muslim congresswoman from Minnesota, intercut with footage of the attacks on the World Trade Center on 9/11. The video takes a quote from a longer speech, remixes it with some of the most graphic images from 9/11, and packages it into a menacing visual collage that is both dehumanizing and dangerous. When Trump tweeted the unsourced video to his millions of followers, many saw it as incitement to violence against Omar. His attack on her came just weeks after the Christchurch, NZ massacre that left 50 Muslim worshippers dead, carried out by a shooter who said he took inspiration from Trump. Here in the U.S., hate crimes increased each year following Trump’s election. Certainly, Trump’s tweets are correlated with this rather than a direct cause, but there is research to suggest the correlation matters.

Susan Benesch, in her research on what she calls dangerous speech, writes that “when an act of speech has a reasonable chance of catalyzing or amplifying violence by one group against another, it is Dangerous Speech.” She has studied the phenomenon in countries around the world and finds a consistent pattern in the way that political rhetoric turns the corner toward violence, and then genocide. Tech and media have the power to speed up the spread of dangerous speech and amplify it.


After the violent mob at the Capitol, Twitter finally banned Trump permanently from its platform. For years, many researchers, reporters, and activists have warned that his account there was dangerous and argued that deplatforming works to decrease harm. In 2018, when Alex Jones’s InfoWars was more or less simultaneously banned by YouTube, Spotify, Apple, and Facebook, his following began to decline, and this fits a pattern. According to researcher Joan Donovan, “the falloff is pretty significant and they don’t gain the same amplification power they had prior to the moment they were taken off these bigger platforms.”

Because Twitter dithered for four years, and harvested lots of monthly active users in the process, the deplatforming of Trump may be too little, too late. Trump’s position at the top of that platform, with 88 million followers, was the cigarette that set a whole forest ablaze with insurrectionists. We are all now trapped in this conflagration of white supremacy enabled by media and tech. Violent white supremacists see the opportunity in this moment to radicalize more followers, to normalize their talking points in the mainstream of political discourse, and to terrorize people who oppose them.

If we cannot address both of our very urgent problems—white supremacy and the tech/media ecosystem that accelerates it, spreads it globally, and elevates it—then we will all suffer. Many will die. If we don’t find a way to address both of these urgent problems, then the storming of the Capitol last week is just the beginning of what our world will be.

Jessie Daniels, PhD is an internationally recognized expert in internet expressions of racism. She is a Faculty Associate at the Harvard Berkman Klein Center, a Research Associate at the Oxford Internet Institute, University of Oxford, and a (Full) Professor at Hunter College and The Graduate Center, CUNY. Daniels was also a 2018–2019 Faculty Fellow at Data & Society.
