Big Tech’s Extremism Problem

Rioters breaching the US Capitol Building on January 6th

On January 6th the world watched in horror as a group of extremists stormed the US Capitol Building, breaching it for the first time in over two centuries.

For weeks beforehand there were signs online of discontent over Trump losing the election (or not losing it, as many of them believe) and of planning for this event to “Stop the Steal.”

As Floridi discusses, the re-ontologizing of the information society has caused a “substantial erosion of the right to ignore” and “claim ignorance when confronted by easily predictable events.” And yet in the face of this obvious threat Big Tech companies and the US government did the same: nothing.

As Floridi further mentions, soon the “distinction between online and offline will become blurred and then disappear.” This wasn’t the first instance of online extremism spilling over into real life and, despite Twitter, Facebook, and other companies banning Trump and thousands of other accounts and pages linked to violence, it won’t be the last.

All of the large social media companies have willingly housed increasingly extremist talk over the last several years. Ideas like those of QAnon, a crazy far-right conspiracy theory, were born and have since flourished on online chat boards, Twitter feeds, and Facebook groups. The ability to espouse such views on popular sites under the guise of “free speech” normalizes and legitimizes these views to their believers. Today, thousands of Americans believe at least some of the tenets of QAnon, and many of the Capitol rioters could be seen waving ‘Q’ flags.

A QAnon flag at the US Capitol riot

At some point, tech companies are going to have to take more drastic action than just removing tweets, pages, or accounts spewing hate and violence.

The algorithms tech companies use to serve content are designed to keep the user’s attention by provoking strong emotions. This means someone who sees a post with one extreme view and lingers on it will only be served more posts on the same topic, leading people down the online rabbit hole. Similar ad-targeting algorithms mean that someone in a “Stop the Steal” Facebook group filled with election misinformation may be served an ad for combat gear (1).
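To make the dynamic concrete, here is a hypothetical sketch of an engagement-optimized feed ranker. This is not any platform’s actual code; the function names, data shapes, and weights are all invented for illustration.

```python
# Hypothetical sketch of an engagement-optimized feed ranker.
# Names and weights are invented; no real platform's code is shown.

def rank_feed(posts, user_dwell_time):
    """Rank posts by predicted engagement.

    posts: list of dicts with 'topic' and 'base_engagement' keys
    user_dwell_time: dict mapping topic -> seconds the user lingered
    """
    def score(post):
        # Topics the user lingered on get boosted, so one extreme
        # post the user pauses on pulls more of the same topic
        # to the top of the feed.
        lingering_boost = 1 + user_dwell_time.get(post["topic"], 0) / 10
        return post["base_engagement"] * lingering_boost

    return sorted(posts, key=score, reverse=True)

posts = [
    {"topic": "gardening", "base_engagement": 0.6},
    {"topic": "election fraud", "base_engagement": 0.5},
]
# A user who lingered 30 seconds on election-fraud content now sees
# that topic ranked first despite its lower base engagement.
ranked = rank_feed(posts, {"election fraud": 30})
```

The feedback loop is the point: dwell time feeds back into ranking, so attention paid to extreme content produces more of it.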

Big Tech needs to stop tiptoeing around the fact that extremist views are not just harbored on their platforms but inflamed by them. It’s time for these companies to fundamentally rethink their purpose and place in society.

  1. Mac, Ryan. “Facebook Is Showing Military Gear Ads Next To Insurrection Posts.” BuzzFeed News, 20 Jan. 2021, www.buzzfeednews.com/article/ryanmac/facebook-profits-military-gear-ads-capitol-riot.
