Reddit’s “Quarantine” Strategy Is Ineffective at Confining Racism

Reddit, in an ongoing attempt to clean up its act, has begun quarantining communities, or “subreddits,” that share content the company considers “extremely offensive or upsetting to the average Redditor.” The quarantine strategy is intended to strike a balance between maintaining the site’s commitment to free speech and preventing people from accidentally viewing content that would probably upset them. Though the company has not explicitly said so, it seems apparent that the move is also intended to protect brands that advertise on Reddit from being associated with the more objectionable parts of the site. Two weeks into this new regime, it is already clear that quarantine is ineffective at keeping racism contained, as many predicted it would be. Furthermore, there is no reason to expect this problem to improve without further changes to Reddit’s content policies.

Early in Reddit’s history, it seems that the company did take a proactive approach against hate speech; Reddit co-founder and current CEO Steve Huffman would delete racist posts and comments himself. However, as the site grew, it became untenable for Reddit’s small team to handle all of the problematic comments and posts, and the company defaulted, out of necessity, to a hands-off approach to content. In this laissez-faire period of Reddit’s history, all manner of harmful content flourished on the site, leading to many of the problems the site faces today.

For example, because of Reddit’s lax rules and large, predominantly white user base, white nationalists view the site as an opportunity to extend the reach of their ideology. Their approach follows a clear formula that was even, at one point, codified in an article on the Daily Stormer. First, a user will make statements that elicit racial resentment. The Daily Stormer article specifically mentions doing this in subreddits like /r/conspiracy, but one can clearly see it in a variety of general-interest subreddits, especially defaults such as /r/news and /r/AskReddit, with its almost-weekly “What’s your most offensive opinion?” threads that invariably invite hateful responses.

[Image: Reactions to racism in /r/news.]

From there, users can be further “red-pilled” in one or more of the site’s many echo chambers, where dehumanization of the Other takes place. These subreddits feature prominent stereotyping, use of slurs, anecdotes about the targeted group behaving badly, and images of violence being committed against the targeted group. These well-worn propaganda tactics desensitize viewers to the reality of violence being committed against real human beings. The end result is radicalization: users in these subreddits routinely pop up with posts asking when “we” will finally “do something” about “them.” Despite moderators’ claims of nonviolent intentions, we know that many of these users do in fact support violent “solutions” to the “problem” posed by the Other. (And make no mistake; white nationalism is inherently violent. It would not be possible, at this point in history, to achieve racially homogeneous societies without violence.) This is not valuable discussion, unless you think that the human and civil rights of people of color should be subject to debate.

There is no reason to believe quarantine will have any effect on the specific threat described above, or even improve the experience of Reddit for those who don’t want to encounter racism when they visit the site. Though the examples of radicalization I list above are from the now-defunct /r/CoonTown, one must note that these users are still on the site and planning to re-form, and nothing in the current rules can really prevent this. There are also many other echo chambers where racists can deepen their rage. Moreover, many of these echo chambers, such as the avowedly white nationalist /r/WhiteRights, remain unquarantined. How can supporters of this policy be so sanguine about the possibility of “provid[ing] these communities a space to congregate without supporting or contributing to the perpetuation of their ideas” when the criteria for quarantine remain unclear, and when purveyors of violent ideologies can continue with their business as usual on the site? (And anyway, how is setting aside space for racism on one’s site not supporting it?) Without a clear policy banning speech and forums devoted to hating others based on inherent personal characteristics, racism will continue to be a glaring problem on Reddit.

As a strategy, quarantine seems more appropriate to content that people are eager to hide their interest in, like pornography. As I have argued elsewhere, hate speech is not amenable to this approach for a variety of reasons. Though there are many people who do seek just a bit of privacy to voice the ignorant thoughts they’d be shamed for in public, there are others — white nationalists, reactionary conservatives, and other racist activists — who are earnestly trying to persuade these casual racists to join a cause, and who will not confine their ideas to their designated spaces. I suspect, given that Reddit CEO Steve Huffman continues to frame hate speech as something merely “offensive” or “obscene,” that the company doesn’t really understand the problem they are trying to solve.

If the only danger of exposure to hate speech were offense, none of this would really be a problem. However, hate speech is not merely “obscene” or “offensive” (though it is often both of those things); it’s dangerous and serves no purpose other than to make real people’s real lives worse. Even viewing racist content passively, without malicious intentions, does real-world harm. We know from decades of research that repeated exposure to stereotypes makes people more prone to racial bias even when they aren’t conscious of it, and this has serious, sometimes deadly, consequences for people of color. You can see this quite clearly in our current policing crisis; the United States has a growing epidemic of police violence occurring in a context of declining crime. Violent crime rates among African-Americans have declined by more than 50 percent in the past twenty years, but increasing numbers of black people — frequently unarmed, sometimes with no connection to a crime at all — are being beaten or murdered by police who rarely face criminal charges for these abuses. In this context, it is socially irresponsible to allow a forum like /r/BlackCrimeMatters, which bombards viewers with images of black criminals in order to undermine the idea that black people shouldn’t be shot with impunity by police officers. (This subreddit, by the way? Also unquarantined.) If a society intends to protect all of its citizens from violence and untimely death, there is no good reason to give quarter to speech suggesting that some people’s human and civil rights should be abridged because of inherent personal characteristics like race, gender, or sexuality.

All this is to say that Reddit should not expect any brownie points for turning up its nose at the hate speech it continues to allow on its platform. Its proposed solution to its hate problem isn’t working, in large part because Reddit still hasn’t articulated a clear, site-wide policy regarding hate speech or even set consistent rules for what makes a subreddit subject to quarantine. Until it takes serious steps to address these problems (and myriad others), Reddit will continue to deserve the bad reputation it has in many corners.

--

Casey Stevens

When I get all steamed up, hear me shout. Currently on a tear about hate speech.