Shagun Jhaver
Aug 9 · 4 min read

This blog post summarizes a paper on understanding fairness in content moderation from the perspective of end users, which will be presented at the 22nd ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW) in Austin, Texas. The paper received a Best Paper Honorable Mention Award at CSCW.

“I feel sad that my effort in making that post was for nothing, and that no one will see it and no one will reply with any help or advice.” — P254

How do users feel when their content is removed from online communities? Does it deter them from posting again? Does it change their attitude about the community? Individuals have a range of motivations for posting, and this shapes their reactions to content removal. In some cases (like P254 above), a user might really need advice. In others, a user might annoy the moderators on purpose, intending to provoke a removal. How does the level of effort made in creating content affect the way users perceive its removal, and does receiving an explanation of why content was removed matter?

To answer such questions, we conducted a survey of 907 Reddit users who had experienced content removals. We used a novel approach to automatically contact potential participants shortly after their post was removed on Reddit, so that they could easily recall the circumstances surrounding the removal. Through this survey, we asked users how fair they perceived the post removal to be and whether they would post again in the community. Our questions captured users’ awareness and impressions of different features of the Reddit moderation system, such as subreddit rules and removal explanations. We also included open-ended feedback questions in the survey to understand the relative frequency of our participants’ key satisfactions and frustrations with the Reddit moderation system.

Figure: Frequency of participants’ responses to various survey questions, measured in percentages.

Here are a few of our key findings:

  1. 41.8% of our respondents (n=379) reported that they did not notice their post had been removed until they received our invitation to participate in the survey. Participants who had noticed the removal were more likely to consider it fair.
  2. When a subreddit listed its rules in the sidebar, moderated users were more likely to perceive the removal as fair. Surprisingly, participants who had actually read the rules were less likely to consider the removal fair. Further, the clarity of the rules had a positive association with user attitudes.
  3. Participants who received a message from the moderation team explaining why their post was removed were more likely to consider the removal fair. Moreover, whether a human moderator or an automated tool provided the explanation did not seem to matter to participants.
  4. Many participants whose politically charged submissions were removed without notification created their own folk theories about why the removal occurred. Some users felt that moderators on the subreddit they posted to were politically biased. Others worried that influential online communities often promote a particular worldview and that all the “dissenting voices” are removed.
  5. About a fifth of all participants accepted their removals as appropriate. These participants included users who realized their mistakes and showed an inclination to improve in the future, as well as users who just needed to vent on Reddit and were not bothered by the removals.

Building upon our findings, we contribute several theoretical insights as well as practical suggestions for how moderators can motivate users to become productive community members. First, we draw attention to the attributes of community guidelines that matter to end users: their size (i.e., the number of rules), their subjectivity, the reasons why each rule exists, and the effort needed to comply with them. Second, we highlight that although users’ folk theories about how content removals occur may be inaccurate, these theories influence how users make sense of content moderation and how they behave on the site. Therefore, best practices for moderation systems must account for how users’ folk theories about these systems shape their behavior. Third, we recommend that community managers carefully attend to the design of the mechanisms that explain content removals to moderated users, scrutinizing what and how much information those mechanisms reveal.

Ultimately, moderated users include many individuals who have made a deliberate effort to contribute to the community. Therefore, nurturing these users and attending to their needs can be an effective way to sustain and improve the health of online spaces.

For more details about our survey design, findings, and research implications, please check out our full paper, which will be published in the Proceedings of the ACM on Human-Computer Interaction (CSCW) 2019. For questions and comments about the work, please email Shagun Jhaver at sjhaver3 [at] gatech [dot] edu.

Citation:

Shagun Jhaver, Darren Scott Appling, Eric Gilbert, and Amy Bruckman. 2019. “Did You Suspect the Post Would be Removed?”: Understanding User Reactions to Content Removals on Reddit. In Proceedings of the ACM on Human-Computer Interaction, Vol. 3, CSCW, Article 192 (November 2019). ACM, New York, NY. 33 pages. https://doi.org/10.1145/3359294

ACM CSCW

Research from the ACM conference on computer-supported cooperative work and social computing
