Moderation and Sense of Community in a Youth-Oriented Online Platform: Scratch’s Governance Strategy for Addressing Harmful Speech

andres lombana-bermudez
Berkman Klein Center Collection
Aug 15, 2017 · 8 min read

Online platforms and virtual worlds have become important spaces for youth development, socialization, and learning. Children and youth are growing up in a networked communication environment in which they are leveraging digital tools for expressing their creativity, seeking information, and building relationships. They are participating in “networked publics” of different sizes and themes, where they communicate with peers and mentors, share content they create, and engage in communal activities such as playing games and exchanging information about specific topics (Ito et al. 2010; Kafai & Fields 2013; boyd 2014; Jenkins, Ito and boyd 2015).

Although engaging in online platforms and networked publics presents opportunities for learning, identity development, and networking, doing so also poses certain risks. Parents and other adults have raised concerns about the presence of harmful speech in these digital spaces, particularly cyberbullying. An extension of bullying behaviors in offline spaces, cyberbullying consists of the use of digital tools to harm others, and it has driven much of the discourse around child safety in mainstream media (Palfrey and Gasser 2008; Schrock and boyd 2011; Livingstone et al. 2014; Hinduja & Patchin 2014).

Youth-oriented online platforms have approached harmful speech in different ways. While some heavily moderated platforms try to minimize risks by limiting opportunities for creative expression (e.g., pre-written chat messages as in Kart Kingdom and Club Penguin), other platforms take a “no holds barred” approach even if it results in mean and rude content (e.g., 4chan, MemeGenerator). In between these competing perspectives is Scratch, a youth-oriented nonprofit platform launched in 2007 by the Lifelong Kindergarten research group at the MIT Media Lab. The platform allows users to create and share interactive multimedia projects and to publish text-based messages (a form of asynchronous communication) across several spaces, such as project and studio comments and discussion forums.

Scratch Home Page [https://scratch.mit.edu]

Scratch is a successful example of how an online community can reduce the incidence of harmful speech and foster civil dialogue while scaling up and supporting youth’s agency and creative expression.* The platform has implemented a governance strategy that combines proactive and reactive moderation (through content curation and filtering) with the cultivation of socially beneficial norms and a sense of community. This hybrid strategy has allowed Scratch to address harmful speech successfully, decreasing its incidence and prevalence. In particular, it has allowed adult moderators and young community members to regulate uncivil behaviors such as spamming, harassment, and publishing mean, rude, inappropriate, or profane content.

Scratch Guiding Principles

Establishing clear, brief, and youth-friendly Community Guidelines has been key to cultivating a supportive and safe community. The guidelines lay out a set of core values, or guiding principles, that all members of the community share and follow. As one of the Scratch moderators explained to me during an interview for the Coding for All project — a collaboration between the MIT Media Lab, UC-Irvine’s DML Research Hub, and the Berkman Klein Center for Internet and Society — the Community Guidelines are easy for youth of all ages to “absorb” and to “own.”

Scratch Community Guidelines page

The guidelines encompass six short guiding principles: (1) be respectful; (2) be constructive; (3) share; (4) keep personal info private; (5) be honest; and (6) help keep the site friendly. All new users of the platform are encouraged to read the Community Guidelines when they join, and they receive automated messages that remind them of “commenting respectfully” when they start publishing their first comments and posts. If new users try to create a comment that uses language the system detects as unconstructive or inappropriate, they get an automated message that prevents them from posting; tells them that their comment “may be mean or disrespectful” and that they need to read the Community Guidelines; and includes a reminder to “be nice.” The guidelines are available across the platform, accessible through a link that appears at the bottom of all pages.
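Scratch’s filtering code is not public, so the short Python sketch below is only a schematic illustration of the pre-post check described above: a draft comment is matched against a list of designated words and phrases, and a match blocks the post and triggers a reminder to read the Community Guidelines. The word list, function name, and message text are assumptions made for illustration, not Scratch’s actual implementation.

```python
# Schematic sketch of a pre-post comment check (illustrative only).
# The blocked terms, function name, and messages are hypothetical.

BLOCKED_TERMS = ["stupid", "hate you", "shut up"]  # hypothetical designated words/phrases

def review_comment(text: str) -> dict:
    """Return whether a draft comment may be posted, plus a reminder if it may not."""
    lowered = text.lower()
    if any(term in lowered for term in BLOCKED_TERMS):
        return {
            "allowed": False,
            "message": ("Your comment may be mean or disrespectful. "
                        "Please read the Community Guidelines and remember to be nice."),
        }
    return {"allowed": True, "message": ""}

if __name__ == "__main__":
    print(review_comment("This project is stupid"))      # blocked, with a reminder
    print(review_comment("Nice work, I love the art!"))  # allowed
```

In practice, a simple word list misses misspellings, evasions, and context, which is one reason Scratch pairs automated filtering with human moderators and community reports, as described below.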

Being respectful, keeping the site friendly, and being constructive, in particular, are core values that directly address harmful speech. Any content that violates these values is taken down by a sophisticated moderation scheme that includes adult moderators, automated software filters, and young community members. According to the four Scratch moderators I interviewed, the most common instances of harmful speech on Scratch are spam comments, followed by mean and rude comments that are unconstructive and inappropriate (typically profanity or swearing). Although instances of hate speech and harassment are rare, when they do appear they are removed right away by the moderation system. As one of the moderators explained to me, the few cases of cyberbullying that have appeared on Scratch are among users who know each other in real life and carry conflict from their school into the online platform.

Scratch’s core values promote kindness and inclusion within a platform that is diverse in terms of users’ age, ethnicity, sexual orientation, gender identity, and religion. They empower youth to engage in civil dialogue and to actively and responsibly participate in building a safe space. As the members of the community “absorb” the guiding principles, they actively engage in the dissemination of the guidelines on their own. Exercising their agency, Scratchers, as the members of the community call themselves, have designed multimedia projects that explain the Community Guidelines in creative ways (there are hundreds of projects dedicated to explaining the guidelines).

Scratchers design and share multimedia projects to explain Community Guidelines in creative ways.

Tandem Moderation: Adult Moderators + Scratchers

In order to effectively manage growth and foster a sense of community, Scratch has deployed a governance and moderation system in which both adults and youth engage in regulating, monitoring, and enforcing the Community Guidelines while leveraging different sociotechnical tools, all against the backdrop of a high degree of transparency (all user-generated content is public). As the Scratch community manager noted during one of our interviews, “moderation is done in tandem with Scratchers.”

The Scratch Team has 16 adult moderators, including one community manager and one community coordinator, all of whom actively moderate. These adults have full-time or part-time paid jobs with Scratch and moderate all the discourse that is generated on the platform. They do this with the help of automated software filters that detect harmful speech using a list of designated words and phrases, as well as with the help of users who flag content that violates the Community Guidelines. When adult moderators review flagged content, they evaluate whether it violates the Community Guidelines. If it does, the content is considered harmful speech: it is removed from the platform and the moderators send a private alert to the user who created it. Moderators also ban users who repeatedly break the guidelines and, in rare cases, communicate with parents via email in order to restore an account that has been blocked.
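The sketch below summarizes, in schematic Python, the flag-and-review cycle described in this section: automated filters and community reports feed a review queue, a moderator judges each flagged item against the Community Guidelines, violations are removed and the author receives a private alert, and repeated violations lead to a ban. All class names, fields, and the strike threshold are hypothetical assumptions; Scratch’s internal tooling is not public.

```python
# Schematic sketch of a flag-and-review moderation workflow (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Report:
    content_id: str
    author: str
    text: str

@dataclass
class ModerationQueue:
    reports: list = field(default_factory=list)
    strikes: dict = field(default_factory=dict)  # author -> number of removed items
    ban_threshold: int = 3                       # hypothetical limit for repeat violations

    def flag(self, report: Report) -> None:
        """Community members or automated filters flag content for human review."""
        self.reports.append(report)

    def review(self, report: Report, violates_guidelines: bool) -> str:
        """A moderator decides whether flagged content violates the guidelines."""
        self.reports.remove(report)
        if not violates_guidelines:
            return "kept"
        # Remove the content, record a strike, and privately alert the author.
        self.strikes[report.author] = self.strikes.get(report.author, 0) + 1
        if self.strikes[report.author] >= self.ban_threshold:
            return "content removed; author banned after repeated violations"
        return "content removed; private alert sent to author"

queue = ModerationQueue()
r = Report("comment-123", "user42", "a mean comment")
queue.flag(r)
print(queue.review(r, violates_guidelines=True))
```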

Scratch Team & Friends by ceebee

Young members of the community contribute to the moderation scheme by flagging inappropriate content published on the platform. From project and studio comments to forum posts, all spaces where content is shared have buttons that Scratchers can use for reporting, and approximately 200–300 user reports are generated daily. Moreover, Scratchers also engage in moderation by using the Community Guidelines as tools for civil dialogue, referring to specific principles when commenting on peers’ projects and studios, and citing guidelines when posting in discussion forums.

Conclusion

Scratch is a successful example of how governance strategies can foster safe, positive, and diverse youth-oriented online platforms and reduce the incidence of harmful speech. The implementation of a hybrid strategy that combines active content curation and filtering with the cultivation of a sense of community has proven highly effective. Specifically, Scratch has succeeded along two key dimensions: 1) establishing a moderation scheme in which both adult moderators and community members actively monitor the platform with the help of automated software filters, and 2) supporting community engagement through the adoption and championing of clear core values. This combined approach has reduced the incidence of harmful speech while simultaneously supporting youth agency, freedom of expression, learning, and creativity.

*Today, 10 years after its launch, Scratch has grown to a user base of 17 million and is home to an enormous amount of text-based and multimedia content generated by children. The platform has been translated into over 40 languages, and its users come from all around the world and are mostly between 8 and 16 years old. According to the Scratch Stats page, during the month of February 2017 there were 822,667 new projects and 3,039,859 new comments. Since 2007, members of the community (17,777,432 registered users in total) have shared 21,594,894 projects and posted 110,195,916 comments.

This essay is part of the Berkman Klein “Perspectives on Harmful Speech Online” collection. To read the full collection visit cyber.harvard.edu/publications/2017/08/harmfulspeech

References

boyd, d. (2014). It’s Complicated: The Social Lives of Networked Teens. New Haven, CT: Yale University Press.

Hinduja, S., & Patchin, J. W. (2014, December 23). What is Cyberbullying? Retrieved from http://cyberbullying.org/what-is-cyberbullying

Ito, M., Baumer, S., Bittanti, M., boyd, d., Cody, R., Herr-Stephenson, B., Horst, H. A., Lange, P. G., Mahendran, D., Martinez, K. Z., Pascoe, C. J., Perkel, D., Robinson, L., Sims, C., & Tripp, L. (2010). Hanging Out, Messing Around, and Geeking Out. Cambridge, MA: The MIT Press.

Jenkins, H., Ito, M., & boyd, d. (2015). Participatory culture in a networked era: A conversation on youth, learning, commerce, and politics. Malden, MA: Polity Press.

Kafai, Y. B., & Fields, D. A. (2013). Connected Play: Tweens in a Virtual World. Cambridge, MA: The MIT Press.

Livingstone, S., Mascheroni, G., Ólafsson, K., & Haddon, L. (2014). Children’s online risks and opportunities: Comparative findings from EU Kids Online and Net Children Go Mobile. London, UK: EU Kids Online, LSE.

Palfrey, J., & Gasser, U. (2008). Born digital: Understanding the first generation of digital natives. New York, NY: Basic Books.

Schrock, A., & boyd, d. (2011). Problematic youth interaction online: Solicitation, harassment, and cyberbullying. In K. B. Wright & L. M. Webb (Eds.), Computer-Mediated Communication in Personal Relationships. New York, NY: Peter Lang.

