Building a Safe Digital Space for Young Makers and Learners: The Case of DIY

DIY is an online platform where youth (ages 6–16) can learn, discover, and share a wide range of skills. From baking to computer programming to astronomy and athletics, on DIY, youth can deeply explore their personal interests and hobbies. The platform provides a digital space where young people can engage in multiple project-based activities, complete learning challenges, and interact with peers by commenting on and liking the projects they share.

Combining features of social media platforms — such as user profiles, followers, favorites, and hashtags — with the mechanics of online learning sites, such as challenges, badges, pathways, and portfolios, DIY provides a safe environment where youth learn a rich variety of skills, build digital portfolios, and connect with like-minded peers. As youth participate on the platform, they become part of a vibrant community of makers. As Kelsey Holtaway, one of the moderators, explained to me, for many users the platform represents one of their first forays into learning how to interact with others online:

“For a lot of kids, this is their first step into social media. Learning from the get-go that there are humans on the other side of the screen leads to the kids learning about digital citizenship and being kind to each other.”


In this case study, which is part of a larger effort analyzing three different youth-oriented online learning platforms (Connected Camps, DIY, and Scratch), we showcase how DIY has been exemplary in revealing how safe spaces are created and sustained, and we describe the platform’s governance structure and moderation scheme. The case study is based on our own observations of the platform, as well as semi-structured interviews conducted via telephone and email with three adult moderators (including the community manager) between October 2016 and May 2017.

The case study is part of the Coding for All project — a collaboration between the Youth and Media project at the Berkman Klein Center for Internet & Society at Harvard University, the MIT Media Lab, and the Digital Media and Learning Hub at the University of California, Irvine, with the support of the National Science Foundation.

A Youth-Oriented Platform and Online Community

Founded in 2012 by four American young adult entrepreneurs with experience in Internet video platforms, filmmaking, and youth education (Zach Klein, Isaiah Saxon, Daren Rabinovitch, Andrew Sliwinski), DIY has rapidly become home to a thriving community of youth passionate about making, learning, and discovering. According to Community Manager Becky Margraf, the platform currently has more than half a million users (40% boys and 60% girls). Members of the community are mainly from the U.S. (approximately 75%), Australia, Europe, and Japan, and primarily communicate in English.

Today, DIY hosts more than 1 million projects and offers 130 different skill patches organized into 12 categories, such as science, design, hacking, business, and athletics. These patches are visual representations of the skills youth gain by completing specific project-based activities (termed “challenges” on DIY) carefully curated and designed by DIY staff and other experts, including people from NASA, Mojang, and the LEGO Future Lab. For example, the “Sensor Hacker” patch encompasses more than 10 challenges, including “make a robot follow a line,” “construct a pressure sensor,” and “build a musical theremin.” To successfully complete a challenge, youth publish posts that include visual documentation of their projects. All members can view, comment on, and like the posts, and a dedicated team of adult moderators assesses the projects to determine whether the challenge has been completed. After successfully finishing three challenges, youth can earn a digital skill patch that is then added to their portfolio.
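The challenge-to-patch progression described above can be sketched as a minimal data model. This is purely illustrative — the class and function names are hypothetical, and DIY’s actual implementation is not public; the only details taken from the case study are the example challenges and the three-challenge threshold.

```python
# Hypothetical sketch of DIY's challenge/patch progression as described in
# the case study: moderators approve posted projects as evidence of challenge
# completion, and three approved challenges earn a skill patch.

CHALLENGES_PER_PATCH = 3  # threshold stated in the case study


class SkillPatch:
    def __init__(self, name, challenges):
        self.name = name                   # e.g., "Sensor Hacker"
        self.challenges = set(challenges)  # challenges that count toward this patch


def earned_patch(patch, approved_challenges):
    """A patch is earned once moderators have approved projects for at
    least three of the patch's challenges."""
    completed = patch.challenges & set(approved_challenges)
    return len(completed) >= CHALLENGES_PER_PATCH


sensor_hacker = SkillPatch("Sensor Hacker", [
    "make a robot follow a line",
    "construct a pressure sensor",
    "build a musical theremin",
    "hack a toy",  # hypothetical fourth challenge
])

print(earned_patch(sensor_hacker, ["construct a pressure sensor"]))  # False
print(earned_patch(sensor_hacker, [
    "make a robot follow a line",
    "construct a pressure sensor",
    "build a musical theremin",
]))  # True
```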

Governance Approaches for Fostering a Safe, Positive, and Inclusive Space

Even with its massive growth — both in the number of active members and the amount of user-generated content being shared — DIY has been able to create and maintain a safe and positive environment where the risks of encountering inappropriate behavior and harmful speech (e.g., cyberbullying) have been minimized. In order to mitigate and address this type of conduct, the platform has combined several governance approaches for establishing and enforcing a set of shared principles and norms, and for regulating and shaping the evolution of the community. Adult moderators oversee all the content published on the platform with the help of automated word filters, while active community members flag objectionable content. Additionally, parents and teachers, who have read-only accounts linked to those of their children and students, respectively, can report inappropriate comments and posts, and remove project comments made by their children/students.

Moreover, DIY has implemented a governance approach of proactive modeling in order to foster positive behavior among members of the community. Adult moderators are highly visible on the platform and actively model prosocial behavior while assuming public leadership and educational roles. By doing so, they are able to publicly perform examples of best practices and positive modes of conduct. DIY’s terms of service, policies (market, copyright, privacy), and community guidelines critically complement these governance approaches. Users accept and follow these policies and guidelines as they participate on the platform, while adult moderators enforce and promote them.

“Being in Tune with the Vibe:” Following the Community Guidelines

Although DIY’s Terms of Service and policies are written in legal jargon and are primarily reviewed only by parents and teachers, the company has also created a set of kid-friendly community guidelines. These guidelines, which represent the core values of the platform, are concise, encouraging, and easy to understand. They are the “basic set of principles” that youth are encouraged to follow in order to contribute to the platform and be in tune with the “vibe” of the community. According to the Community Manager, key elements of the guidelines include “being responsible, being supportive and contributing positively.”

In total, there are 15 guidelines organized into four categories: 1) “how to be awesome,” 2) “posting,” 3) “social,” and 4) “privacy.” Each guideline is introduced with a short phrase and includes a brief explanation and a visual image (e.g., an icon, photograph, or animated GIF). For example, the guideline “don’t be afraid to try” is represented by an animated GIF of three kids sliding down a mountain while another kid and a dog try to catch them. Below the image and the phrase, the principle is explained with the following text: “Don’t be afraid to be weird or try something new and fail. Go outside of your comfort zone and do something you’ve never done before.”

The guidelines highlight values that are central to the culture of DIY. For example, “don’t be afraid to try” and “don’t be lazy” encourage participation and experimentation. “Don’t be a troll” and “don’t spam” help establish civility, while “don’t share passwords” protects privacy. Affirmative principles such as “share what you know,” “show how you did it,” and “give feedback” encourage collaboration, peer learning, participation, and openness. “Be original” and “try new things” emphasize creativity, experimentation, and respect for the copyright of others’ work. “Respect privacy” and “keep personal stuff private” reinforce privacy. And “flag inappropriate comments” and “report troublemakers” invite users to monitor the activities of the community and participate in the moderation process.

Despite the numerous community guidelines, the three listed in the section “how to be awesome” (i.e., try new things, share what you know, don’t be a jerk) seem to be the ones most heavily relied upon for moderating and resolving conflicts. Kelsey Holtaway, one of the moderators I interviewed, asserted that these three guidelines act as a “north star.” She elaborated: “When things go awry, sometimes there’s an influx of posts that can cause controversy. If we come back to the three guidelines, which focus on supporting kind and strong communities of creators, it can act as a north star.”

These three basic principles, with their kid-friendly language and emphasis on “awesomeness,” are also the ones that community members have most readily appropriated, using them in their own interactions and comments. As Becky explained,

“‘Don’t be a jerk’ is kind of our saying. We quote that. Kids quote that all the time. So being a troll is never cool.”

The guidelines are accessible to youth and highly visible on the site. As such, Becky noted, “Most users are familiar with the guidelines.” Kerri Engelder, a moderator, said that long-time members especially play a vital role in upholding the guidelines: “Many long-time DIY users also take the initiative to guide new members of the community by explaining how to use DIY and reminding them to follow our guidelines (like being nice and sharing original work).”

A Hybrid Moderation Scheme

DIY’s governance approach relies on a hybrid moderation scheme that enforces community guidelines, curates and filters content, and minimizes the risks of inappropriate conduct and harmful speech. Through this framework, a team of five adult moderators, including the community manager, oversees all comments and posts shared on the platform. Moderators use technical tools such as automated word filters for detecting unacceptable speech, private internal chat software, and a flag system that collects reports submitted by users. The moderation scheme also includes the participation of active community members (including parents and teachers with read-only accounts), who flag objectionable content, and interns, who help with some of the tasks typically performed by adult moderators.

Moderating a youth-oriented platform is labor intensive. DIY’s adult moderators work full time, are paid, and perform multiple tasks and roles. For example, each moderator is tasked with monitoring and curating all the posts and comments published on the site; enforcing community guidelines by taking down content; blocking users who repeatedly break the rules; and helping to resolve conflicts among users. Moderators are also responsible for assessing the projects posted by community members and approving them as evidence of challenge completion. Additionally, they assume public leadership and educational roles by organizing community events and contests, updating skill content, and modeling positive behavior in the community. For instance, in the public space of DIYers’ project comments, adult moderators constantly provide positive feedback and ask questions that promote constructive conversations and knowledge exchange.

Content Curation and Filtering

In the domain of content curation and filtering, moderators monitor and address uncivil behavior on the platform. From off-topic images and videos to mean comments, and from privacy to mental health concerns, moderators remove content that violates the community guidelines. Some of this content, such as mean, rude, and/or inappropriate language, is caught by automated filters. Other content is addressed by moderators, who proactively monitor all published information, or by community members, who can flag content through a “report” button. According to Becky, the Community Manager, “Everything gets taken down fairly quickly. If a kid comes across something objectionable, they know to report it and a member of our team will remove it, often within minutes.”
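The hybrid pipeline described above — an automated word filter backstopped by human review of community flags — might look something like the following minimal sketch. All names, signatures, and the word list are hypothetical illustrations; DIY’s actual filter and report system are not public.

```python
# Illustrative sketch of a hybrid moderation pipeline: an automated word
# filter withholds known unacceptable language before publication, while
# anything it misses can be flagged by community members (or by parents and
# teachers with read-only accounts) for moderator review.

BLOCKED_WORDS = {"jerkface", "stupid"}  # placeholder filter list


def automated_filter(comment: str) -> bool:
    """Return True if the comment trips the word filter and should be
    withheld before it is ever published."""
    words = comment.lower().split()
    return any(word.strip(".,!?") in BLOCKED_WORDS for word in words)


moderation_queue = []  # reports awaiting human review


def report(comment_id: int, reporter: str, reason: str) -> None:
    """Community members flag content via a 'report' button; moderators
    review the queue, investigate context, and take content down."""
    moderation_queue.append(
        {"comment": comment_id, "by": reporter, "reason": reason}
    )


print(automated_filter("You are a jerkface!"))                 # True: withheld
print(automated_filter("Great build, love the sensor wiring"))  # False: published
report(42, "user123", "off-topic video")
print(len(moderation_queue))                                   # 1
```

The design mirrors the division of labor in the case study: the filter acts instantly and automatically, while the queue preserves human judgment for everything that needs context.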

Parents and educators with DIY accounts also play an important role in moderation. Their accounts are linked to one or several users and come with a unique dashboard feature for monitoring all the content associated with the linked users. Although these are read-only accounts, they can be used both for reporting objectionable content and for removing content published by the linked users. These accounts also grant permission to remove comments made by other users on their kids’ and students’ projects.

Banning and Suspending

Suspending posting privileges and banning users, however, are tasks that only adult moderators can execute. When users persistently act as “jerks and trolls,” “say something vicious,” “post bullying comments,” and repeatedly break the community guidelines, they lose the right to post comments and projects. In such cases, moderators require that the user contact them via email and, in some instances, will also try to get in touch with the user’s parents. If, after recovering posting privileges, the user continues to misbehave, moderators can permanently block the user and delete all the accounts associated with that user.

According to the three moderators that I interviewed, permanent bans are rare and only happen when users who have been suspended from posting return to the community and continue to break the guidelines. In the majority of instances, community members learn quickly and act in a positive manner when they are permitted back on the platform. Becky explained: “They very rarely come back with a vengeance and like try to make lots of accounts or get very aggressive. Once they realize and they know they have been banned, many of them take that very seriously. They want to come back and we see a huge turnaround in their behavior.”

Steering the Community: Modeling Behavior, Cultivating Norms, Supporting Learning

One of the major tasks of moderators is to assume public leadership within the community and ensure that the community adheres to the site’s core principles and values. The five moderators are highly visible actors on the platform and active participants in this public space. Like other DIYers, moderators post their own projects and comments, and complete challenges to gain skills. They also organize contests, review and approve projects, and help connect DIYers who have similar interests. By offering constructive feedback, recommending projects, and suggesting pathways that DIYers can take after completing a challenge, moderators are key actors in supporting learning.

Among these responsibilities, modeling positive community behavior may represent the most important moderator task on DIY. As Kerri explained, “I think moderators help by setting the example, and kids tend to model their posts after ours.” Those posts and comments become examples of “how to lift up others and give feedback in constructive, non-judgemental ways.” Moderators also help to exemplify the community guidelines by, for instance, engaging in creative learning themselves. According to Becky:

“We want them to be curious. We want them to be constantly making and we want them to be exploring the world around them. So we try to do that. We try to take our medicine just to show that even like as an uncool adult, we can still embody these things.”

Expanding Opportunities for Youth Participation

In addition to DIY moderators, youth play an integral role in the moderation scheme and participate in platform governance. They flag projects and comments that violate the community guidelines and can contact moderators privately via email when they have concerns about other members’ posts. Moderators review emails and flagged content, investigate the context, and take down comments and projects that violate the community guidelines. As Becky pointed out:

“We fully investigate the context. We look at what is going on. Usually it’s very clear if the content is appropriate or not or if it’s breaking a rule. If it’s not, we follow-up with the member who submitted the report and try to get more information about that.”

Given the ease of submitting a report through the click of a button, users frequently submit accidental reports. As such, DIY is currently working on developing reporting tools that will act as “another confirmation step to help cut back on accidental reports” and “allow kids to give more context to what is going on.”

DIY members can also participate in moderation tasks related to organizing content and helping to identify projects that are relevant to the community. Using hashtags (#) in project titles and comments, youth help to appropriately tag published content and identify relevant themes and skills. Moreover, as they post comments, they can also tag moderators, signaling projects that could be featured on the front page. As Kelsey explained: “They create and use hashtags so they can sort through contests or community projects, and they tag moderators on projects they think should be featured on the site’s main page.”

Any member of the community also has the opportunity to participate in a DIY internship program, as long as a parent gives explicit permission. DIY has designed two main types of online internships to incorporate community members into platform moderation and governance: focused internships and group summer internships. The focused internships allow a DIYer to engage in a one-on-one mentorship with a staff member and focus on a self-guided project that matches both youth and platform interests. According to Becky, “These projects have ranged from design (learning to wireframe and design new feature ideas for DIY’s app), and engineering (learning to work with DIY’s API and building code to be deployed on the actual platform), to marketing, and social media.”

In contrast, the group summer internship brings together a small group of DIYers on a moderated chat channel (Slack) to engage in a range of tasks, such as: (1) planning new site events (e.g., contests, daily challenges), (2) giving feedback on new platform features, and (3) building their skills as community members (e.g., welcoming new members to the platform, giving other youth constructive or positive feedback about their projects, sharing their thoughts and ideas about community guidelines). Furthermore, Becky noted,

“We also gave these kids opportunities to deep-dive into their personal interests, so many of the summer interns got chances to do things like design visuals for site events, write copy for events, add their ideas to our skill curriculum.”


Our case study of DIY reveals how a youth-oriented online platform and community has built and sustained a safe digital space where youth (ages 6–16) can pursue interest-powered learning and engage in a “participatory culture” (Jenkins et al. 2006; Jenkins, Ito, and boyd 2016). In this kind of culture, members feel that their contributions matter, support each other while creating and sharing, and experience some degree of social connection with one another. On DIY, youth publish visual documentation of their hands-on projects, complete challenges, and give and receive constructive feedback in an environment where the risks of encountering inappropriate behavior and harmful speech have been minimized. As DIYers participate in the community, they are able to explore and gain a wide range of skills, connect with peers, and develop learning identities that embrace the ethos of the Do-It-Yourself and maker movements.

Ensuring safety has played a critical role in sustaining an environment that is inclusive of youth from different backgrounds (e.g., developmental stages, varying levels of expertise, and different interests). As Becky explained, “the platform is originally structured to be very flexible for all kinds of learners, regardless of the context, age, or level of ability.” She continued:

“We want kids especially young ones, especially shy ones, who are still trying to be comfortable . . . we want to create a place where they feel very willing to share and feel very comfortable putting themselves out there, especially if they don’t feel that way in real life.”

Establishing a governance approach in which both adults and youth can participate has been essential for creating and maintaining a safe space. Adult moderators (DIY staff), adult users (parents and teachers), and young community members perform complementary tasks. While adult moderators enforce community guidelines, curate content, model behavior, and review projects, young community members help identify relevant projects, and flag and report inappropriate content. Parents and teachers, despite having restricted accounts, participate by curating the content of the young users who are linked to them, and flagging objectionable material. The community guidelines — the platform’s set of core values — complement and promote this hybrid moderation scheme. The guidelines are stated in kid-friendly language and are used by both adults and youth to steer the community towards civility and growth.

Finally, by building a digital space that is safe, diverse, and inclusive, DIY has been able to support interest-powered learning across a wide range of applied skills and vocational roles, while emphasizing the importance of communicating with others in a positive and constructive manner. Thus, as youth pursue their passions and discover new interests on DIY, they also learn about civil dialogue. This capacity for positive online behavior is essential for learning and participating in a digitally connected world.


Berkman Klein Center Collection

Insights from the Berkman Klein community about how technology affects our lives (Opinions expressed reflect the beliefs of individual authors and not the Berkman Klein Center as an institution.)

Written by andres lombana-bermudez

designer/researcher/bricoleur | assistant professor of communication PUJ | associate researcher ISUR | faculty associate BKC for internet and society
