The Importance of Social Media Moderation: Interview with iKeepSafe.Org President Marsali Hancock

This interview is part of the After School Social Change with Technology Interview Series. This article features President and CEO Marsali Hancock. Read Part 1 of the Series with Marsali Hancock here.

Social networks play a large role in the lives of many people, perhaps none more so than today's teenagers. Children ages 8 to 18 spend an average of 44.5 hours per week in front of screens. "For teens, this (social media) is how they connect emotionally," says Marsali Hancock, President and CEO of iKeepSafe, an international nonprofit organization dedicated to tracking the effect internet-connected devices have on children.

Each network is created for a specific audience and purpose, but as with any product or service, there are clear boundaries of what is accepted as normal use, and what is classified as misuse.

"It's on every platform (misuse)…but if we can build in moderation and use the terms of agreement to make it easy and empowering to report and when people report it's helpful — it makes a big difference," says Marsali. These terms of service, community guidelines, and proactive moderation are necessary support features of a social network. There have to be guidelines in place and support staff who can ensure community members are protected and misuse is prevented. Marsali says, "If the social network takes the stance of making a place where people feel connected and supported, then having monitoring on the backend does two things: identifies people who are victimized, and identifies the people who are using the network for radicalization."

The After School app uses three levels of moderation on top of its community guidelines and support resources. The first level is automatic detection, which uses keyword and filtering technology to catch the vast majority of posts that violate the terms of service before they can be seen by others. The second level is manual: every post that passes the first level is viewed by a human to confirm it meets posting requirements and abides by community guidelines. The third level allows each member of a community to report any post they deem inappropriate. A single report removes the post from the feed, giving power and responsibility to each user.
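To make the three levels concrete, here is a minimal sketch in Python of how such a pipeline could be structured. This is purely illustrative: After School's actual system is proprietary, and every name here (`BANNED_KEYWORDS`, `auto_filter`, `Feed`, and so on) is hypothetical.

```python
# Illustrative sketch of a three-level moderation pipeline.
# All names and the keyword list are hypothetical placeholders.

BANNED_KEYWORDS = {"bullyword", "slur"}  # placeholder keyword list


def auto_filter(text):
    """Level 1: automatic keyword filtering, run before a post is visible."""
    words = set(text.lower().split())
    return words.isdisjoint(BANNED_KEYWORDS)  # True means the post passes


def human_review(text):
    """Level 2: a human reviews every post that passes level 1.
    Stubbed here to always approve."""
    return True


class Feed:
    def __init__(self):
        self.posts = []

    def submit(self, text):
        # Levels 1 and 2 both run before the post appears in the feed.
        if auto_filter(text) and human_review(text):
            self.posts.append(text)
            return True
        return False

    def report(self, text):
        """Level 3: a single user report removes the post from the feed."""
        if text in self.posts:
            self.posts.remove(text)


feed = Feed()
feed.submit("hello everyone")       # passes levels 1 and 2, appears in feed
feed.submit("some bullyword here")  # blocked at level 1, never visible
feed.report("hello everyone")       # one report removes the post
```

The key design point the article describes is ordering: the automated and human checks run before publication, while user reports act on already-published posts.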

"When your user guidelines and terms of service are clear, your community often self-moderates and self-reports because they want to be there and look out for each other," says Marsali. After School strives to provide the safest social network available and will continue to take a proactive and firm stance against cyberbullying and misuse of the app.

Also Read: Moderation and Cyberbullying Protection on the After School App

About iKeepSafe

iKeepSafe believes that there are six pillars of success for online digital citizenship: Balance, Ethics, Privacy, Reputation, Relationships, and Online Security. Among its resources, iKeepSafe provides guides and information on how parents can prevent and detect potential risks, including its BEaPRO™ Parent program. For more information on how to help influence the positive and safe use of technology, visit iKeepSafe.org.
