Gaming companies must join the fight to keep kids safe online

ChildFund International
Oct 24, 2022

By Kirsten Mettler

Since the early days of Pong, Nintendo and Atari, video games have been a ubiquitous part of childhood for millions of kids. Gaming can be a fun, social activity and has been shown to help young people develop cognitive skills, build problem-solving abilities and improve emotional health. That matters: in the U.S., one in five gamers is under the age of 18. However, there is also a dark side. Video games can give perpetrators an easy avenue to commit online sexual exploitation and abuse of children (OSEAC), making the online gaming world a potentially dangerous place for kids.

In 2021, the National Center for Missing and Exploited Children (NCMEC) received more than 29.3 million global reports of suspected OSEAC to its CyberTipline. OSEAC refers to all sexually exploitative and abusive acts committed against a child that involve the internet at any point. This includes the online production and distribution of visuals depicting child sexual abuse, livestreams of abuse, grooming and sextortion. OSEAC can drastically impact children’s development and well-being, harming their physical and mental health, education and social outcomes.

COVID-19 only exacerbated this problem. Stuck in isolation, children looked for remote ways to play and connect with peers, and gaming exploded. A study spanning 14 countries found that 73% of kids ages 8–11 and 71% of those ages 12–15 reported playing video games daily. Global video game revenue surged by approximately 20% in 2020, making gaming a bigger market than the film and North American sports industries combined.

Unfortunately, reports of OSEAC similarly skyrocketed during the pandemic, increasing by 73% between 2019 and 2021. While data on the rate of OSEAC committed via gaming platforms is scarce, even the little information available is concerning. In the last six months of 2021, Discord, a common communication platform for gamers, received 44,390 reports related to child safety and disabled 1,293,124 accounts. OSEAC cases are pervasive throughout the industry and have been identified on Fortnite, Minecraft, League of Legends, Pokémon Go, Roblox, and Twitch.

Gaming environments, as structured, include features that can facilitate OSEAC. As games become increasingly social, through in-game text and audio chats or through off-platform communities like Discord and Twitch, they create a clear channel for perpetrators to engage anonymously with children. Livestreaming, now common in the gaming world, provides new means for OSEAC, and its real-time nature makes it more difficult for caregivers to identify the abuse and intervene. Trading of in-game currency gives perpetrators further tools to manipulate and groom children. And with the launch of the Metaverse, despite current protections against harassment and abuse, new and unknown risks are likely to emerge.

A robust, holistic and multi-sectoral response that considers the needs of children is necessary to address OSEAC, including OSEAC in gaming. This means interventions — from access to effective, inclusive, prevention-focused education and online safety tools to provision of victim-centered, trauma-informed justice and counseling services for survivors — that better address OSEAC’s root causes and make gaming environments safer for all children. Gaming companies cannot be relied upon to self-regulate, yet current policies to ensure safety remain a patchwork. ChildFund’s analysis of existing policies outlines current U.S. government efforts and recommendations to prevent abuse, increase support to survivors and allow better access to justice.

Gaming companies have a crucial role to play, too. Simple platform choices that prioritize age-appropriate design, like limiting location sharing and defaulting to strict privacy settings, can help prevent the abuse and exploitation of children via video games. For example, gaming companies should make OSEAC reporting mechanisms easily understandable for children and their families. Filling out complaint reports can often be confusing, time-consuming and retraumatizing for victims of OSEAC. Companies can make simple changes — like writing important reporting instructions in understandable, child-friendly language and highlighting key details — to make OSEAC reporting easier and minimize harm to survivors. By using child-centered design that prioritizes children’s well-being, companies can better protect their young users, as well as more quickly identify and respond to threats to their safety.

Gaming companies should also regularly release easy-to-understand transparency reports that are accessible to all users, including children and their families. These reports should include clear community guidelines, clear figures on how OSEAC manifests on their platforms, and details of how they respond to these vulnerabilities and complaints, including how often they suspend accounts and remove content. Such transparency is critical for greater platform accountability and improved data on gaming-related OSEAC. But, more importantly, these reports will allow children and their loved ones to make informed choices about the platforms they choose to use, as well as about their safety and rights.

Congress can also improve OSEAC reporting. Currently, once online platforms, including gaming companies, detect or receive complaints of suspected OSEAC, they are required to submit reports to NCMEC’s CyberTipline and retain the included files and data for 90 days. However, overstretched law enforcement agencies are often unable to start their investigations within this short timeframe, resulting in critical data being lost. The End Network Distribution (END) Child Exploitation Act (S.365/H.R.1198) expands the required data retention period from 90 to 180 days, giving law enforcement more time to use this crucial information to pursue justice and identify survivors.

Children deserve the right to safely enjoy gaming. As children continue to spend more time on these platforms, concerted and holistic efforts must be made by stakeholders to keep them safe, online and off.

Kirsten Mettler is a senior at Stanford University studying Political Science and Feminist, Gender, and Sexuality Studies. She has interned with ChildFund, as well as several other nonprofit, for-profit and academic entities, on issues related to telecommunications, sexual violence and children’s issues. Kirsten has published work on gender-based violence in gaming more broadly and presented on the subject at RightsCon 2022.
