Balancing Education, Design, and Parental Control in Kids’ Social Media Use
Facebook made a splash this week with the introduction of Messenger Kids, a version of its popular Messenger app meant for the under-13 set. The app allows kids to send messages, including GIFs, videos, and decorated photos, to a parent-approved set of contacts. The app stands out from the products of other big-name social media platforms in that it's designed specifically for younger kids, something few other companies have tried due to privacy laws that require parental consent for users under 13. However, Facebook leverages the magnitude of its existing parental user base to make consent easy: parents must use their own Facebook accounts to create Messenger Kids accounts for their children, and they can manage parental controls entirely through those same accounts.
As with every technology targeted at children, Messenger Kids has immediately earned both praise and condemnation from parents, researchers, consumer advocates, and others with stakes in the increasingly combined worlds of parenting and technology. Many parents, including some quoted in a recent New York Times article, are resigned to the inevitability of kids' technology use, and appreciate that Messenger Kids allows them at least some control over that use. They're also excited about the app's many family-friendly uses, like chatting with grandparents or keeping in touch with mom when she's away. Others are concerned that yet another kid-directed app will steal more of their kids' time from offline play and face-to-face interaction. Still others worry that introducing kids to the Facebook brand at a young age is priming them to become unquestioning consumers of Facebook's other products as they get older.
The Cornell Social Media Lab has begun to take an in-depth look at risk-benefit tradeoffs of technology just like this one. “The idea of a children’s messaging application with built-in parental controls is a response to today’s reality. Families live in a digitally connected world, and children are surrounded by devices or even own a device like a tablet or smartphone,” says the SML’s Director, Professor Natalie Bazarova. Facebook claims to be responding to this reality — in its introduction to Messenger Kids, it touts the balance of connectedness and control that the app offers. It reminds readers that the app was developed with the input of families, educators, researchers, and advocates, many of whom are concerned about apps that allow kids to contact or be contacted by strangers. Messenger Kids mitigates this concern by requiring parents to approve their kids’ contact lists.
However, "stranger danger" isn't the only thing about kids' technology use that has parents worried. "Screen time" — how long kids are using their devices, and when — is also a topic of concern for many parents, and it's not clear whether Messenger Kids lets parents control these aspects of the app. "Parents should closely monitor and regulate children's use of a device and their time spent on it so that it does not displace other developmentally appropriate activities such as outdoor play, exercise time, or face-to-face interactions with peers," says Bazarova. She recommends that parents follow the recently updated guidelines for children's media use from the American Academy of Pediatrics, which advises families to develop a family media use plan that balances online fun with healthy development. According to Bazarova, "Facebook can implement built-in mechanisms in the app itself, like a turn-off timer, that support a healthy media diet and regulate time spent on devices."
In addition to app designs and family guidelines, the Social Media Lab has been looking at new ways to educate young people about the risks and benefits of social media. While kids often learn about issues like media literacy or appropriate online behavior in school, they typically have no way to practice these skills before being let loose into online spaces populated by adults and adult-oriented content. To help fill this education gap, the Lab is developing a new tool called Social Media TestDrive. TestDrive acts as a social media simulator, allowing kids to practice managing their privacy, making appropriate posts, and communicating with others. However, instead of interacting with potentially unknown others on the open Internet, kids interact with "bots": pre-programmed, simulated "users" that act and respond in realistic ways. Each TestDrive user is also separated into their own instance of the site, so their actions can't follow them into the "real world". Like a driving simulator for a novice driver, TestDrive lets kids explore, make mistakes, and practice their newfound skills in a way that's safe and empowering. The Lab is also partnering with educators and advocacy groups like PRYDE to help deploy TestDrive to classrooms, camps, and after-school programs that can benefit from it.
Education is one way to deal with the extensively documented tradeoffs between the risks and opportunities of Internet and social media use for young people. Appropriate risk management strategies may differ from child to child, and for the same child at different developmental stages. Even Facebook itself acknowledges the delicate risk-opportunity balance, saying that it intends to work with families, academics, and other experts to create tools that are safe and beneficial for kids. The Social Media Lab is continuing to work toward understanding these tradeoffs, leading to better recommendations for families and to more advanced educational tools.