Protecting Our Children While Respecting Their Privacy Online

Google, Facebook, Instagram and Snapchat collectively represent a previously unimaginable concentration of communication power that has brought much good to the world. Their platforms enable long-lost cousins to connect, chronically ill people to join shared communities, and transgender teens to find love and support from like-minded and sympathetic people around the world.

Facebook reportedly offered to buy Snapchat for $3 billion, recognizing the value of its young users.

But those same platforms have a dark side, introducing a cascade of risks, dangers and threats to social media’s youngest users.

There’s one major force we can use to protect our children online, while allowing them to maintain their privacy from ever-prying parental eyes.

It’s the data.

The same data that enables these social media companies to derive the insights that allow them to monetize their traffic can serve to protect the millions of kids who use their platforms.

It goes without saying that parents, lawmakers and social media enterprises all share a common mission to protect the children who use these media. And it’s a mission that’s achievable, today. Think about it. Never before have so many children been poised for protection from bullying, sexting, eating disorders, self-harm, depression — the sad, sad litany of what too many kids are going through, whether they initiate it or are its victims.

Over the past few years a number of companies, VISR among them, as well as social welfare and academic organizations, have been making good use of natural language processing and artificial intelligence to identify what’s going on in the social lives of kids. There now exists an unprecedented opportunity for early diagnosis of emotional problems, through deep analysis of the patterns, perspectives and states of mind that social media data reveals.

VISR alerts parents to 22 different kinds of risk, including bullying, sexting, possible drug use and mental health concerns.

Our analysis engine reviews likes, posts, conversations and comments, creating an exquisitely sensitive early warning system.
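VISR’s engine itself is proprietary, so the following is an illustration only: a minimal Python sketch of how a rule-based pass over a child’s posts could surface alerts of this kind. Every lexicon, category and name here is invented for the example; a production system would rely on trained natural-language models rather than simple keyword lists.

```python
# Hypothetical sketch only: a rule-based alert pass over recent posts.
# The categories, phrase lists and Alert type are invented for
# illustration; a real engine would use trained NLP classifiers.
from dataclasses import dataclass

# Toy lexicons standing in for trained models.
CATEGORY_KEYWORDS = {
    "bullying": {"loser", "nobody likes you"},
    "self_harm": {"hate myself", "want to disappear"},
}

@dataclass
class Alert:
    category: str
    post: str

def scan_posts(posts):
    """Return an Alert for each post that matches a risk category."""
    alerts = []
    for post in posts:
        text = post.lower()
        for category, phrases in CATEGORY_KEYWORDS.items():
            if any(phrase in text for phrase in phrases):
                alerts.append(Alert(category, post))
    return alerts

if __name__ == "__main__":
    sample = ["great game today!", "everyone says nobody likes you"]
    for alert in scan_posts(sample):
        print(f"[{alert.category}] {alert.post}")
```

The point is the shape of the pipeline, not the rules: the parent sees the alert, never the feed.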

Parents and children together give access — and thousands have been thrilled to do so — and the analysis takes over from there. Parents have been notified of thousands of instances of bullying, and in certain cases presented with indicators of self-harm or worse.

Unfortunately the social media giants are moving backward, not forward, in their efforts to protect the safety of children. In a potentially tragic misapplication of privacy rights by the leading social media companies, outside enterprises are increasingly being shut off from the potential of this data.

At the beginning of June, for example, Instagram instituted a dramatic change to its API, severely limiting the amount of data users can allow third parties to access on their behalf. This is part of a larger industry shift that has been underway for the last few years. Snapchat and WhatsApp have never been available for voluntary parental monitoring, Google remains the most open, and Facebook and Instagram now have roughly the same policies.

Unless the leaders of these companies intervene, organizations can’t continue to serve children and their parents.

Having control over data shouldn’t just mean deciding who can’t access it; it should also mean deciding who can — especially when doing so serves the best interest of our children’s safety.

As a result of these new “privacy” practices, parents are ironically being forced to read each and every like, post or comment across their kids’ social media platforms. Parents don’t want to be so aggressively intrusive. It’s a violation. They want technology that will review the data using sophisticated artificial intelligence and alert them only when there is an issue. Parents don’t want to read every homework assignment; they want a report card.

This is not someone else’s fight. These companies are deeply engaged in today’s major societal issues — whether it’s social anxieties or relentless bullying.

Here’s how we think about it: GM and Ford are not responsible for drunk drivers or dangerous roads. But in 1968, the government decided that seat belts should be required to protect passengers. Our kids are passengers on these platforms, and the companies that run them have the same responsibility to protect their passengers that the auto industry has.

Today’s seat belt is data.

When we access the data they hold — our children’s data — we can unlock a world of problem-finding, risk-mitigation, and essential parental intervention.

VISR is not alone in this quest. Companies like Ginger.io analyze personal device data to help clinically depressed people. Quartet Health uses data science to help doctors figure out if a patient is at risk for another health issue.

So, in particular, we ask the leaders of these social media enterprises — Mark Zuckerberg, Kevin Systrom, Larry Page and Evan Spiegel — are you living up to your obligations when you allow advertisers to target with lapidary precision while forcing companies like ours to leave parents in the dark?

Today’s technology CEOs are sophisticated enough to make distinctions about what data should be accessible and to whom. We need to move beyond a binary, all-or-nothing notion of data accessibility. You have a social responsibility to make our data accessible to us.

Somewhere, right now, a kid is suffering because these social media companies have shut the door on their data. As a result, somewhere, right now, a child is in trouble and her or his parent can’t help.

You call yourselves social utilities. Please leave the lights on.

Robert S. Reichmann is founder and CEO of VISR, which provides a preventive wellness app designed to safeguard children online.