Facebook using AI for suicide detection & intervention

Mitu Khandaker
Spirit AI
3 min read · Jan 3, 2018

Last year, Facebook announced an initiative to use AI for suicide detection and intervention. This points to a broader trend of AI techniques working together with humans to take positive action.

At Spirit AI, a large part of our work focuses on ways in which AI techniques can help people do what they do better — whether that’s authoring compelling characters and narratives, or helping human moderators spot unwanted or otherwise troubling interactions between users in online games and other communities.

Plenty of interesting predictions are being made right now about the future of AI and other tech, but this kind of collaborative relationship, in which AI and humans work together for social good, is an important one.

At the end of November 2017, Mark Zuckerberg posted this on his own Facebook page:

“Here’s a good use of AI: helping prevent suicide.

“Starting today we’re upgrading our AI tools to identify when someone is expressing thoughts about suicide on Facebook so we can help get them the support they need quickly. In the last month alone, these AI tools have helped us connect with first responders quickly more than 100 times.

“With all the fear about how AI may be harmful in the future, it’s good to remind ourselves how AI is actually helping save people’s lives today.

“There’s a lot more we can do to improve this further. Today, these AI tools mostly use pattern recognition to identify signals — like comments asking if someone is okay — and then quickly report them to our teams working 24/7 around the world to get people help within minutes. In the future, AI will be able to understand more of the subtle nuances of language, and will be able to identify different issues beyond suicide as well, including quickly spotting more kinds of bullying and hate.

“Suicide is one of the leading causes of death for young people, and this is a new approach to prevention. We’re going to keep working closely with our partners at Save.org, National Suicide Prevention Lifeline ‘1–800–273-TALK (8255)’, Forefront Suicide Prevent, and with first responders to keep improving. If we can use AI to help people be there for their family and friends, that’s an important and positive step forward.”
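Zuckerberg’s description is light on detail, but the core idea he sketches, pattern recognition over posts and comments that escalates concerning cases to a human team, can be illustrated with a toy example. The sketch below is purely hypothetical: the phrase list, the `concern_score` and `flag_for_review` functions, and the threshold are all assumptions for illustration, not Facebook’s actual system, which uses trained models rather than hand-written rules.

```python
import re
from dataclasses import dataclass, field

# Hypothetical concern signals, loosely inspired by the example in the quote
# ("comments asking if someone is okay"). A real system would learn these
# signals from data rather than list them by hand.
CONCERN_PATTERNS = [
    r"\bare you (ok|okay|alright)\b",
    r"\bplease (talk to|call) (me|someone)\b",
    r"\bthinking of you\b",
    r"\bhere for you\b",
]


@dataclass
class Post:
    post_id: str
    text: str
    comments: list = field(default_factory=list)


def concern_score(post: Post) -> int:
    """Count pattern hits across the post text and its comments."""
    hits = 0
    for snippet in [post.text, *post.comments]:
        for pattern in CONCERN_PATTERNS:
            if re.search(pattern, snippet, flags=re.IGNORECASE):
                hits += 1
    return hits


def flag_for_review(post: Post, threshold: int = 2) -> bool:
    """Escalate to a human review queue once enough signals accumulate."""
    return concern_score(post) >= threshold


if __name__ == "__main__":
    example = Post(
        post_id="123",
        text="I don't see the point anymore.",
        comments=["Are you okay?", "Please talk to me, I'm here for you."],
    )
    if flag_for_review(example):
        print(f"Post {example.post_id}: route to 24/7 review team")
```

The point of the human-in-the-loop design is the last step: the model never acts alone, it only surfaces likely cases so that trained people, and where needed first responders, can reach someone within minutes.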

This was widely covered in the tech and business press, where you can read more: http://uk.businessinsider.com/facebook-is-using-ai-to-spot-suicidal-tendencies-2017-11

Facebook taking this important step points to an exciting trend of using AI techniques to spot, and intervene in, harassment and other kinds of troubling interactions in online communities. This is a space we’re working in too, with Ally, helping our partners across games and other industries create safer online spaces. We’re excited to see more work being done here in 2018.

