Can AI-powered chatbots create a safe space while helping deconstruct rape myth acceptance?

Anjali Chandrashekar · Published in Companion Chat · 3 min read · Aug 31, 2017


Introducing Companion Chat, a chatbot for sexual assault victims

“I never told anyone what happened.”

“I did not press charges or file a police report, so these costs simply reflect what I have spent in the aftermath of grieving, processing, and working through. Legal costs can be outrageous for many survivors, but I didn’t pursue that process.”

“I only did this for six months. I don’t like to talk about personal things with strangers, and I found therapy really, really hard.”

These are just a few excerpts that we pulled from a quick search on Medium for first-hand accounts and writing by survivors of sexual assault and harassment. Earlier this summer I met with a friend and developer, Adi Sidapara, to talk about our shared interests at the intersection of design, technology, and social innovation. Adi, currently a high school senior, was frustrated by the stigma around sexual assault education and awareness, especially given how early that stigma takes hold.

We started to ask questions. Why aren’t people reporting harassment and assault? Who is responsible for educating victims on their conditions? How accessible is sexual assault education?

After doing a little research, we found a study from Marywood University showing a strong correlation between police officers’ rape myth acceptance and their skill in interviewing sexual assault victims. This highlighted how even the “first responders” to many sexual harassment and assault crimes are often not equipped with the skills necessary to interact with victims.

We determined that there are two main scenarios in which such events go unreported:

  1. Victims are aware of their assault but are afraid to report it, fearing human bias and re-victimization.
  2. Victims aren’t aware of the severity of their situation and aren’t sure what kind of action they could take.

To address the issue of human bias, we are creating Companion Chat, a chatbot that anonymously screens victims’ situations and guides them to the appropriate trauma-coping and law enforcement resources. The system uses a combination of keyword extraction and summarization algorithms to determine the appropriate routes of action. It is currently built with the Messenger SDK for Facebook, Node.js, and MongoDB, but we hope to partner with and build for other platforms to ensure wide accessibility. We also plan to open-source the anonymized data sets derived from conversations, given their extensive research potential in the social sciences and law enforcement.
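To give a sense of how keyword-based routing can work, here is a minimal sketch in Node.js. This is purely illustrative and not the actual Companion Chat pipeline: the category names, keyword lists, and function names are assumptions made up for this example, and a real system would use proper NLP (stemming, summarization, context handling) rather than exact word matching.

```javascript
// Hypothetical resource categories mapped to trigger keywords.
// (Illustrative only — not Companion Chat's real data.)
const RESOURCE_ROUTES = {
  trauma: ["flashback", "nightmare", "panic", "anxious", "afraid"],
  legal: ["report", "police", "charges", "evidence", "lawyer"],
};

// Naive keyword extraction: lowercase, strip punctuation, split on whitespace.
function extractKeywords(message) {
  return message
    .toLowerCase()
    .replace(/[^a-z\s]/g, "")
    .split(/\s+/)
    .filter(Boolean);
}

// Route a message to every resource category whose keyword list it matches.
function routeMessage(message) {
  const words = new Set(extractKeywords(message));
  return Object.keys(RESOURCE_ROUTES).filter((category) =>
    RESOURCE_ROUTES[category].some((kw) => words.has(kw))
  );
}
```

For example, `routeMessage("I keep having a flashback and panic at night")` would route to the trauma-coping category, while a message mentioning filing a report with the police would route to legal resources.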

Taking an anonymized approach for educating sexual assault victims and guiding them to the appropriate resources is necessary because it removes the fear of human judgement. Mindshare UK and Goldsmiths University found that people are inclined to trust chatbots with sensitive information and that they are open to receiving advice from these AI services. By building an anonymous route of seeking help, we hope to make sexual assault education more accessible.

We are looking for like-minded people in software engineering, interaction design, psychology, and social growth hacking to join our team or advise us. Our team is founded on collaboration, and we hope that together we can increase sexual assault education and awareness and destigmatize victims. Reach out if you would like to talk to us at




Designer, strategist, and visual storyteller making experiences at the intersection of design, technology, and social innovation.