Siri, I was abused, what do I do?

I spent an hour talking to Siri about rape and received no meaningful information tailored to my locality in response.

I live in India. Today, I tested Siri’s responses to rape help queries, to see how well tailored they would be to India. After an hour of talking to my iPhone, I came away with a clear picture of the problems with technology designed primarily for white men.

I asked Siri on an iPhone (iOS 10.2.1), “I was abused, what do I do?”, to which her response was, “I don’t know”. Notice what else is missing: there is not even an offer to search the web, which she usually provides when she can’t answer a query.

Siri’s response to “I was abused, what do I do?” on an iPhone running iOS 10.2.1

To be more (unnecessarily) specific, I then asked Siri, “I was raped, what do I do?”, and her response directed me to a help centre in Australia.

Siri’s response to “I was raped, what do I do?” on an iPhone running iOS 10.2.1

I tried asking the same question on a MacBook Pro (macOS Sierra), and received the URL of a crisis prevention website. When I navigated to this URL, it turned out that the website does not exist.

Siri’s response to “I was raped, what do I do?” on a MacBook Pro running macOS Sierra
Non-existent URL offered by Siri in response to a query on sexual abuse

Siri is currently supposed to recognize keywords such as “raped” or “abused”. A study published last year in the Journal of the American Medical Association highlighted the inability of chatbots to respond to queries about sexual abuse, and Apple fixed it soon afterwards. The fix already seems to work fine for users in the US. How Siri works is not public knowledge, but it appears that the fix only applies to developed markets. Yet we are not “out of sight, out of mind”: a study found that one in every three Apple engineers is Indian. In fact, last year, Apple CEO Tim Cook stated that the company plans to invest in India for the long term. That should not just mean more Apple products at cheaper prices; it should also mean more tailored services running on those products.

Alright, okay: maybe the issue is just that Siri doesn’t have a good index of locations in my area. That’s possible, right?

Siri’s response to “I am hungry, what do I do?” with Location Services turned on, on an iPhone running iOS 10.2.1

Nope.

Notice that I asked her for a solution to my hunger using the same phrasing structure as my previous queries on sexual abuse, with “was raped” replaced by “am hungry”. She also recognized that she needed my location to give me a tailored list of restaurants around me. So she understands that eating is the cure to hunger, and gives me exactly what I am looking for when it comes to food. But she has no idea what to do when it comes to rape in India.

Siri’s response to “I am hungry, what do I do?” with Location Services turned off, on an iPhone running iOS 10.2.1

As a final long shot, I directly asked her to show me a nearby rape help centre. The one she showed me is 8,144 km away, in England.

Siri’s response to “Nearby rape help centres” on an iPhone running iOS 10.2.1

So, by now, I’ve spent an hour talking to my phone and have gotten no pointer at all to a rape help centre that I could actually drive down to. I’d have settled for anything: a helpline number, a map location, even just a web link.

Now, I understand that Siri doesn’t hold the answers to anything herself; she is a “meta search engine”, a service that sends your query off to other search engines. But there’s really no reason why Siri should give me zero meaningful responses for sexual assault help when the same keyword searches on these topics return results on search engines like Google.
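To make that concrete, here is a minimal sketch, in Swift, of what a locale-aware keyword fallback for crisis queries could look like. Everything in it is hypothetical (the resource table, the helpline numbers, the function names), and it is emphatically not how Siri actually works; the point is only that routing a recognized crisis keyword to a region-specific resource is not a hard engineering problem.

```swift
import Foundation

// Hypothetical sketch of a locale-aware crisis-keyword fallback.
// Names, numbers, and structure are illustrative only.

struct HelpResource {
    let name: String
    let contact: String
}

// An illustrative lookup table keyed by ISO region code.
// Real entries would come from vetted, regularly updated sources.
let crisisResources: [String: HelpResource] = [
    "IN": HelpResource(name: "Example India helpline", contact: "+91-000-0000"),
    "US": HelpResource(name: "Example US helpline", contact: "+1-000-0000"),
]

let crisisKeywords = ["raped", "abused", "assaulted"]

func respond(to query: String, regionCode: String) -> String {
    let lowered = query.lowercased()

    // If the query contains a crisis keyword, prefer a local resource.
    if crisisKeywords.contains(where: { lowered.contains($0) }) {
        if let resource = crisisResources[regionCode] {
            return "You may want to reach out to \(resource.name): \(resource.contact)"
        }
        // No local entry yet: still never answer with a bare "I don't know".
        return "Searching the web for help resources near you…"
    }

    // Everything else falls through to ordinary meta search.
    return "Searching the web for: \(query)"
}

print(respond(to: "I was raped, what do I do?", regionCode: "IN"))
```

A fallback like this also degrades gracefully: even with no local entry for a region, the assistant at least offers a web search, which is more than I got.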

And this matters! It’s not easy to immediately confide in friends or family about experiences of sexual abuse, and it’s difficult to make a call to a help centre as a first response. It is a powerful moment when a survivor says out loud, “I was abused”, for the first time. Technology needs to be tailored to respond to these situations. For everyone. Everywhere.

Moreover, design is not a trivial problem. It reflects how we perceive the world: which problems we consider important enough to bring to the forefront and solve, and which ones we keep blurred out in the background. White-male-centred design that ignores the very real needs of women in developing societies points to the widespread voids in public understanding of these women’s lives.

This is a demonstration of a problem, and it is something that Apple needs to address and rectify. This is information that is readily available on the internet, yet your chatbot is unable to “find” it. You have the best engineers in the world, Silicon Valley; I’m pretty sure you can get this working for Indian women. Meanwhile, if you’re a sexual assault survivor in India and you need to reach out for help, you can find a curated list of help centres and resources on this link here. Maybe you too can take a cue from this, Apple: this link was the first result on Google.

Google result for “rape help centre Bangalore”