Sometime yesterday, while this article was blowing up, Apple updated Siri’s response to some of the queries I referenced, doing just what I’d guessed they’d do: sending them to RAINN’s National Sexual Assault Hotline. It’s a good change, and I’m glad they made it.
And yet, everything I said still stands.
Apple still spent five years more invested in coming up with shallow witticisms than in stress-testing its work against real life.
Siri was still never trained to understand a basic human lexicon—the lexicon of abuse and assault, which affects a huge percentage of people, particularly (though not exclusively) women, over their lifetimes.
Here’s the thing: I don’t want or expect Siri to know anything about sexual assault. I’d actually prefer that Siri not attempt to provide customized help to people who are abused or raped.
I just want it to stop writing off queries as “not a problem.”
I just want it to do what it’s designed to do: look things up for users, rather than say it doesn’t understand and leave them with a dead end.
I just want it to stop making jokes when users ask for help.
So yeah, I’m really glad Apple fixed this one problem with Siri. But the underlying issue remains: an industry so fixated on a shallow definition of delight that it prioritizes “fun” over usefulness, and ultimately fails people when they need it most.