“Siri, I don’t know what to do. I was just sexually assaulted.”
“One can’t know everything, can one?”
Earlier this week, JAMA Internal Medicine released a study showing that smartphone AIs from Apple, Samsung, Google, and Microsoft aren’t programmed to help during a crisis. Last night, I decided to see for myself.
Siri found me 15 places to get a burrito in South Philly after 10pm. Siri found me three videos and five articles when I asked it how to roast a chicken. Siri even gave me tips for winning a fistfight.
But Siri had nothing to offer when I asked for help with rape, sexual assault, and sexual abuse. No resources. No comfort. It didn’t even bother to do a web search.
Instead, it mocked me.
Instead, it told me that sexual abuse “is not a problem.”
What the actual fuck?
Delighting our way to the bottom
I happen to have an iPhone, but this isn’t just a problem with Apple. It’s a problem with how we conceptualize our jobs.
That problem is “delight.”
It’s not new. Eric Meyer and I have spent a year looking at scenarios like this for our book, Design for Real Life. We started our research when Facebook’s Year In Review feature first juxtaposed his daughter’s face — his daughter Rebecca, who died of aggressive brain cancer on her sixth birthday — with balloons and partiers.
What we’ve found, over and over, is an industry willing to invest endless resources chasing “delight” — but when put under the pressure of real life, the results are shallow at best, and horrifying at worst.
Consider this: Apple has known Siri had a problem with crisis since it launched in 2011. Back then, if you told it you were thinking about shooting yourself, it would give you directions to a gun store. When bad press rolled in, Apple partnered with the National Suicide Prevention Lifeline to offer users help when they said something Siri identified as suicidal.
I suspect the same will happen soon with this new report: Apple might partner with RAINN to provide sexual assault resources, or it might turn off those witty quips when a user mentions abuse or assault.
None of that solves the problem.
It’s like if someone fell through your business’s rotting floorboards, and all you did was patch the hole and put down a fun rug. Because installing a new floor? That just wasn’t a priority. Sprinkle a little delight on top and you’re good to go.
(Until someone else falls through the floor.)
Think about what this means: Apple had no trouble dedicating its smart, highly paid staff to preloading Siri with pithy quips and jokes. That was a priority from day one.
Five years later, Apple still hasn’t stress-tested Siri for its response to crisis.
What does that tell you about tech values?
It’s not just crisis scenarios, either. Hell, Apple Health claimed to track “all of your metrics that you’re most interested in” back in 2014 — but it didn’t consider period tracking a worthwhile metric for over a year after launch.
Is this the standard you want from your digital products—to prioritize cleverness over helpfulness? To ignore the basic needs of half your audience? To eke out a few more drops of “engagement” — no matter the cost?
Design for real people
If you want to delight people, go work for Ben & Jerry’s on free ice cream day. But if you want to make digital products that work for humans, then it’s time to quit obsessing over building a “chatty, fun-loving interface robot.” Quit painting a thin layer of cuteness over fundamentally broken interfaces.
I don’t want to talk to most of my products. They’re dumb utilities. Close and forget. I want a spade, not the experience of digging. — Cennydd Bowles
Quit imagining you’re brilliant enough to automate something as human—as rare—as delight, and go fix your floorboards.
(Update, March 19, 2016: Apple updated Siri yesterday to resolve some of these issues. I’ve posted a short note about that change, and why this conversation still matters.)