What assistance is a virtual assistant going to give you in a crisis?
Sara Wachter-Boettcher’s interesting Medium article, “Dear tech, you suck at delight”, highlights the deficiencies of digital assistants when it comes to dealing with crises. Instead of working out ways to really help people, it seems that most companies opt for “funny” or “smart” comments that in certain situations are utterly inappropriate.
Needless to say, programming a virtual assistant is a complex task. How should it respond when somebody says they want to shoot themselves or jump off a bridge: by telling them where the nearest gun store or viaduct is? After the inevitable media frenzy, Apple reached an agreement with the US National Suicide Prevention Lifeline, and Siri now suggests that the user make that call. But there are plenty of other situations, such as when somebody is being held up or raped, where virtual assistants are still clearly not getting it right and respond with humorous comments.
Why do digital assistants respond in this way? Why are they unable to offer a solution in crisis situations, such as providing emergency service numbers, capturing the user's exact location, or switching to record mode, rather than assuming the user is joking? Tech companies need to understand that some users will consult their digital assistant in a crisis, and they need to program virtual assistants to respond adequately.
One of the defining characteristics of humans is the ability to tell the difference between a comment that happens to include, say, the word rape, and somebody asking for help after they have been sexually assaulted.
Given the current state of technology, don’t we have a right to expect a better answer from Siri if we tell it (her?) that we’re being followed and are afraid than, “the only thing we have to fear is fear itself”?
It may be that the tech companies’ attitude is that your virtual assistant is not there for real crises, and is simply there to help you find a restaurant or to explain the origins of the Seven Years’ War. If you have a real problem, call the cops. It may also be that the technology is still not sufficiently well developed to detect a full-blown emergency. Either way, I’ve had enough of Siri’s “humorous” comments.
The company that really wants to set itself apart from the competition could do so by creating a virtual assistant able to be of use in a crisis. For the moment, nobody seems willing to step up to the plate, which is, to say the least, a pity.
(In Spanish, here)