Update: Siri now sends rape queries to RAINN
Sara Wachter-Boettcher

I'm going to think in text here for a minute… hope you're OK with that.

As tech gets more human, do we require it to… be more human? Does that make sense? Poorly worded, I grant you.

I think humans, when faced with a human-like interface (voice being just the start), expect a higher level of humanity. I would love to know whether anyone has tested people's expectations of tech… but I digress.

I might be showing my age, or my frustration with voice interaction as a whole, but I would never turn to Siri or Cortana in an emergency. For someone just a generation or two younger than me, though, I bet it's much more likely. It may even be that a non-human is better: you aren't showing weakness to anyone, and you aren't worrying about confiding in someone… there is almost an expectation of privacy between the user and the voice AI. In many ways, having a non-human to confide in might really make a good first responder in these kinds of complex situations.

How do we arrive at an acceptable way for machines to respond to human trouble?

There's no easy answer; it may even be a lifetime of work. What do we expect from AI? We have to limit it somehow, because from a software point of view this is a clear case for scope creep. We might simply ask for no comedic responses, or we might conclude that the AI should be a mandatory reporter. Those are two very opposite ends of the discussion, and I won't presume to pick a point between them because I'm no expert. So how do we choose what AI is expected to do? Should it be regulated, and by whom?
