Should we be saying Please & Thank You to Alexa?
For nearly a year now I have been asking my children to say please and thank you when asking Alexa questions, and I think I am wrong to do this.
After much Googling, I discovered research and articles raising a concern that the next generation may be becoming rude by not using human conversational etiquette with voice assistants, with 42% of 9–16-year-olds having accessed voice recognition “gadgets” at home. There seems to be no research that proves this assumption true, regardless of what the media is saying (happy to be proven wrong if anyone has any research?).
Should we be polite to our voice assistants?
Fast forward to 2050, when our robot overlords are deciding our human fate, and we may look back on the early 2000s and think that maybe we should have been a bit nicer to that speaker in the kitchen playing music.
Is the English language the problem?
In the UK, please, thank you, excuse me, even a sorry to the person who walks into you, are all commonplace amongst adults. In other cultures, though, this is not the case: speakers of some languages do not find command-based requests rude, and some languages do not even have a word for “please”.
I monitored my own behaviour over a few weeks and found that current voice assistants got me the answer I wanted when I communicated in a more command-based way. I do not believe it made me any ruder for it; after all, I do not type into Google:
Please, can you tell me when the Red Lion Pub on St. Paul’s Street is open until this evening, thank you.
red lion pub st. pauls open times
I did find that when my interactions with voice assistants didn’t go well, any rudeness that transpired was down to the voice assistant not delivering the information or action I intended, which created the annoyance. That is when I understood what the media was trying to get at with voice assistants and children’s behaviour (kinda). My reactions to the technology not working would bleed out into my human interactions, in the same way you deflect annoyance onto others when things do not work as you hope or expect. This behaviour is not specific to voice assistants; these devices are just getting the bad press at the moment as the latest technology. It happens with all technology: it doesn’t work until it works.
As voice assistants take on even more human characteristics and voice becomes more ingrained in our lives, we have to be prepared for a situation where a voice assistant and a human become indistinguishable in a voice conversation. This is where the ethical lines have to be drawn: when anyone, in particular children, can’t differentiate between what is human and what is a machine, we have a problem.
If we feel politeness is a valuable part of our language, we should be advocating for the creators of these platforms to add a cultural intent that goes beyond language accents and adapts to the country and person being conversed with. People, especially children, respond best to positive reinforcement; if you want someone to say please and thank you, then you had better start saying please and thank you.
The world of work: politeness as UX
All pleasantries will go out of the window when performance dictates how responsive and productive voice tools are within a business environment. If dropping politeness can save your workers 50% of their time, the majority of companies would agree that is what should be done.
Does politeness become bad UX in this situation?
“add 1 item”.
“Please add 1 item to the inventory list, thank you.”
I am still on the fence about whether my children should be speaking to Alexa in a polite way. For now, though, I will continue to teach them to be polite to anything living and explain to them that Alexa will now be called Computer (its new wake word) and that it is exactly that: a computer.
If you’re building applications for voice, it is good to understand what your ethical guidance is. @rarelyhq now applies a mindfulness stage to voice app development, with the aim of understanding the impact of the proposed voice design on users’ emotional behaviour.
Future robot overlords, I didn’t mean any of that — please keep me alive 😊