How Talking to Smart Speakers Might Change How We Interact with Technology and Other People
Google Assistant and Amazon Alexa are both viable options for getting an always-on assistant into your household: one you can always talk to, and one that might become part of your everyday life as a constant companion.
Assistants, and Google Assistant specifically, are very different from the classic technological interfaces we normally use to access and retrieve information. Instead of an input method in the form of a keyboard, we use our voice. Instead of showing information on a screen, assistants must talk back to us. (Of course, this is only true for Google Home-like devices; on smartphones and smart displays, assistants can rely on text input and visual output.)
When the Assistant works well, this approach creates a more natural, conversational mode of operation. It also demands more precision: if you want the opening hours of a specific shop, you really have to know the name of the street it's on. You can't just say "Costco opening hours", because you might get the opening hours for the wrong place in the city. You actually have to say, for example, "opening hours of Costco on 52nd Street". And even if you simply ask, "What are the opening hours of Costco?", you still need to know the address of your closest Costco, since the Assistant will tell you the opening hours together with the address.
I think that's a good thing. It forces us to think more and memorize more, a skill that is endangered in the age of "Let me just google that."
It already starts with much simpler things. You have to form a full sentence when you talk to your smart assistant. Sometimes it's enough to say "Opening hours Costco", but that doesn't feel natural. You'll want to speak in whole sentences, and sometimes you have to pause first and think about what you want to say. That helps you structure your thoughts. In contrast, let's look at a typical Google search query:
You'll notice that a typed query like "weather berlin" is much less elaborate than the spoken command. But it's good enough, and Google will be able to process it. When you talk, however, it's just much more natural and conversational to ask, "Hey Google, what's the weather in Berlin?"
Another thing I came across: we expect our intelligent assistants to pretty much read our thoughts, even though humans themselves often have trouble understanding each other. I noticed that myself. I was trying to pause loud music in the kitchen and got so frustrated that my Google Home wouldn't hear me. But then I thought: well, that music is really damn loud; no wonder it couldn't hear me from a distance. I should move closer.
This teaches us to be more forgiving with tech. Which is a good thing, because nobody's perfect and no machine is perfect, even though we've come to expect exactly that.
This perceived entitlement to perfect technology is dangerous. It makes us forget to be kind, courteous, and forgiving with one another. Instead, we're growing accustomed to machines without feelings, and we might lose a lot of social abilities if we continue down this path. This, of course, is only the most dystopian reading of the future of smart speakers.
Google is putting in some effort to at least nudge our children to be more courteous with their smart assistants. You can flip a toggle in the Assistant's settings that requires you to say "please" and "thank you", and if you don't say the magic words, the Assistant won't do what you asked it to do.
Of course, this technology is still in its infancy. If we really want more social interaction with an assistant, it needs to be able to respond to emotion. It needs to understand that you're screaming at it, and it needs to understand why you might do that: for example, when it stubbornly refuses to do what you ask because you're not phrasing your command the right way. It might react with something like, "I'm sorry, I'll try to do better" or "Calm down, stop screaming, that's not going to help." Only then would you feel bad for screaming at your tech.
I would guess that a company selling smart speakers would rather opt for the first response, "I'm sorry, I'll try harder". They won't want to alienate their customers by patronizing them. Instead, they will take a more defensive stance and try to deliver the best possible experience without making the customer feel bad.
I think it's important to be courteous towards intelligent assistants. If we stop being nice to our assistants, it may only be a matter of time until human social interaction becomes rougher, and until society as a whole grows more brutalized and impatient in its social interactions.
Still, having conversations with machines is a good first step towards reflecting on our use of technology and how it impacts our social behavior. It also teaches us a lot about ourselves. Not everyone screams at Google when it doesn't understand them, and some people will start reflecting on how they treat their devices and their fellow human beings.