The biggest challenge facing digital assistants is our own inhibition (and more)
While watching the Tour de France last night, the commentator helpfully confirmed that the riders were travelling at around 56 KPH on the final stretch of the stage.
Being British and embarrassingly hopeless with foreign units of measurement, I thought I’d ask my Amazon Echo Dot (otherwise known as ‘Alexa’) what that might be in miles-per-hour.
“I’m sorry,” she replied, “I’m afraid I don’t know that one.”
Having pointed out that it was a genuine question rather than a poorly timed joke, I asked again, rephrasing it slightly.
“Hmm,” my digital assistant replied. “I’m sorry — I can’t help with that.”
I gave up.
Artificial intelligence (AI) and the much-touted business of ‘machine learning’ clearly have some way to go, but despite my continual frustration with episodes like the one described above, I don’t think technology is the biggest challenge facing digital assistants.
It’s something far closer to home, and is one of many reasons I think digital assistants have anything but an assured future:
Talking to robots is silly
There’s no getting away from it — talking to your smartphone or a little circular device in your living room feels a bit silly.
We’ve all witnessed it in countless sci-fi films and TV shows, but in that realm it makes perfect sense. At home, or while travelling on public transport, talking to inanimate objects is just plain daft.
If you’re alone, it’s fine, admittedly, but place just one additional human being into the mix and most of us will feel our inhibitions slowly creeping in.
Data scientists know this, and it’s their job to create platforms that free us of our inhibitions when we interact with this kind of technology. That might explain why the average salary for such people hovers at around $106,000, but even the most talented software engineers will admit that the job is a colossal one.
We’re fond of idioms
I can almost guarantee that you utter at least one idiom every single day. I know I do.
You’ll do so without thinking, and that means phrases such as “let the cat out of the bag” might creep into your ‘conversations’ with digital assistants. Their ability to understand these weird turns of phrase requires a boatload of innovative programming, and I’m not convinced we’re anywhere near being able to produce it yet.
If you think that’s unfair on the many talented engineers out there (against whom I feel eternally inferior), try speaking to Siri or Alexa as you would a mate. You’ll hit pockets of brilliant programming, but more often than not find yourself banging your head against a digital brick wall.
If Apple hasn’t cracked it yet…
I started this post detailing my frustrations with Amazon’s personal assistant, but they’re not alone. Far from it, in fact.
The world’s richest company also seems incapable of producing a digital assistant that understands simple questioning.
And, by ‘simple’, I don’t mean “start a timer for 5 minutes” — I’m talking about everyday conversations. If you have access to Siri, think about the number of times it’s shrugged and pointed you to Google. For me, it happens incessantly.
That raises the question: how big does one’s budget need to be to make this stuff work?
What about industry?
We’re led to believe that there are several sectors where artificial intelligence is likely to make a significant impact. And, while a Terminator-style takeover is exciting and frightening in equal measure, there are plenty of industries in which the level of creativity required to do a great job can surely only be undertaken by a human.
Would you trust Alexa with the task of designing the logo for your new business? I wouldn’t.
Oh, and if you were wondering, the answer should have been 34.7968 MPH.
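For the curious, the conversion Alexa couldn’t manage is a single line of arithmetic. Here’s a quick sketch in Python, using the standard definition of 1.609344 kilometres to the international mile (the function name is just my own choice):

```python
KM_PER_MILE = 1.609344  # international mile, exact by definition

def kph_to_mph(kph: float) -> float:
    """Convert a speed in kilometres per hour to miles per hour."""
    return kph / KM_PER_MILE

# The commentator's 56 KPH, to four decimal places
print(round(kph_to_mph(56), 4))  # 34.7968
```

Four decimal places is overkill for cycling commentary, of course, but it matches the figure above.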