Siri on the Mac? Let’s make it work first.
Siri is the perfect example of the great new frontier of technology. It’s so exciting to think about what it might someday become. But today… it just kinda sucks.
Tonight I got in my car to go to a ribbon-cutting event. Just moments earlier, my iPhone had alerted me that this event had just started. (Yes, I was about 10 minutes late.) Since I knew the general location of where I was going, I hopped behind the wheel.
Now first, let me point out that this is awesome. I spent no brain time contemplating if I knew where I was going. I completely trust that my tech will get me there — so much so that I don’t even stop to think. But. For some reason, I continue to give Siri the benefit of the doubt as my first choice for finding the information I seek.
As I rolled out of the parking lot, I called her from her hiding place. I held down the home button and waited for her attentive ‘ding’. No dice. Of course, this is expected, and I have hacked this issue into submission. I looked down at my phone and tapped the on-screen microphone. There she was.
Siri: ‘ding’
Me: Hey! (In my car I need to say some meaningless word first or else Siri won’t understand me. Often I say “Yo!” or “Shazam!” or “Opah!”.) I need directions to this event I’m going to tonight.
Siri: Sorry, I don’t understand ‘this event i’m going to tonight’.
Me: Hey, I need directions to my calendar event.
Siri: Sorry, I’m not sure what you said there.
Me: Hey, I need directions to the event happening right now on my calendar.
Siri: I found 4 places matching ‘event planning’ near you.
And I’m out.
Oddly enough, my concern with Siri is not her frustrating inability to find my calendar event, which definitely had a location attached to it. No, my deeper concern is the brain hacking I have to do to make Siri function at all.
Let’s start with Bluetooth. So cool, but so woefully imperfect. (I know this is not Siri’s “fault”.) I have had 3 cars with Bluetooth integrations, and the successful use of Siri in them has been all over the map. None of them “just work”. I have to know the tricks.
Invoking Siri. Lately, on first press, Siri just starts and then shuts down before playing a sound or even listening to me. I have to press the button twice, and my highest rate of success comes from tapping the on-screen mic icon.
I know, I know… I can do a hard reset… or a software restore… or something like that. But I keep my phone very clean. I have one page of apps, and not a lot of junk. If Siri struggles to work reliably on my phone… what is it doing on my mother’s phone? Resets and restores are definitely not a good answer in my book.
The promise
I hold out hope. I believe that technology will someday melt away. And I’m watching with anticipation. I bought the Apple Watch for exactly this reason. I don’t want my face in my phone all day. I want the things I need to know to require very little of me. I love the idea of the tap on the wrist, even though it is nowhere close to living up to its promise right now (topic for another day).
The promise of Siri is a digital assistant who can “understand” you. What that really means is deep and powerful. It means I should be able to express myself in the way that I actually express myself, and you should be able to express yourself in the way you do, and my mom should be able to also — and Siri should understand all of us.
So many of the things I do on my phone I do inside of apps. But Siri can’t help me there. Why? I ask... Why? As a developer, I understand that launching an SDK for Siri will be extremely complex, but it simply has to happen. I usually tell myself that it’s because, in order for it to work, we will probably have to use things like keywords assigned to apps so that Siri can process our intentions. But at the beating heart of Siri is the idea that we shouldn’t have to out-think her. We should be able to just talk to her.
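To make that concrete, here is a rough sketch of what the app side of a keyword-based Siri SDK could look like. Every type and name below is made up purely to illustrate the idea; Apple ships nothing like this today.

```swift
import Foundation

// A purely hypothetical sketch of an app-facing Siri SDK.
// None of these types exist in Apple's SDKs; they only illustrate
// the "keywords assigned to apps" idea described above.

struct SpokenRequest {
    let utterance: String   // the raw transcription Siri heard
    let keywords: [String]  // the app's registered keywords found in the utterance
}

enum HandlingResult {
    case handled(response: String)  // Siri speaks or shows this and is done
    case needsApp                   // Siri opens the app to finish the task
    case notUnderstood              // Siri keeps looking elsewhere
}

protocol SiriRequestHandling {
    /// Keywords this app claims, e.g. ["directions", "event", "calendar"].
    var registeredKeywords: [String] { get }

    /// Called when Siri decides this app is the best match for an utterance.
    func handle(_ request: SpokenRequest) -> HandlingResult
}

// What a calendar app's handler might do with
// "directions to the event happening right now on my calendar".
struct CalendarDirectionsHandler: SiriRequestHandling {
    var registeredKeywords: [String] { ["directions", "event", "calendar"] }

    func handle(_ request: SpokenRequest) -> HandlingResult {
        guard request.keywords.contains("directions") else { return .notUnderstood }
        // In a real app: look up the current event's location and hand it to Maps.
        return .handled(response: "Starting directions to your ribbon-cutting event.")
    }
}

let result = CalendarDirectionsHandler().handle(SpokenRequest(
    utterance: "I need directions to the event happening right now on my calendar",
    keywords: ["directions", "event", "calendar"]
))

if case let .handled(response) = result {
    print(response)  // "Starting directions to your ribbon-cutting event."
}
```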
But what we have now is a pale facade. In order to effectively use Siri, we are required to adapt. I find myself saying things to Siri in speech patterns I do not use in real life.
So which way do we go? More artificial, where we have to learn the Siri language to use her effectively? Or more natural, where we put the burden on developers to create tons of metadata for apps so that Siri can process and route natural-language requests across the incalculable number of phone configurations that exist?
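Here is an equally hypothetical sketch of that routing burden from Siri’s side: matching an utterance against whatever vocabulary each installed app has registered. Again, everything here is invented for illustration and is not how Siri actually works.

```swift
import Foundation

// Hypothetical sketch of the routing problem: given an utterance and whatever
// metadata each installed app registered, pick the likeliest handler.
// All names are invented for illustration.

struct AppMetadata {
    let bundleID: String
    let vocabulary: Set<String>  // developer-supplied keywords
}

func route(utterance: String, installedApps: [AppMetadata]) -> AppMetadata? {
    // Crude tokenization; a real system needs genuine natural-language parsing.
    let words = Set(
        utterance.lowercased()
            .components(separatedBy: CharacterSet.alphanumerics.inverted)
            .filter { !$0.isEmpty }
    )

    // Score each app by how many of its keywords appear in the utterance.
    let scored = installedApps
        .map { app in (app, words.intersection(app.vocabulary).count) }
        .filter { $0.1 > 0 }

    return scored.max(by: { $0.1 < $1.1 })?.0
}

// Every phone carries a different set of apps and vocabularies, which is the
// "incalculable number of configurations" problem in miniature.
let apps = [
    AppMetadata(bundleID: "com.example.calendar", vocabulary: ["event", "calendar", "meeting"]),
    AppMetadata(bundleID: "com.example.maps", vocabulary: ["directions", "route", "drive"])
]

let winner = route(
    utterance: "I need directions to the event happening right now on my calendar",
    installedApps: apps
)

// Prints "com.example.calendar", but only by one keyword. A near-miss in this
// kind of matching is how a request turns into a search for "event planning".
print(winner?.bundleID ?? "no match")
```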
I do not know. But I do know that Siri is mostly useless to me right now, so inviting her onto my Mac just feels like more of a burden than a help. Now when I accidentally say something that sounds like “Hey Siri”, I’ll get four devices playing me songs I never requested? Can’t wait.