Siri To Become Smarter!

d’wise one
Published in Chip-Monks · Feb 21, 2017

Context-wise, Siri is impressive, but it still falls far short of the give-and-take necessary for an actual conversation. Apple’s working to fix that.

Siri’s had several brain transplants!

It wasn’t done in a day or a week or over a few months. Almost since the day Apple introduced its voice assistant back in October 2011, Siri has undergone an almost continual series of brain transplants that shifted its silicon-powered mind from rules-based Artificial Intelligence to AI powered, in part, by machine learning.

Apple recently shared its perspective on artificial intelligence and where it fits in the Apple ecosystem, which is, apparently, everywhere.
Another question the team at Apple ponders is how AI can grow while respecting users’ privacy. In particular, though, the team focused on how the introduction of machine learning could transform its now five-year-old digital assistant.

Machine learning is considered a toolset within AI — it’s a way of building Siri’s ability to respond to conversational queries. Siri learns concepts by being fed endless numbers of examples. In other words, Siri will understand how you might ask a question about directions not by having stored every possible permutation of a mapping question, but by recognizing what a map question sounds like, based on all the other examples it’s been fed.
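To make that idea concrete, here’s a minimal sketch of example-driven intent classification. The training phrases, intent labels and use of scikit-learn are illustrative assumptions on my part, not Apple’s actual pipeline.

```python
# Learning what a "map question" sounds like from examples, rather than
# enumerating every phrasing. Hypothetical data; not Apple's pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A handful of labeled example queries (a real system would use millions).
examples = [
    ("how do I get to the airport", "maps"),
    ("directions to the nearest coffee shop", "maps"),
    ("navigate home and avoid tolls", "maps"),
    ("do I need an umbrella today", "weather"),
    ("will it rain tomorrow", "weather"),
    ("what's the forecast this weekend", "weather"),
]
texts, labels = zip(*examples)

# The model learns which words and phrases characterize each intent.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# A phrasing the model never saw should still register as a map question.
print(model.predict(["show me the way to the train station"]))
```

The point is the same one the article makes: the system generalizes from what questions sound like, instead of matching against a hand-written list.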

In Siri’s case, the core technology behind the assistant is 100% different from what consumers encountered on the iPhone 4S five years ago. It has gone from a rules-based system to one powered by machine learning and modern voice recognition.

Most users were oblivious to the changes, which might be considered a kind of victory, while others, Apple said, noted a distinct improvement in Siri’s ability to understand natural language.

AI History

Apple’s interest in artificial intelligence didn’t spring forth out of the ether in 2011. Almost 25 years ago, a relatively simple form of AI appeared on Apple’s Newton, the first PDA. That groundbreaking product ultimately failed, but it had its moment.

I remember when PC Magazine lauded the mobile device for its trainable handwriting recognition. Apple continued to work on AI-infused technologies for years, but the introduction of Siri in 2011 served as a sort of inflection point, quickly becoming the most visible part of Apple’s AI work. Even so, Siri is far from alone in Apple’s current AI strategy.

Earlier this month, Apple CEO Tim Cook told the Nikkei Asian Review that AI is “horizontal in nature, running across all products”. More importantly, it’s already being used by Apple “in ways that most people don’t even think about”.

Behind the scenes, Apple’s AI works to manage product battery life, using both individual usage patterns and what Apple has learned broadly about battery usage to manage power consumption at a component level. The facial recognition in Photos is also powered by AI. It’s even at work on the iPad Pro, ignoring errant touches from your hand while you use the Apple Pencil.

Sounds simple, but to do something like that, the system must understand the user’s intention, which can vary.

New Brain, Better Thoughts

When Apple started using machine learning, it saw a dramatic improvement in Siri’s speech recognition, especially with accents. Also vastly improved was Siri’s ability to understand speech in the presence of background noise.

Even so, Siri suffers from the same issue as other voice assistants: It can’t hold a conversation.

Yes, Apple spends a lot of time building personality (ask Siri if it’s an AI and it’ll respond, “Sorry, I’ve been advised not to discuss my existential existence”) and cultural intelligence into the AI, and Siri can fake it — to a point.

Ask Siri if you need an umbrella today and it’ll give you the weather forecast. If you immediately ask, “What about tomorrow?”, it’ll know you’re still talking about the weather and the possibility of rain, and give the right response.

Context-wise, it’s impressive, but Siri still falls far short of the give-and-take necessary for an actual conversation.
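That follow-up handling can be modeled as simple short-term context: remember the last intent, and reuse it when the next utterance leaves the topic implicit. Here’s a toy sketch; the class, rules and slot names are hypothetical, and certainly not how Siri is actually implemented.

```python
# Toy short-term conversational context: if a follow-up question omits the
# topic, fall back to the previous turn's intent. Hypothetical logic only.
from dataclasses import dataclass, field

@dataclass
class Context:
    last_intent: str | None = None
    slots: dict = field(default_factory=dict)

def handle(utterance: str, ctx: Context) -> str:
    text = utterance.lower()
    if "umbrella" in text or "rain" in text:
        ctx.last_intent = "weather"
        ctx.slots["day"] = "today"
    elif text.startswith("what about") and ctx.last_intent == "weather":
        # Elliptical follow-up: keep the weather intent, update the day slot.
        ctx.slots["day"] = "tomorrow" if "tomorrow" in text else ctx.slots["day"]
    else:
        return "Sorry, I didn't get that."
    return f"Here's the {ctx.slots['day']} forecast (intent: {ctx.last_intent})."

ctx = Context()
print(handle("Do I need an umbrella today?", ctx))  # weather, today
print(handle("What about tomorrow?", ctx))          # still weather, now tomorrow
```

Real systems replace these hand-written rules with learned models, but the shape of the problem, carrying intent across turns, is the same.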

However, it’s worth remembering that Apple introduced the term “voice assistant” to the digital lexicon (much like it did Personal Digital Assistant decades ago), and it takes that term seriously.

I can almost hear the mirth running around in your head, but I’m serious. There’s a lot that Apple’s doing for Siri and its ability to help you.

Future versions of Siri may do far more than just engage in time-burning chit-chat. A true assistant can be proactive. The current version will tell you, based on traffic conditions, when you need to leave to make an appointment. Eventually, Siri might start to connect the dots between, say, the state of your phone’s battery and how far you must travel, and tell you to charge up before you leave. Of course, Siri’s ability to grow may be somewhat limited by one of Apple’s core principles: user privacy.

User Privacy!

Google’s impressive intelligence and increasingly proactive nature are largely based upon its Knowledge Graph, what it knows about you (and billions of other people), and the relatively persistent user profile that travels with you from Chrome login to Chrome login. Apple, on the other hand, does nothing of the sort. In fact, Apple insists that its brand of AI doesn’t need to build a profile of you to work, and that it has no economic incentive to do so.

Apple can get away with ignoring your personal data because it’s not trying to deliver contextual advertising to you. Apple sells hardware, moving millions of iPhones, iPads and Macs each quarter alongside an exploding services business, while Google (recent hardware releases notwithstanding) primarily sells contextual advertising driven by user data.

While Google’s intelligence and AI-powered responses come from Google’s servers, Apple generates most of Siri’s intelligence locally. The company trains the AI in the cloud, where, Apple said, it’s getting 2 billion queries a week, and then delivers that intelligence to each Siri-hosting Apple device (these are the occasional brain transplants). Those devices then apply that intelligence to your locally stored data.

More interesting, though, is that Apple also does some machine learning on your iPhone. Apple believes it has the advantage here over competitors because it designs its own chips, and it contends that it’s significantly ahead of others in the mobile technology space.

Unlike Google and Amazon (parent of the voice assistant Alexa), Apple designs both the software and hardware — a strategy it believes gives it an advantage, including the ability to do neural processing at the silicon level on devices as small as the Apple Watch.
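The division of labor described above, train centrally and infer locally, can be sketched in a few lines. Everything below (the toy model, the least-squares stand-in for training, the delivery step) is an illustrative assumption, not Apple’s actual stack.

```python
# Sketch of the split described above: train centrally on aggregate queries,
# ship only the learned weights to the device, and run inference there on
# data that never leaves the device. Toy model; not Apple's actual stack.
import numpy as np

rng = np.random.default_rng(0)

# --- Cloud side: learn weights from aggregate (non-personal) training data.
X_cloud = rng.normal(size=(1000, 4))
y_cloud = (X_cloud @ np.array([1.0, -2.0, 0.5, 3.0]) > 0).astype(float)
weights, *_ = np.linalg.lstsq(X_cloud, y_cloud, rcond=None)  # stand-in for training

# --- The "brain transplant": only `weights` is delivered to the device.

# --- Device side: apply the shipped intelligence to locally stored data.
def on_device_predict(local_data: np.ndarray, w: np.ndarray) -> np.ndarray:
    # Inference runs here; local_data is never sent to a server.
    return (local_data @ w > 0.5).astype(int)

X_local = rng.normal(size=(5, 4))  # the user's private, on-device data
print(on_device_predict(X_local, weights))
```

The design choice this illustrates is the one the article credits to Apple: the heavy, data-hungry step stays in the cloud, while anything touching personal data stays on the device.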

Apple’s Approach To AI Is ‘Laudable’

“I think that there’s real-world proof about being able to go do distributed machine learning without every node in the cluster having access to all the data”, McClellan added, noting that it is quite possible to do consensus-based artificial intelligence with more anonymous data.
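McClellan’s point, learning across many nodes without any of them seeing all the data, is the idea behind approaches such as federated averaging. The sketch below is a generic illustration of that technique, not something the article (or Apple) confirms is in use.

```python
# Minimal federated-averaging sketch: each node trains on its own private
# shard, and only model weights (never raw data) are shared and averaged.
import numpy as np

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0, 0.5])

def local_step(w, X, y, lr=0.1):
    # One gradient step of linear regression on this node's private data.
    grad = 2 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

# Three nodes, each holding data the others never see.
shards = []
for _ in range(3):
    X = rng.normal(size=(50, 3))
    shards.append((X, X @ true_w + 0.01 * rng.normal(size=50)))

w_global = np.zeros(3)
for _ in range(100):
    # Each node improves the shared model on its own shard...
    local_ws = [local_step(w_global, X, y) for X, y in shards]
    # ...and only the weights travel back to be averaged (the "consensus").
    w_global = np.mean(local_ws, axis=0)

print(np.round(w_global, 2))  # converges toward [ 2. -1.  0.5]
```

No node ever uploads its data, yet the averaged model recovers the underlying pattern, which is exactly the “consensus-based AI with more anonymous data” McClellan describes.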

Even as McClellan gives Apple high marks for its approach to data, he wonders about Apple’s lack of participation in the newly formed Partnership on Artificial Intelligence, which counts IBM, Google, Facebook and Amazon among its members: “It feels like Apple should be more open, in general”.

How far Apple can go without being more open and joining other companies in their efforts to keep AI technologies from getting away from their masters, and how smart an AI can truly become without building customer profiles, are fair and open questions.

For now, at least, this is the path Apple has chosen for its brand of AI, and one thing is clear: the Siri you’re using now will undergo further brain transplants and be far different from the Siri you’ll use five years from now.

Originally published at Chip-Monks.
