An Exclusive Look at How AI and Machine Learning Work at Apple
Steven Levy


Apple's devices are getting smarter at a quicker rate, especially with the company's own A-series chips, and the back ends are getting smarter and faster too, as everything Apple does finds some reason to be connected. Today, AI techniques are all the rage, and Apple bristles at the implication that its machine learning is comparatively shallow. "We started working on it years ago, and have done really interesting work that is practical at scale," explains Federighi. For years the skeptics' standing reply to such claims was "Show me," and Apple's executives wouldn't confirm or deny much of anything. Some details are now on the record: Siri began using machine learning to understand user intent in November 2014, and released a version with deeper learning a year later. The harder question is privacy: how can you do that without collecting the personal information of users? Part of Apple's answer is encryption; the company encrypts user information so that no one, not even Apple's lawyers, can read it (nor can the FBI, even with a warrant).

From left, Senior Vice President of Software Engineering Craig Federighi listens to Siri Senior Director Alex Acero discuss the voice recognition software at Apple headquarters.

"In Google, Facebook, Microsoft you have the top people in machine learning," says Oren Etzioni of the Allen Institute for AI, and Apple's quieter approach has shaped its image: "You can argue whether it's the right thing to do or not, but it's given Apple a reputation for not being real hardcore AI folks."

The dynamic compounds: better models make better products, better products attract more users, and more users generate more of the data that trains the models. "So it has an increasing-returns effect."

When Acero arrived three years ago, Apple was still licensing much of its speech technology for Siri from a third party, a situation due for a change. The reasoning was simple: to make it great, Apple wanted to own and innovate internally. Apple now says that without those advances in Siri, it's unlikely it would have produced the current iteration of the Apple TV, distinguished by its sophisticated voice control. "You wouldn't be able to offer that prior to this technology," says Federighi.

AI isn't new to Apple: as early as the 1990s it was using some machine learning techniques in its handwriting recognition products. That was the first revolution. In Apple's view, machine learning isn't the final frontier, despite what other companies say. "It's not like there weren't other technologies over the years that have been instrumental in changing the way we interact with devices," says Cue.

As it had with speech recognition, machine learning improved Siri's experience, especially in interpreting commands more flexibly. Part of what makes that work lives on the phone itself. "It's a compact, but quite thorough knowledge base, with hundreds of thousands of locations and entities. We localize it because we know where you are," says Federighi.

If you love your Pencil, thank machine learning. In order for Apple to include its version of a high-tech stylus, it had to deal with the fact that when people wrote on the device, the bottom of their hand would invariably brush the touch screen, causing all sorts of digital havoc. Using a machine learning model for "palm rejection" enabled the screen sensor to detect the difference between a swipe, a touch, and a pencil input with a very high degree of accuracy. In a sense, the machine learning mindset seems at odds with the Apple ethos: "If this doesn't work rock solid, this is not a good piece of paper for me to write on anymore, and Pencil is not a good product," says Federighi.

Privacy constrains how all of this is built. "We don't want that information stored in Apple servers," says Federighi. By using a neural network-trained system that watches while you type, Apple can detect key events and items like flight information, contacts, and appointments, but the information itself stays on your phone (a sketch of this kind of on-device matching appears below). App suggestions work the same way; ideally, they are exactly the apps you intended to open next.

That restraint raises an obvious problem: how does Apple learn anything from a billion devices? He then describes a system that involves virtual coin-tossing and cryptographic protocols that I barely could follow, and I wrote a book about cryptography. Basically it's about adding mathematical noise to certain pieces of data so that Apple can detect usage patterns without identifying individual users. "We're taking it from research to a billion users," says Cue.
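The "virtual coin-tossing" Federighi mentions sounds like randomized response, the textbook mechanism behind local differential privacy. Here is a minimal sketch in Swift of that two-coin scheme, offered as an illustration of the idea rather than of whatever Apple actually ships:

```swift
import Foundation

// Randomized response: each device perturbs its own answer before the
// answer ever leaves the phone, so the collector can estimate population
// statistics while any individual report stays deniable.
func randomizedResponse(truth: Bool) -> Bool {
    // First coin: on heads, report the truth.
    if Bool.random() { return truth }
    // On tails: report the outcome of a second, independent coin.
    return Bool.random()
}

// Under this scheme, E[reported "yes"] = 0.25 + p/2, where p is the
// true rate, so the collector can invert the noise in aggregate.
func estimatedTrueRate(of reports: [Bool]) -> Double {
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    return 2 * (observed - 0.25)
}

// Simulate 100,000 users, 30% of whom truly have the sensitive attribute.
let reports = (0..<100_000).map { _ in
    randomizedResponse(truth: Double.random(in: 0..<1) < 0.3)
}
print(estimatedTrueRate(of: reports))   // prints roughly 0.3
```

Any single report is deniable, since it may just be a coin flip, yet over 100,000 simulated users the aggregate estimate lands close to the true 30 percent rate.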

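The on-device spotting of dates, phone numbers, and flight details described above has a long-standing public cousin in Foundation's NSDataDetector, which matches entirely locally. A small example (the sample message is invented, and the article doesn't say whether Siri's detectors share this code path):

```swift
import Foundation

let message = "My flight UA 2117 lands at 6:40 PM on August 24; call me at 408-555-0123."

// Dates, phone numbers, and airline flight numbers, all matched on-device.
let types: NSTextCheckingResult.CheckingType = [.date, .phoneNumber, .transitInformation]
let detector = try! NSDataDetector(types: types.rawValue)   // force-try is fine for a fixed, valid type mask

let range = NSRange(message.startIndex..., in: message)
for match in detector.matches(in: message, options: [], range: range) {
    switch match.resultType {
    case .date:
        print("date:", match.date ?? "?")
    case .phoneNumber:
        print("phone:", match.phoneNumber ?? "?")
    case .transitInformation:
        // components carries keys such as .airline and .flight when matched
        print("flight:", match.components ?? [:])
    default:
        break
    }
}
```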
Yet as the briefing unfolds, it becomes clear how much AI has already shaped the overall experience of using the Apple ecosystem. No matter where the talent comes from, Apple's AI infrastructure allows it to develop products and features that would not have been possible by earlier means.

"We keep some of the most sensitive things where the ML is occurring entirely local to the device," Federighi says. Other companies might have to analyze the whole conversation in the cloud to identify those terms, he says, but an Apple device can detect them without the data ever leaving the user's possession, because the system is constantly looking for matches against a knowledge base kept on the phone. The benefits show up in existing features "and on new things we haven't been able to do."

Siri was the stress test for all of this. "Steve said you're going to go overnight from a pilot, an app, to a hundred million users without a beta program," one veteran of the project recalls. "All of a sudden you're going to have users." As far as the core product is concerned, Cue cites four components: speech recognition (to understand when you talk to it), natural language understanding (to grasp what you're saying), execution (to fulfill a query or request), and response (to talk back to you).
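Cue's four components suggest a staged pipeline in which each stage feeds the next. The sketch below is purely illustrative, with invented protocol and type names; it captures the division of labor he describes, not Apple's internal design:

```swift
import Foundation

// A purely illustrative sketch of the four-stage division of labor Cue
// describes. Every name here is invented for the example.

struct Intent {
    let action: String
    let parameters: [String: String]
}

struct Outcome {
    let summary: String
}

protocol SpeechRecognizer {        // 1. speech recognition
    func transcribe(_ audio: Data) -> String
}

protocol LanguageUnderstander {    // 2. natural language understanding
    func parse(_ utterance: String) -> Intent
}

protocol IntentExecutor {          // 3. execution
    func fulfill(_ intent: Intent) -> Outcome
}

protocol Responder {               // 4. response
    func reply(to outcome: Outcome) -> String
}

struct AssistantPipeline {
    let recognizer: SpeechRecognizer
    let understander: LanguageUnderstander
    let executor: IntentExecutor
    let responder: Responder

    // Audio in, spoken reply out: each stage feeds the next.
    func handle(_ audio: Data) -> String {
        let utterance = recognizer.transcribe(audio)
        let intent = understander.parse(utterance)
        let outcome = executor.fulfill(intent)
        return responder.reply(to: outcome)
    }
}
```

One thing such a decomposition buys is isolation: the speech recognition stage could move from licensed technology to an in-house model, as it did when Acero's team took over Siri's speech stack, without the other three stages changing.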