Apple and differential privacy

Enrique Dans
Jun 26, 2016 · 3 min read

As I have already pointed out on a number of occasions, Apple’s strict privacy policies reflect its commitment to its customers’ privacy, but at the same time they have endangered its ability to invest in, and attract talent to, developing machine learning and artificial intelligence.

Apple’s privacy policies are without doubt challenging: senior managers attend company events telling anybody who will listen about the company’s principles, clearly trying to set it apart from players like Google or Facebook. But the lack of data about its customers’ preferences or behavior is becoming a problem for the company: the evidence suggests we want smartphones that learn from us, that respond to our preferences and that offer us options based on repetition; in short, intelligent assistants that can get to know us and adapt to our needs.

Clearly the company had to do something about this shortcoming, and the answer seems to have come in the form of iOS 10, which among other features includes the concept of differential privacy, a collection of techniques able to process users’ data without compromising their privacy. Specifically, the company promises:

“… to help discover the usage patterns of a large number of users without compromising individual privacy. In iOS 10, this technology will help improve QuickType and emoji suggestions, Spotlight® deep link suggestions and Lookup Hints in Notes.”
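To give a feel for how a large number of users’ patterns can be discovered without trusting any individual report, here is an illustrative sketch of randomized response, one of the classic building blocks behind local differential privacy of the kind Apple describes. This is not Apple’s actual implementation; the emoji-usage scenario and all names here are hypothetical.

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    """Report the true answer with probability p_truth; otherwise flip a coin.

    Any single report is deniable (it may just be noise), yet the noise is
    known, so aggregates over many users remain accurate.
    """
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.75) -> float:
    """Invert the known noise: E[reported rate] = p*true + (1-p)*0.5."""
    reported = sum(reports) / len(reports)
    return (reported - (1 - p_truth) * 0.5) / p_truth

# Hypothetical simulation: 100,000 users, 30% of whom truly use some emoji.
random.seed(42)
true_values = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(v) for v in true_values]
print(round(estimate_true_rate(reports), 3))  # close to 0.3
```

The key property is that the server only ever sees the noisy reports, yet because the randomization is public, it can correct for the noise in aggregate — exactly the trade-off Apple is promising for QuickType and emoji suggestions.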

In other words, the company will only use customers’ data with specific authorization, meaning that Apple’s obsession with privacy need not harm its ability to compete in the world of artificial intelligence.

The company further specifies that none of the advances in the new operating system would make sense if they came at the expense of user privacy, and that the use of differential privacy is intended to resolve precisely this tension.

Apple says that it still hasn’t begun collecting customer data and will only do so with prior consent; nor will it use photographs stored in the cloud to train its visual recognition algorithms, relying instead on other, as yet unspecified, sources. According to some studies of these kinds of techniques, differential privacy is a complex approach that can produce either wrong answers or useless protection. In other words, according to these critics, the idea of Apple offering its users the benefits of advanced analytics without sacrificing their privacy is nonsense.

The company, which will establish specific limits on how much of an individual customer’s information can be used, is clearly walking a tightrope here in pursuit of a competitive advantage: it wants to be able to generate huge amounts of data that in turn will feed machine learning. I would recommend reading this long, but well-written and highly explanatory article on Backchannel, “How Google is remaking itself as a ‘machine-learning first’ company”.
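The per-customer limits mentioned above correspond to what the differential-privacy literature calls a privacy budget: each query answered about a user spends some of a fixed allowance of epsilon, and queries must stop once it is exhausted. A minimal sketch, assuming the standard Laplace mechanism for a count query of sensitivity 1 (the budget values and function names are illustrative, not Apple’s):

```python
import random

def laplace_noise(scale: float) -> float:
    # Laplace(0, scale) sampled as the difference of two exponentials.
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def private_count(true_count: int, epsilon: float) -> float:
    """Laplace mechanism: a count query has sensitivity 1, so adding
    Laplace(1/epsilon) noise yields epsilon-differential privacy."""
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical per-user budget: stop answering once epsilon runs out.
budget, spent, per_query_eps = 1.0, 0.0, 0.25
answers = []
while spent + per_query_eps <= budget:
    spent += per_query_eps
    answers.append(private_count(100, per_query_eps))
print(len(answers))  # 4 queries fit in a budget of 1.0
```

Smaller epsilon means noisier answers but stronger protection, which is why capping total epsilon per customer directly caps how much of that customer’s information the company can ever extract.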

The only companies that will be competitive in the future will be those that are able to generate data about all their activities and that use it to feed their machine learning algorithms.

(In Spanish, here)

Enrique Dans

Professor of Innovation at IE Business School. On the effects of technology innovation on people, companies and society (writing in Spanish since 2003).
