Apple needs to find a way to balance privacy with the needs of machine learning

Enrique Dans
Sep 9, 2015

Apple is looking to hire machine learning experts, says the Reuters news agency, adding that the company is having difficulty getting its devices to offer the same level of artificial intelligence as its competitors.

The nub of the matter is the company’s controversial privacy policies, which impose strict limitations on accessing information about the owners of its smartphones. These policies were originally conceived as a way of differentiating Apple from its main competitors by saying to customers: “We sell products, we don’t sell the information you keep on them to third parties.”

For example, Apple Maps data expires within 15 minutes of being generated and is not stored for additional processing. On this basis, it’s hard to imagine how Apple could recreate something like Google Now, which, having read your emails, is able to tell you which route to take to a meeting and what time you need to leave the house based on current traffic conditions.
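To make the contrast concrete, here is a minimal sketch, in Swift, of what such a time-limited store might look like. All the names here are hypothetical illustrations, not Apple’s actual code or API; the only detail taken from the reporting is the 15-minute retention window.

```swift
import Foundation

// Hypothetical sample type; not an Apple API.
struct LocationSample {
    let latitude: Double
    let longitude: Double
    let recordedAt: Date
}

// A store that purges anything older than its retention window,
// so expired data is never available to later processing stages.
final class EphemeralStore {
    private let retention: TimeInterval
    private var samples: [LocationSample] = []

    // 15 minutes by default, matching the figure cited for Apple Maps.
    init(retention: TimeInterval = 15 * 60) {
        self.retention = retention
    }

    func append(_ sample: LocationSample) {
        samples.append(sample)
        purge()
    }

    // Remove every sample older than the retention window.
    func purge(now: Date = Date()) {
        samples.removeAll { now.timeIntervalSince($0.recordedAt) > retention }
    }

    // The only view later stages ever see: already-purged samples.
    func currentSamples(now: Date = Date()) -> [LocationSample] {
        purge(now: now)
        return samples
    }
}
```

The design choice is the whole point: a Google Now–style assistant depends on exactly the kind of long-lived behavioral history that a store like this throws away by construction.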

So what is Apple to do? Privacy is not an absolute in itself, and being able to offer it to your smartphone customers isn’t necessarily something to make a song and dance about. Obviously, we want and expect privacy, and none of us likes the idea of our information being used without our knowledge or approval. Some companies have certainly abused their customers’ privacy, and governments, using the excuse of security, have set up vast spying networks that would have blown George Orwell’s mind.

But it’s also the case that many people, myself among them, think that the data our smartphones produce can be of use to us and that there is room for some flexibility here. If processing my information can make my life easier, and I am given guarantees that it won’t be misused, then I, like many others, would probably be in favor of it being used. The success of Google Now, even if the company seems not to have understood it, has led many users to overcome their initial surprise and discomfort, to see that this virtual assistant displays a level of intelligence on a par with human capabilities, and to begin using it regularly.

Apple faces a problem when it comes to finding machine learning specialists to work for it: the company may be a very attractive option, but if there isn’t any data to work with, there’s not much to be done. We’re talking about a very specific labor market here: one that is still relatively small, and very open about what each company is developing and who is leading which initiatives.

As with developers, these people are not attracted by money alone: they also want to know who they are going to be working with, what the long-term plan is, what they’re going to learn, and what the challenges will be. These are people who, in the main, work in this area because they are passionate about it; the last thing they want is to find themselves in a company that is going to put a brake on their professional development. It’s a challenge for Apple too: its privacy policy is a big part of the appeal of its products, but it could be hurting its competitiveness at a time when it is trying to keep up with companies that are much more permissive.

As usual, the answer seems to be to find a compromise. A compromise that, it has to be said, will be hard to reach through complicated terms and conditions about the use of this or that data, conditions that 99 percent of Apple’s customers never read. The goal here is to inspire trust, to offer transparency in the areas where the customer wants it, and to offer options that allow for risk-free data use.

Much remains to be done. This isn’t about simply saying: “We won’t use your data,” but instead explaining that, “We’ll use your data to try to offer you a better product, but we won’t abuse your trust.” That’s no easy task when the limits of the acceptable or unacceptable are based on so many delicate variables.

In my classes I have learned to recognize patterns that, while they conform to stereotypes that can always be wrong in individual cases, largely mirror collective realities: Germans and Central Europeans are much more concerned about protecting their privacy than the rest of the world, and trying to sell services based on data intelligence in the country that has filed the largest number of complaints about Google Street View is something of a challenge. In contrast, the Chinese, and most other Asians, are much more open, and will accept pretty much anything.

The Americans tend to be more pragmatic, taking an “everything’s fine as long as the outcome works” approach. The US market is a big variable in all this and, along with different companies’ policies, will have a huge impact on the development of the industry. We are going to be seeing machine learning in more and more places, and it will have a huge influence on businesses’ competitiveness and on the economy in general.

Apple’s problems in finding a way to develop machine learning show this all too clearly: this is not a black-and-white issue, and what really counts are the shades of gray. This is neither a free-for-all nor a zero-tolerance situation, and the pushing and pulling in search of a compromise will doubtless continue for many years to come.

(In Spanish, here)

Enrique Dans

Professor of Innovation at IE Business School and blogger (in English here and in Spanish at enriquedans.com)