An institutional right to privacy
We all value our privacy. But what does “privacy” mean when we value it? It depends. Sometimes we value the privacy of our homes, where we want to be left alone, isolated from the outside world. Or we may not want our children’s school to post photos of our loved ones on public school websites. Or we would not want biometric data that our car holds on us to be shared with our insurance company.
So “privacy” is a complex and evolving concept. As digital technology reaches ever deeper into our daily lives and intimate spheres, we really want to bake privacy into the design and operation of that technology. The alternative is that privacy will retain little value. For example, there is no point in a right to be left alone if devices in our homes can record our movements and speech, and we can invoke no such right to control either the generation of that data or its sharing with other parties.
Privacy, in its broad sense, is a cultural value embedded in European society. And this is reflected in European regulation, such as the EU General Data Protection Regulation (GDPR). This regulation provides strong data-privacy guarantees to people living in the EU. For example, it requires a legal basis for third parties to hold and process our personal data, and we can ask for our personal data held by others to “be forgotten”.
The perception of data privacy is often one of data protection, which restricts access to private data. But data privacy is also about controlling who can process personal data and to what end. For example, your bank might retrieve and process personal data from its interaction history with you to support a decision on your loan application, if doing so is consistent with the data processing agreement that the bank has with you.
Why privacy is so critical in AI
Artificial intelligence (AI), and in particular its algorithmic form of machine learning, is rapidly gaining traction in replacing or crucially extending what we understand as data processing. Machine-learning algorithms can identify valuable patterns in data that humans could not possibly spot, and then create AI models out of such patterns. Increasingly, these AI models form the basis for decisions that can greatly impact our lives, for better or for worse.
So why is privacy so important in AI? One big reason is that the training of models requires a lot of data, often of a private and at times very intimate nature. How would you feel about your voice assistant recording all your conversations at home and uploading these recordings to a third party for the purpose of training an AI model that improves the voice assistant? Would you share your everyday conversations with what are effectively strangers to you, for the benefit of a better-performing voice assistant? We would not.
There is also the concern that very large IT and AI companies collect more and more of such data about us to get comprehensive and more accurate predictions about our behavior. Without wanting to get too Orwellian here, such detailed knowledge of our personal data can not only be used to predict how we might behave. It can also be used to determine behavioral nudges that these companies can offer to us, in subtle ways, such that these nudges gently push us to behave in a manner that is most beneficial to those companies and the services they offer. In short, this can turn AI prediction — a probabilistic guess — into predictable behavior that no longer seems to require guesswork.
This threat of a “behavioral loop” is real and challenges perceived values of autonomy and self-determination. We believe that people living in Europe should retain not just their sense of control over their personal data, but also genuinely have such control. Genuine control, in turn, gives us more autonomy as social, commercial, and communal actors.
So one might be tempted to declare AI the enemy of privacy. At the same time, AI seems to be a great enabler: it can offer more personalized mobility experiences, enable more effective medical treatments, and prevent accidents in our homes — to name a few examples.
Solving the problem with Privacy-Enhancing Technologies
So what is one to do? At XAIN, we are convinced that people can retain control over their personal data while also allowing the development of effective AI — with benefits to all. The key to resolving these apparently conflicting aims is to bake privacy into the AI itself, using privacy-enhancing technology (PET). This is a complex subject that combines cryptography with algorithms, information theory, and other research areas. And there has already been great progress in devising PET approaches that can be used for bringing privacy to AI.
One important part of this seems to be allowing personal data to stay within the control sphere of the person it belongs to. Under this restriction, XAIN uses its own PET approach to federated learning to create AI models that capture insights from the data of many persons. Importantly, XAIN can compute these models without actually being exposed to the personal data itself, using cryptography. And the computed models will no longer allow inferences to be made about personal data, while still being useful for prediction.
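To make the federated-learning principle concrete, here is a minimal sketch of federated averaging, the textbook aggregation scheme underlying many federated-learning systems. This is an illustration only: the function names and the toy linear model are our own assumptions, not XAIN’s actual protocol, and a real PET deployment would add secure aggregation so that no individual update is ever visible in the clear.

```python
# Sketch of federated averaging (FedAvg): each participant trains on data
# that never leaves their device; only model updates are combined centrally.
# All names and the toy model here are illustrative, not XAIN's actual API.

def local_update(w, data, lr=0.1):
    """One gradient-descent step on a participant's own data.
    Toy model: linear regression y ≈ w * x, trained locally."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_average(local_weights, sizes):
    """Aggregate local models, weighted by each participant's data size.
    In a PET setting, this step would use secure aggregation so the
    coordinator never sees any single participant's update."""
    total = sum(sizes)
    return sum(w * n for w, n in zip(local_weights, sizes)) / total

# Three participants, each holding private (x, y) samples from y = 2x.
datasets = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0)],
    [(1.5, 3.0), (0.5, 1.0), (2.5, 5.0)],
]

global_w = 0.0
for _ in range(50):  # communication rounds
    local_ws = [local_update(global_w, d) for d in datasets]
    global_w = federated_average(local_ws, [len(d) for d in datasets])

print(round(global_w, 2))  # converges toward 2.0
```

The raw samples stay with their owners throughout; only the scalar model weights are exchanged, which is what lets a shared model be trained without pooling personal data in one place.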
At XAIN, we firmly believe that the transfer of privacy-enhancing technology into AI products requires not just technical mastery of PET. We think it is equally important to understand the legal and regulatory aspects of data privacy and how privacy-enhancing technology can be integrated into AI to best meet regulatory demands. We equally advocate that privacy solutions for AI should be communicable to both consumers and businesses as being transparent, accountable, and trustworthy.
Such privacy solutions for AI are exactly what XAIN aspires to create. We are positive that XAIN and the European ecosystem in which we operate can make a contribution with global reach, by leading the development of secure and privacy-preserving AI such that control over data and models is not centralized in the hands of a few powerful players but distributed in a manner that reflects European structures and values.