Protecting Highly Sensitive Data (e.g. PII) with Homomorphic Encryption
When the data has to be accessed or processed outside your secure enterprise perimeter.
The National Institute of Standards and Technology (NIST) classifies data such as the following as personally identifiable information (PII): full name, face, home address, email address, national identification number (e.g., Social Security number in the U.S.), passport number, vehicle registration plate number, driver’s license number, fingerprints, credit card numbers, date of birth, genetic information, and telephone number.
In a legal context, PII is information that can be used, on its own or combined with other information, to identify, contact, or locate a single person. Many laws, such as the much-discussed GDPR, the California Consumer Privacy Act, and India’s proposed Personal Data Protection Bill, regulate how enterprises that collect PII may use it.
Lawfully collecting PII is vital for many companies. However, because PII is so valuable, a shady, dark world has also grown up around it. I have written a bit about it here.
Developing insights from highly sensitive data like PII means balancing two competing needs: (1) usability and (2) privacy. This is a trade-off that is often forced upon us.
What if you could get both Usability and Privacy Simultaneously?
Leveraging Homomorphic Encryption when the data has to be accessed or processed outside your secure enterprise perimeter.
The current approach to data processing (e.g., training, testing, and inference in machine learning and neural networks) exposes all the underlying data in plain text. These compute workloads are also mostly outsourced to third-party vendors. It is not an exaggeration to think that a malicious actor could alter the network graph or tamper with the weights. Take a look at the examples below; this threat is only growing.
Emerging technologies like Homomorphic Encryption marry machine learning and cryptography to protect highly sensitive data: training, testing, and inference can be performed directly on encrypted data, without decrypting it first.
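To make the idea concrete, here is a minimal sketch of the Paillier cryptosystem, a classic additively homomorphic scheme (it is not the Ziroh Labs construction, and the tiny primes are illustrative only, not secure). The point is the last two lines: a third party can add two values it can never read.

```python
import random
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

# -- Key generation. Toy-sized primes for readability; real Paillier
# -- deployments use primes of roughly 1024 bits or more.
p, q = 293, 433
n = p * q
n_sq = n * n
lam = lcm(p - 1, q - 1)
mu = pow(lam, -1, n)            # valid because we use g = n + 1 below

def encrypt(m):
    """Encrypt m (0 <= m < n) under the public key (n, g = n + 1)."""
    r = random.randrange(1, n)  # fresh randomness hides equal plaintexts
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    """Recover m with the private key (lam, mu)."""
    return (pow(c, lam, n_sq) - 1) // n * mu % n

# The homomorphic property: multiplying ciphertexts adds the plaintexts,
# so a server can compute on data it cannot see.
a, b = 20, 22
c_sum = (encrypt(a) * encrypt(b)) % n_sq
assert decrypt(c_sum) == a + b  # 42, obtained without decrypting a or b
```

Paillier is only *partially* homomorphic: it supports unlimited additions but cannot multiply two ciphertexts. Fully homomorphic schemes support both, which is what makes encrypted machine-learning workloads possible.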
Earlier this year, SAP announced SAP’s Guiding Principles on Artificial Intelligence (AI), stating: “One example of how SAP lives by these principles itself is Homomorphic Encryption.”
Similarly, Microsoft released its Microsoft AI principles, which name Fairness, Inclusiveness, Reliability & Safety, Transparency, Privacy & Security, and Accountability as guiding principles. Satya Nadella’s talk at Microsoft Build also touched upon how Homomorphic Encryption would allow enterprises to analyse encrypted data without ever needing to see the actual data in plain text.
Here is a preview of the Highly Efficient Homomorphic Encryption Library from Ziroh Labs. It is purpose-built to operate on string data types at very high speed with acceptable levels of security. Our Homomorphic Encryption library for numeric data types will be released shortly as well.
Ziroh Labs is a deep tech start-up born in a research lab at the Indian Institute of Science, Bangalore, India. We develop Privacy Preserving Technologies with a focus on Fully Homomorphic Encryption.
The key differentiator of the Ziroh Labs Homomorphic Encryption primitive is the algebraic structure it uses. Our techniques are near bootstrap-free: for a large number of operations, we do not have to run costly bootstrapping procedures to keep the ciphertext size constant or manageable.
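Why bootstrapping matters can be seen in a toy version of the DGHV integer scheme (van Dijk et al., 2010), which is a standard textbook example and not the Ziroh Labs construction; the parameters here are deliberately tiny and insecure. Each homomorphic multiplication compounds the ciphertext noise, and once the noise outgrows the key, decryption silently fails. Bootstrapping is the conventional, expensive way to refresh a ciphertext before that point.

```python
import random

# Toy DGHV-style symmetric scheme: a ciphertext is bit + 2*r + p*q,
# where p is the secret key and 2*r is a small noise term.
p = 10007                       # odd secret key; real schemes use huge keys

def encrypt(bit):
    r = random.randrange(1, 5)          # small noise term
    q = random.randrange(1, 1000)
    return bit + 2 * r + p * q

def decrypt(c):
    centered = c % p
    if centered > p // 2:               # map into the interval (-p/2, p/2]
        centered -= p
    return centered % 2                 # correct only while noise < p/2

# Additions (XOR) and a few multiplications (AND) decrypt correctly...
assert decrypt(encrypt(1) + encrypt(0)) == 1
assert decrypt(encrypt(1) * encrypt(1)) == 1

# ...but every ciphertext multiplication roughly multiplies the noise.
# Once the noise exceeds p/2, decryption returns garbage; bootstrapping
# "refreshes" the ciphertext before that happens.
c = encrypt(1)
for depth in range(1, 30):
    c = c * encrypt(1)
    if decrypt(c) != 1:
        print(f"noise overflowed after {depth} multiplications")
        break
```

A scheme that avoids most bootstrapping therefore avoids the dominant cost in fully homomorphic workloads, which is the advantage the paragraph above claims.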
You are very welcome to reach out to me at bhaskar [at] ziroh [dot] com for additional information or for access to a trial license of the Ziroh Labs Homomorphic Encryption library.