Are your biometrics secure?

Devu M Shila
5 min read · Aug 18, 2019
While biometrics can offer good security, their uniqueness can raise several security and privacy concerns. It's important to learn the pros and cons of biometric solutions before you implement them.

Biometrics are, simply put, the unique physical features of a person. This includes features of your voice, your eyes, your fingers, your face, and so on.

There is growing interest in replacing passwords with biometrics for authenticating to end systems, driven by ever-growing cyber attacks against our legacy password-protected systems. Compared to traditional passwords or knowledge-based verification questions, biometric authentication brings security (your unique features are used for recognition and authentication), user friendliness (no passwords to memorize), and convenience (no typing of passwords or drawing of patterns). Existing biometric applications range from unlocking your smartphone (e.g., Apple FaceID) to securing enterprise systems and physical access, biometric screening at airports, and paying with your smile (Alibaba's Smile to Pay service).

In a typical authentication scenario, the features associated with your biometrics are stored in a backend system (e.g., Cloud) during the enrollment phase. These are then retrieved and matched with the presented biometrics in real time. On the other hand, in applications like facial expression recognition and health diagnostics, machine learning models, instead of the actual biometric features, are stored in the Cloud. In the latter case, machine learning models take biometrics features as input and output labels of interest; e.g., facial expression recognition algorithm that predicts expressions such as angry, sad or happy from images of faces.
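To make the enrollment-then-match flow concrete, here is a minimal sketch in Python. The feature vectors, the cosine-similarity matcher, and the threshold value are all illustrative assumptions, not any vendor's actual scheme; real systems use far richer templates and matchers.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Enrollment: the backend stores one feature template per user.
templates = {"alice": [0.91, 0.10, 0.33, 0.45]}

def authenticate(user, presented, threshold=0.95):
    """Match presented features against the stored template."""
    template = templates.get(user)
    if template is None:
        return False
    return cosine_similarity(template, presented) >= threshold

# A fresh capture is noisy, so matching is a similarity test, not equality.
print(authenticate("alice", [0.90, 0.12, 0.31, 0.46]))  # close capture -> True
print(authenticate("alice", [0.10, 0.95, 0.20, 0.05]))  # different person -> False
```

Note that the backend must hold `templates` (or a model derived from it) for matching to work at all, which is exactly where the risk discussed below comes from.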

In short, when you see a biometric system, it means that your biometric data, or a machine learning model capable of identifying you, is stored somewhere. The security and privacy of your "unique data" thus lie in the hands of the companies handling it.

We know that data breaches happen all the time, at massive scales. Recent examples include the breaches at Equifax, Marriott, and Uber, which leaked the sensitive personal information of millions of users, including social security numbers, driver's licenses, and more. The aftermath of these breaches is identity fraud: adversaries use the stolen information to create fake documents that defraud immigration, tax, banking, insurance, and healthcare providers. The uniqueness of biometrics means that if stored biometric identifiers are compromised, the exposed information cannot be changed or revoked the way passwords or PINs can. If the data is not stored securely, your identity can be permanently compromised!

Many of the companies we talked to believe that encryption can protect biometrics at rest and in transit. Encryption, however, is not a foolproof solution. Encryption is a cryptographic approach that transforms the underlying data using secrets, so that only persons or companies with knowledge of those secrets can see the information. The degree to which your biometrics are secure thus lies completely in the hands of the company, i.e., the employees and software handling your data and the secrets used to decrypt it. The real risk, in particular, lies in how they handle the sensitive biometric information once it is decrypted for processing. The recent Biostar2 building security platform vulnerability illustrates how pervasive this risk is in biometric systems; in that case, it enabled researchers to expose the fingerprints, facial recognition information, usernames, and passwords of one million users.
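The point that encryption only shifts trust onto whoever holds the key can be seen in a toy sketch. The XOR stream cipher below is a deliberately simplified stand-in (a real system would use a vetted scheme such as AES-GCM); the key and the "template" bytes are hypothetical.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the key (toy construction,
    not production cryptography)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR stream cipher: the same call both encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

secret_key = b"held-by-the-company"       # hypothetical key
template = b"fingerprint-feature-vector"  # hypothetical biometric template

ciphertext = xor_cipher(secret_key, template)
assert ciphertext != template                           # unreadable without the key
assert xor_cipher(secret_key, ciphertext) == template   # trivially readable with it
```

Everything hinges on `secret_key` and on what happens to `template` after decryption: anyone (insider, compromised server, leaked backup) who obtains the key, or catches the plaintext during processing, gets the biometric.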

You may propose that better securing the cloud, and storing machine learning models rather than raw biometric data, could prevent this attack. While those are good proposals, can they really secure our biometrics?

Beyond classic cyber attacks

In a system that implements all the latest and greatest in cloud and machine learning security: are your biometrics secure? The answer is no! I will describe a couple of attacks that can be launched against even very well secured biometric systems.

  • Spoofing with publicly available information: Our biometric information is now available everywhere: your photos and videos on social media, your fingerprints on devices, coffee cups, and so on. Information once deemed private is no longer private with the rise of social media. Attackers can take this publicly available information and, using technologies such as virtual reality and 3D printing, create spoofed biometrics. That is exactly what we witnessed in the recent face, iris, and fingerprint impersonation attacks [1]–[3], where researchers used publicly available information to create fake but valid biometrics that spoofed the end system. Particularly noteworthy are the fake German minister fingerprint attack [2] and the virtual-reality-based facial recognition attack [3]. Notably, these attacks apply even to the latest iOS and Android phones.
  • Fredrikson attack: More widely known as a model inversion attack, this attack lets an adversary reconstruct training samples given access to the machine learning model or an API, plus some auxiliary information such as confidence scores or output labels. This is of particular concern for sensitive applications like healthcare, where an attacker can try to reverse engineer private patient information such as genetic markers. In the biometric domain, an attacker can use this attack to reconstruct a person's face from a facial recognition API and a class label.
  • Adversarial learning attack: A machine learning algorithm takes a set of inputs and outputs a label with some confidence score. In an adversarial learning attack, the attacker perturbs the input data to fool the biometric system. This can be as simple as placing a yellow sticker on John's face and authenticating to the end system as Dan.
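The sticker trick above can be sketched on a toy linear classifier. The identities, weights, and feature vectors are made up for illustration; the perturbation step follows the sign of the gradient, in the spirit of fast-gradient-sign attacks, which for a linear model is simply the sign of the weight difference.

```python
# Toy linear classifier over a 4-dimensional "face feature" vector.
# The per-identity weights below are hypothetical.
weights = {
    "john": [0.9, 0.1, 0.4, 0.2],
    "dan":  [0.1, 0.8, 0.2, 0.6],
}

def classify(features):
    """Return the identity with the highest linear score."""
    scores = {name: sum(w * x for w, x in zip(ws, features))
              for name, ws in weights.items()}
    return max(scores, key=scores.get)

def sign(v):
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

def adversarial_perturb(features, target, source, eps=0.5):
    """Nudge each feature in the direction that raises the target's
    score relative to the source's (sign of the score gradient)."""
    w_t, w_s = weights[target], weights[source]
    return [x + eps * sign(wt - ws)
            for x, wt, ws in zip(features, w_t, w_s)]

johns_face = [1.0, 0.2, 0.5, 0.1]
print(classify(johns_face))                         # "john"
perturbed = adversarial_perturb(johns_face, "dan", "john")
print(classify(perturbed))                          # "dan"
```

A small, structured change to the input (the "sticker") flips the decision without the input looking obviously wrong, which is why defenses based only on input validity checks tend to fail.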

Whether it's the insecurity of the cloud or the vulnerability of existing machine learning models, biometrics and biometric authentication systems are not secure.

What about privacy?

The truth is that biometrics can pose severe privacy concerns: biometric identifiers such as face, gait, and others can be analyzed to infer people's beliefs and personal characteristics, including sexual orientation and health issues.

We all know that "there is no end to breaches". It's a cat-and-mouse game, and the main strategy is to build defenses that make an adversary's life difficult. Significant R&D is required to develop adversarially resistant, secure, and privacy-preserving user features and behaviors (biometrics and behavioral biometrics). This is the focus of Unknot.id.

What’s next?

In an upcoming post, we will talk about our approach to securing biometrics using machine learning and privacy-preserving cryptography techniques.

Follow me for the latest in AI, cyber security, and privacy. At Unknot.id Inc., we are on a mission to create a frictionless secure society, bringing world-class research advances in dynamic, context-aware, smartphone-derived human behavioral analytics to the financial, e-commerce, and Internet of Things (IoT) sectors to detect and prevent fraud in a multidimensional, frictionless, and privacy-aware manner.

Originally published at https://www.unknot.id on August 18, 2019.


Devu M Shila

Co-Founder and CEO of Unknot.id Inc., Entrepreneur on a mission to bring frictionless secure society using privacy-aware behavioral AI.