Wake up and Keep Up: A Deep Dive into Biometric Data Protection

Mica
WRIT340EconFall2022
Dec 6, 2022

A Government Lacking in Data Governance

The rapid evolution of biometric technology and its data is a call to action for every policymaker working in data protection and governance as it relates to personal security and privacy. Current policy leaves biometric data severely underprotected, which is especially dangerous because, unlike a credit card number, this data cannot be changed once leaked or misused. To remedy this gap, the definition of biometric data must be kept up to date so it does not lag behind the quick evolution of the technology and its applications. Alongside an accurate definition, we must raise the general standard of data protection and security for all organizations and companies involved, and implement that standard at the federal level so that all citizens, regardless of their state of residence, are protected and companies are penalized when they fail to safeguard this sensitive information.

Why Biometric Data Protection Deserves Attention

In a world where people love automation and simplicity, it is no surprise that they are trading in passwords for fingerprints, facial recognition, and other biometric data. After all, why struggle to remember a password when you can use a body part that is always there? This trend has ushered in a new conversation about data collection, and about how biometric data specifically is being stored, processed, and protected. Because this data, unlike a credit card number, can never be exchanged, privacy and security are of utmost importance, not only to protect people from fraudulent activity but to ensure that governments and companies alike are not using it to target marginalized groups and enable discrimination. The current laws and data processing norms are outdated and do the bare minimum to protect users from data leaks, perhaps because consumers are insufficiently aware of or concerned about how their data is used; more alarmingly, they do little to address the misuses of data that allow marginalization to occur. As the definition of biometric data expands with the inevitable evolution of automation and simplicity, so must the protection and vigilance around this sensitive information.

The Flaws in Existing Biometric Protection Laws

Let us first consider the federal definition of biometric information and its faults. The current federal bill, S.4400, the National Biometric Information Privacy Act of 2020, does not even define biometrics within the act. This is problematic because any company accused of abusing or failing to protect biometric information can argue that it did not know the data required such protection, given that the act provides no definition. Newer, somewhat related bills do better: H.R.6733, the Ban IRS Biometrics Act, defines biometric information as “any information regarding any measurable physical characteristic or personal behavioral trait used to recognize the identity, or verify the claimed identity or location, of an individual, including facial images, fingerprints, and iris scans” (“H.R.6733–117th Congress (2021–2022): Ban IRS Biometrics Act”). With this definition one can begin to mount a defense, but it risks becoming outdated within a few years as biometric technology finds new uses. In a research paper outlining how ancillary information can be extracted from biometric data, Dantcheva et al. define biometric traits as “face, fingerprints, hand geometry, iris,” and Natgunanathan et al., who also wrote about the protection of privacy in biometric data, factor in a person’s voice, whose unique vocal-cord and sound patterns can be used for verification, as well as their keystrokes. The congressional bills fail to include these traits and thus open the door to misappropriation of personal information, with no clear expectations for protection and no ramifications for mishandling.

Of course, there are distinct differences between using your fingerprint or face to unlock a game and using it to access your health care information or log into your bank account. The stakes vary across use cases, and each company, as a result, applies a different level of security when storing this biometric data. As it stands, the bill before Congress does very little to outline steps for protecting and storing data. Given ever-evolving technology and new uses of biometric information, this is disappointing but not surprising. The bill begins by stating that all entities who obtain biometric data must “take specified actions to maintain and ensure the privacy and security” of that data; however, it then specifies only that these actions must take the form of a written policy outlining a retention schedule (how long the entity may keep the data) and “guidelines for destroying such data.” Beyond those two requirements, no further steps are prescribed. As for the destruction guidelines, companies must consider two triggers for when data must be destroyed: the “date on which the initial purpose for collecting the data has been satisfied” or one year after the user last interacts with the entity. By allowing data to be kept until the “initial purpose” has been satisfied, the bill leaves the door open for companies to argue that the purpose simply has not yet been fulfilled. In addition, the written policy need not explicitly state how entities protect or dispose of data, leaving both up to each company’s interpretation. Finally, the destruction guidelines do not state whether a soft delete or a hard delete is required; in this industry there is a clear difference between the two, and which one is mandated can make a drastic difference in the protection of biometric data, as the sketch below illustrates.
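To make that distinction concrete, here is a minimal Python sketch of the two deletion styles. The record structure and names are hypothetical, invented for illustration, and are not drawn from any statute or real system.

```python
from datetime import datetime, timezone

# A toy in-memory store of biometric records keyed by user ID
# (hypothetical structure, for illustration only).
records = {
    "user-42": {"template": b"\x01\x02\x03", "deleted_at": None},
}

def soft_delete(user_id: str) -> None:
    """Mark the record as deleted but leave the biometric bytes in place.
    The data still exists and can be restored, leaked, or repurposed."""
    records[user_id]["deleted_at"] = datetime.now(timezone.utc)

def hard_delete(user_id: str) -> None:
    """Remove the record entirely, so the biometric is unrecoverable from
    this store (backups and replicas would also need to be purged)."""
    records.pop(user_id, None)
```

Under a soft delete, a “destroyed” fingerprint is still sitting in the database; only a hard delete actually removes it, which is why the bill’s silence on this point matters.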

The bill continues by stating that an entity cannot obtain this information in the first place unless it is needed to run the business or provide a service, and that if it does so it must inform the user in writing and obtain a written release. Immediately, any company inclined to market itself with the words “secure” or “protect your privacy” may claim that collecting biometric data is necessary to control who has access to its service and to ensure its users’ privacy. Moreover, informing the user in writing and obtaining a written release translates in practice to the infamous terms and conditions, which users scroll past before checking the little box. Entities technically fulfill their obligations under the bill, but they do so in a way they know most users will ignore. The bill next states that the data cannot be sold, leased, or otherwise profited from, but nothing safeguards the user against the entity itself using the information to target and marginalize specific audiences. Only after all this does the bill address how an entity is to protect this sensitive information, saying that it “must store, transmit, and protect” the data in a way that is “the same as, or more protective than” how it treats its “other confidential and sensitive information.” In other words, the minimum level of security not only varies drastically from company to company but is left wide open to manipulation as entities change their own confidentiality policies. The bill concludes with the user’s right to access this information and right of action in the event of “a violation of the bill’s provisions.” So if an entity sells the data, fails to provide a written policy, or fails to inform the user of collection, the individual can fight back; however, no sanctions apply when a company suffers a data leak because it did not adequately secure the data. Because there is no written minimum standard of security, the user is left vulnerable and unable to hold entities accountable for failing to protect their privacy (“S.4400–116th Congress (2019–2020): National Biometric Information Privacy Act of 2020”).

Only seven states have acknowledged that the federal biometric bill is lacking and have implemented further legislation to remedy its weaknesses. Figure 1.1 depicts Bloomberg Law’s map of which states define and regulate biometric data. Alarmingly, the vast majority fall in the gray or green areas, meaning they have no bills or only proposals under consideration. The map shows how drastically a citizen’s protection varies from state to state, with Washington offering the most specific definition. Notably, California was the first and so far only state to address the user’s private right of action, proposing a bill under which a user can sue in the event of a data leak caused by insufficient or failing security protocols (DiRago). Many of these measures, however, are still in the process of becoming law, and other states continue to lag behind in data protection and governance.

Figure 1.1 “Biometric Data Privacy Laws and Lawsuits.”

This severe lag in data protection and governance has massive repercussions. The first and most obvious is fraudulent activity such as impersonation: once the information is out, the threat becomes “serious and continuous” because biometrics are unchangeable (Natgunanathan). The second is marginalization. Wevers explains that once an entity has the biometrics, it can perform further analysis to extract other attributes, such as “gender, age, ethnicity, hair color, weight.” The biometric technologies we use today are often “built around whiteness, maleness, and ability,” which disproportionately affects “queered, gendered, classed, and disabled bodies” (Wevers). It goes without saying that companies holding this information could use it to target specific subgroups. A prime example is Facebook, which was exposed in 2018 and has since drawn harsh criticism for marginalization based on information derived from biometrics: users were grouped into racial and ethnic categories and then targeted with specific ads or political misinformation (Zang). Other companies, such as Snapchat, Google, and United Airlines, have faced similar allegations in Illinois, where the biometric privacy laws are stricter than the federal government’s. Illinois is the outlier in that it actively tries to prevent these harms; for example, it allows collection of employees’ biometrics only “if used exclusively for employment, human resources, or identification, as well as safety, security, or fraud prevention” (Marotti). More states should follow suit to ensure that all citizens, whether employees or simply users of a service, are neither discriminated against nor left vulnerable to fraud.

Innovative Encryptions as the Solution

There are several opportunities for growth for the federal government that are not only desired but desperately needed. The first is strengthening the definition of biometrics and including it in the National Biometric Information Privacy Act. Given the rate of technological innovation, the act must be revisited and updated regularly, perhaps every five years. The second is allowing users to protect themselves from companies that fail to secure their information. Just as California did, Congress should establish a private right of action so that a user can sue in the event of a data leak caused by insufficient or failing security protocols; individuals should have that right (DiRago). The third is providing more instruction and establishing a minimum standard of data protection and privacy for all entities that collect biometric information. To ensure these minimum standards, encryption is of paramount importance. Iynkaran Natgunanathan and his co-authors have created a useful framework for classifying the types of encryption processes that should be mandated and that are already widely used in the industry.

The two most popular protection methods currently in use are PPBSs (privacy-preserving biometric schemes) and cancelable biometrics. Both rest on two pillars that Natgunanathan et al. call “irreversibility and unlinkability”: once the data are encrypted, the transformation cannot be reversed, and the stored result cannot be linked back to a single individual. Looking specifically at PPBSs, they function in the following manner (a toy sketch follows the steps):

  1. As a biometric is collected, a random key is generated for it, and the two are bound together to make a pair.
  2. From there, another set of computations makes a scrambled copy of the pair.
  3. The original pair is discarded in a safe manner, and the copy is stored in a database.
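Here is a minimal Python sketch of the key-binding idea behind these three steps. This is not Natgunanathan et al.’s implementation: real PPBSs add error-correcting codes so that a noisy re-scan of the same biometric still releases the key, while this toy version assumes an exact match and is illustrative only.

```python
import hashlib
import secrets

def enroll(biometric: bytes) -> tuple[bytes, bytes]:
    """Bind a random key to the biometric and keep only derived values."""
    key = secrets.token_bytes(len(biometric))              # step 1: random key
    helper = bytes(b ^ k for b, k in zip(biometric, key))  # step 2: scrambled pair
    key_hash = hashlib.sha256(key).digest()                # verifier, not the key itself
    # Step 3: the raw biometric and key go out of scope here and are
    # never written to the database; only (helper, key_hash) is stored.
    return helper, key_hash

def verify(candidate: bytes, helper: bytes, key_hash: bytes) -> bool:
    """A correct fresh scan recombined with the helper reproduces the key,
    whose hash matches the stored digest; a wrong scan yields garbage."""
    recovered = bytes(c ^ h for c, h in zip(candidate, helper))
    return hashlib.sha256(recovered).digest() == key_hash
```

Because only the scrambled helper and a hash are stored, a database leak reveals neither the biometric nor the key: the two pillars of irreversibility and unlinkability.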

Cancelable biometrics function similarly in that an intentional distortion of the data is performed to preserve the privacy of the user. The steps are as follows, with a sketch after the list:

  1. The biometric is scrambled as it is being collected, so the database never sees the biometric as it truly is.
  2. The scrambled biometric is stored in a database.
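Below is a minimal Python sketch of this idea, using a seeded random projection as the distortion. The specific transform is my illustrative choice, not one prescribed by the literature; production schemes design the distortion so it is computationally hard to invert.

```python
import numpy as np

def cancelable_template(features: np.ndarray, seed: int, out_dim: int = 32) -> np.ndarray:
    """Distort the raw feature vector through a seeded random projection.
    Only this distorted template is ever stored; issuing a new seed
    'cancels' a compromised enrollment without changing the biometric."""
    rng = np.random.default_rng(seed)
    projection = rng.standard_normal((out_dim, features.shape[0]))
    return projection @ features

def match(t1: np.ndarray, t2: np.ndarray, threshold: float = 1.0) -> bool:
    """Compare two templates distorted under the same seed, e.g. by distance."""
    return bool(np.linalg.norm(t1 - t2) < threshold)
```

The design point is revocability: unlike a leaked fingerprint, a leaked template can be retired simply by re-enrolling the user under a fresh seed.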

While different entities may already have their own encryption methods, I argue that PPBSs should become the federal minimum for all entities using biometric information. The reason is that they are less costly to implement and can therefore be applied to everyone, even low-stakes entities that use biometrics for games. I further encourage cancelable biometrics for higher-stakes entities, meaning any entity using biometrics in the health, finance, or legal industries. There, as discussed earlier, the repercussions of someone gaining access to a person’s information are far more serious. This increased importance demands stronger encryption, and as Natgunanathan et al. explain, “the distortion functions are designed in such a way that it is computationally difficult for an adversary to recover the original biometric feature.” Mandating the baseline federally and industry-wide would ensure the information is sufficiently protected, while the stricter scheme adds another layer of protection for extra-sensitive information.

The time to pay attention to such sensitive information passed long ago; it is both urgent and critical that all entities wake up and keep up with protecting this unchangeable data.

With these encryption standards, a continuously updated definition of biometrics, and an increased awareness of what and how our biometric data is used, the federal government can begin to keep up with society’s newest technological innovations and security demands.

Works Cited

“Biometric Data Privacy Laws and Lawsuits.” Bloomberg Law, 4 Nov. 2021, https://pro.bloomberglaw.com/brief/biometric-data-privacy-laws-and-lawsuits/.

DiRago, Molly S. “A Fresh ‘Face’ of Privacy: 2022 Biometric Laws.” Troutman Pepper — A Fresh “Face” of Privacy: 2022 Biometric Laws, 5 Apr. 2022, https://www.troutman.com/insights/a-fresh-face-of-privacy-2022-biometric-laws.html.

Dantcheva, Antitza, et al. “What Else Does Your Biometric Data Reveal? A Survey on Soft Biometrics.” IEEE Transactions on Information Forensics and Security, vol. 11, no. 3, Mar. 2016, pp. 441–467, doi:10.1109/TIFS.2015.2480381.

“H.R.6733–117th Congress (2021–2022): Ban IRS Biometrics Act.” Congress.gov, Library of Congress, 15 February 2022, http://www.congress.gov/.

Marotti, Ally. “Ill. May Change Law Protecting Biometric Data: Senate Proposal could Give Employers More Control Over Information, Opponents Say.” Chicago Tribune, Apr 11, 2018, pp. 1. ProQuest, http://libproxy.usc.edu/login?url=https://www.proquest.com/newspapers/ill-may-change-law-protecting-biometric-data/docview/2023658235/se-2.

Natgunanathan, Iynkaran, et al. “Protection of Privacy in Biometric Data.” IEEE Access, vol. 4, 2016, pp. 880–892, doi:10.1109/ACCESS.2016.2535120.

“S.4400–116th Congress (2019–2020): National Biometric Information Privacy Act of 2020.” Congress.gov, Library of Congress, 3 August 2020, http://www.congress.gov/.

Wevers, R. “Unmasking Biometrics’ Biases: Facing Gender, Race, Class and Ability in Biometric Data Collection.” TMG Journal for Media History, vol. 21, no. 2, 2018, pp. 89–105, doi:10.18146/2213-7653.2018.368.

Zang, Jinyan. “Solving the Problem of Racially Discriminatory Advertising on Facebook.” Brookings, Brookings, 9 Mar. 2022, https://www.brookings.edu/research/solving-the-problem-of-racially-discriminatory-advertising-on-facebook/.
