Face: The Final Frontier of Privacy — Full Spoken Congressional Testimony May 22, 2019

Joy Buolamwini
May 23, 2019

--

Thank you, Chairman Cummings, Ranking Member Jordan, and committee members, for the opportunity to testify today.

I am an algorithmic bias researcher based at MIT. I’ve conducted studies showing some of the largest recorded gender and skin type biases in AI systems sold by companies including IBM, Microsoft, and Amazon.

You’ve probably heard that facial recognition and related technologies have some flaws.

In one test I ran, Amazon Rekognition even failed on the face of Oprah Winfrey, labeling her male.

TIME Magazine Op-ed

Personally, I’ve had to resort to literally wearing a white mask to have my face detected by some of this technology.

Coding in whiteface is the last thing I expected to be doing at an American epicenter for innovation.

https://Ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms

Now, given the use of this technology for mass surveillance, not having my face detected could be seen as a benefit. Besides being employed to dispense toilet paper, the technology is being used in China to track Uighur Muslim minorities.

Beyond being abused, there are many ways for this technology to fail. Among the most pressing are misidentifications that can lead to false arrests and accusations.

Just last month in Rhode Island, a Brown University senior preparing for finals was misidentified as a terrorist suspect in the Sri Lanka Easter bombings.

The police eventually corrected the mistake, but the damage was done. She received death threats and her family was put at risk. Mistaken identity is more than an inconvenience and can lead to grave consequences.

At a minimum, Congress should pass a moratorium on police use of facial recognition, as the capacity for abuse, lack of oversight, and technical immaturity pose too great a risk, especially for marginalized communities.

The Brown University student, like me, is a woman of color under the age of 30. We fall into multiple groups that this technology repeatedly fails the most, namely people with non-White skin, women, and youth.

Due to the consequences of failures of this technology, I decided to focus my research at MIT on the accuracy of facial analysis systems. These studies found that for the task of guessing the gender of a face, IBM, Microsoft, and Amazon had error rates of no more than 1% for lighter-skinned men. In the worst case, those error rates soared to over 30% for darker-skinned women.

Given such accuracy disparities, I wondered how large tech giants could have missed these issues. It boiled down to problematic dataset choices. In evaluating benchmark datasets from organizations like NIST, the National Institute of Standards and Technology, I found surprising imbalances.

One prominent NIST dataset was 75% male and 80% lighter skinned, or what I like to call a "pale male" dataset.

We cannot adequately evaluate facial analysis technology without addressing this critical issue.

Moving forward, the demographic and phenotypic composition of NIST benchmarks must be made public and updated to better inform decision makers about the maturity of facial analysis technology.

The harvesting of face data also requires guidelines and oversight. Companies like Facebook have built facial recognition capabilities by training their systems on user face data without express consent. But regulations make a difference.

As a result of GDPR, Facebook now makes facial recognition an opt-in feature for users in Europe, rather than a default.

Americans should have the same assurance that they will not be subjected to Facebook facial recognition without consent.

No one should be forced to submit biometric face data to access widely used platforms, economic opportunity, or basic services. Just this week a man sued Uber after having his driver's account deactivated due to facial recognition failures. Tenants in Brooklyn are protesting the installation of an unnecessary face recognition entry system. New research is showing bias in the use of facial analysis technology for healthcare purposes. And facial recognition technology is being sold to schools, subjecting children to face surveillance.

Our faces may well be the final frontier of privacy. Congress must act now to uphold American freedoms and rights. At a minimum, Congress should require all federal agencies and organizations using federal funding to disclose current use of face-based technologies. We cannot afford to operate in the dark.

Thank you for the invitation to testify, I look forward to your questions.

Full written testimony: https://docs.house.gov/meetings/GO/GO00/20190522/109521/HHRG-116-GO00-Wstate-BuolamwiniJ-20190522.pdf


Joy Buolamwini

Founder Algorithmic Justice League. www.ajl.org | www.poetofcode.com | Telling stories that make daughters of diasporas dream and sons of privilege pause