Salaah Swan
Apr 21

Welcome To The Era Of Facial Recognition Technology: Changing The Game On How Minorities Are Treated

Women are disproportionately targeted by Amazon’s Rekognition surveillance AI

George Orwell’s “1984” has probably never seemed as prescient as it does today. From the depths of Orwell’s imagination came a fantastic narrative of a dysfunctional, unsympathetic, dystopian society inhabited by those who have and those who have not. Despotism seemed such an improbable reality in the minds of most that a mildly comparable version of it has quietly crept up on mankind. And there is no fantasy involved; this is all very real.

Welcome to the era of facial recognition technology and smart, learning algorithms known as Artificial Intelligence (hereinafter referred to as AI). While most humans strive to live fulfilled, healthy lives as contributors to society, builders of infrastructure, and creators of art and things, none of that matters through the lens of AI facial recognition surveillance, which, unbeknownst to us, reduces our existence to far simpler categories: those with light skin tones and those with dark skin tones, those who are men and those who are women. These classifications matter because this wildly smart, always-learning AI, created in the likeness of the Silicon Valley man, was also created with the inherent biases of those men. Simply put, facial recognition AI, according to independent research, is biased against women, and especially biased against women with darker skin tones.

Audit performance results: Amazon’s facial recognition surveillance technology has a 32% error rate when analyzing the faces of women with darker skin tones

The impact of these algorithmic biases in facial recognition AI has now entered the world stage of scrutiny, as the technology is widely used in the retail surveillance systems of big-box retailers like Whole Foods Market (owned by tech giant Amazon) and Target. For many ethnic consumers, the shopping experience at a Whole Foods Market quickly turns to emotional distress and angst: according to a content analysis of consumer complaints, Whole Foods surveillance singles out shoppers with darker skin for harassment at a disproportionately high rate. In a nutshell, Amazon’s AI doesn’t trust or like brown people. And because it doesn’t, these individuals will always be perceived, and treated, as suspicious. This built-in suspicion of the dark-skinned mother shopping for probiotics, or of the South American man shopping for grapes, and the heightened actions of store security personnel toward people of color that result, undoubtedly constitute a hostile and discriminatory system born from a xenophobic ideology at its root.

Researchers Conduct Groundbreaking Study Of Facial Recognition AI

Artificial intelligence researchers from MIT, Stanford, and the University of Toronto have conducted exhaustive research on the performance of artificial intelligence used in commercially marketed facial recognition software. The results of the study were disheartening: of the several commercial AI systems audited, including systems from IBM and Microsoft, Amazon’s facial recognition system Rekognition showed the highest race and gender classification bias. While the same AI achieved 100% accuracy when analyzing the faces of white males, it had an error rate of 32–35% when assessing the face of a woman of color. This means that when a Black woman’s face is analyzed by a surveillance camera running Rekognition in a place of public accommodation, there is a 32–35% chance that she will be misclassified as male or placed in one of several other false categorizations, including that of having a criminal background. This is especially significant because these routine misclassifications lead to increased targeting, increased interrogations, and increased occasions where people of color are treated with prejudice and malice by security personnel and law enforcement. In short, this is algorithmic racial profiling: the new stop and frisk.
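
The audits cited below measured this kind of disparity by comparing a classifier’s error rates across demographic subgroups. The sketch that follows is a minimal Python illustration of that arithmetic using entirely hypothetical records; it is not the researchers’ code, and the real studies used a curated dataset (the Pilot Parliaments Benchmark) rather than toy data like this.

```python
# Illustrative sketch: computing per-subgroup error rates for a gender
# classifier, in the spirit of the audits cited in the resources below.
# All records here are hypothetical stand-ins.
from collections import defaultdict

# Each record: (skin_tone, true_gender, predicted_gender)
predictions = [
    ("darker", "female", "male"),    # misclassification
    ("darker", "female", "female"),
    ("darker", "female", "male"),    # misclassification
    ("lighter", "male", "male"),
    ("lighter", "male", "male"),
    ("lighter", "female", "female"),
]

errors = defaultdict(int)
totals = defaultdict(int)

for skin_tone, true_gender, predicted_gender in predictions:
    subgroup = (skin_tone, true_gender)
    totals[subgroup] += 1
    if predicted_gender != true_gender:
        errors[subgroup] += 1

for subgroup in sorted(totals):
    rate = errors[subgroup] / totals[subgroup]
    print(f"{subgroup}: error rate {rate:.0%}")

# With this toy data, ("darker", "female") shows a 67% error rate while
# ("lighter", "male") shows 0% -- the same shape of disparity the audits
# reported for Rekognition (roughly a third vs. essentially zero).
```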

The Potential Of Sanctioned Civil Rights Violations Of The Marginalized

Artificial intelligence software like Amazon’s Rekognition, largely used for commercial purposes behind closed doors and in the offices of retail security, is currently free to be abused in unprecedented ways that damage any sense of trust between the public and those who use it to mass surveil citizens doing mundane things. While it is no secret that implicit bias lives in the hearts of many, research data concludes that bias also sits at the center of how AI thinks, learns, makes deductions, and outputs data. It follows that any software exhibiting such a high error rate for a marginalized category of people, as Amazon’s Rekognition does, is likely contributing to gross violations of those people’s civil and human rights. As the use of this racially and gender-biased technology leads to increased endangerment and to violations of the privacy and civil rights of American citizens and immigrants alike, unregulated mass surveillance with Rekognition would undoubtedly lead to a broken, dysfunctional, dystopian society of sadness.


Resources:

Actionable Auditing: Investigating the Impact of Publicly Naming The Biased Performance Results of Commercial AI Products. Inioluwa Deborah Raji, Joy Buolamwini http://www.aies-conference.com/wp-content/uploads/2019/01/AIES-19_paper_223.pdf

Face recognition performance: Role of demographic information. Brendan F Klare, Mark J Burge, Joshua C Klontz, Richard W Vorder Bruegge, and Anil K Jain

Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Joy Buolamwini, Timnit Gebru http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf

Response: Racial and Gender Bias In Amazon Rekognition — Commercial AI System For Analyzing Faces. Joy Buolamwini https://medium.com/@Joy.Buolamwini/response-racial-and-gender-bias-in-amazon-rekognition-commercial-ai-system-for-analyzing-faces-a289222eeced

Written by Salaah Swan: broadcast producer, documentary filmmaker, screenwriter, and normal New Yorker going about my day! email: swancreative@protonmail.com