Amazon Facial-ID Software Used by Police Falls Short on Tests of Accuracy and Bias, Research Finds
The new research is raising concerns about how biased results could tarnish the artificial-intelligence technology’s exploding use by police and in public venues, including airports and schools
By Drew Harwell
Facial-recognition software developed by Amazon and marketed to local and federal law enforcement as a powerful crime-fighting tool struggles to pass basic tests of accuracy, such as correctly identifying a person’s gender, according to research released Thursday.
Researchers with the M.I.T. Media Lab also said Amazon’s Rekognition system performed more accurately when assessing lighter-skinned faces, raising concerns about how biased results could tarnish the artificial-intelligence technology’s use by police and in public venues, including airports and schools.
Amazon’s system performed flawlessly in predicting the gender of lighter-skinned men, the researchers said, but misidentified the gender of darker-skinned women in roughly 30 percent of their tests. Rival facial-recognition systems from Microsoft and other companies performed better but were also error-prone, they said.