Amazon Facial-ID Software Used by Police Falls Short on Tests for Accuracy and Bias, Research Finds

The new research is raising concerns about how biased results could tarnish the artificial-intelligence technology’s exploding use by police and in public venues, including airports and schools

The Washington Post

Facial-recognition technology is demonstrated during a consumer trade show in Las Vegas in January. Photo: David McNew/AFP/Getty Images

By Drew Harwell

Facial-recognition software developed by Amazon and marketed to local and federal law enforcement as a powerful crime-fighting tool struggles to pass basic tests of accuracy, such as correctly identifying a person’s gender, research released Thursday says.

Researchers with the MIT Media Lab also said Amazon’s Rekognition system performed more accurately when assessing lighter-skinned faces, raising concerns about how biased results could tarnish the artificial-intelligence technology’s use by police and in public venues, including airports and schools.

Amazon’s system performed flawlessly in predicting the gender of lighter-skinned men, the researchers said, but misidentified the gender of darker-skinned women in roughly 30 percent of their tests. Rival facial-recognition systems from Microsoft and other companies performed better but were also error-prone, they said.
