Coded Bias Movie Review

Jeffrey Jia
2 min read · Dec 9, 2021


“Coded Bias” is a documentary directed by Shalini Kantayya that exposes the inequality and bias within facial recognition programs. This bias is especially apparent when the programs are used on people who do not look like the white men who largely developed them. The documentary goes in depth on the drawbacks of facial recognition, including invasion of privacy and the spread of misinformation about people of color. Joy Buolamwini decided to create the Algorithmic Justice League because of the negative personal experiences she has had with facial recognition AI. She discovered a fundamental problem in facial recognition when she noticed that an AI did not recognize her face, yet it did recognize her when she put on a white mask. Facial recognition bias is not only a problem in the US, and the film explores other countries plagued by bias in facial recognition programs.

Photo by ThisisEngineering RAEng on Unsplash

One thing I learned from this film that I did not previously know is that such bias exists in facial recognition programs. As a computer science major, I have actually had homework assignments where I built a simple version of a face detection program. In that class, we never learned about the inherent biases that exist in these technologies, and I would never have guessed the problem was of this magnitude. I hope my computer science classes will place more emphasis on the ethical implications of these technologies, since I really think issues like bias in facial recognition can be prevented at the root if the current generation of developers is aware of all the potential harm and inequality their programs can cause. The film made me question why almost no college computer science programs require an ethics course on the implications of AI and other technologies.
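For readers curious what that kind of homework looks like, here is a minimal sketch of a simple face detection exercise using OpenCV's bundled Haar cascade detector (the image path and parameter values here are just illustrative, not my actual assignment):

```python
import cv2  # OpenCV: pip install opencv-python

# Load OpenCV's pre-trained frontal-face Haar cascade.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

# Read an image (example path) and convert to grayscale,
# since the cascade operates on intensity values.
image = cv2.imread("example_photo.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Detect faces; returns a list of (x, y, width, height) boxes.
faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Draw a rectangle around each detected face and save the result.
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detected_faces.jpg", image)
```

Even this toy exercise leans entirely on a pre-trained model, so whatever biases are baked into its training data are inherited by every program that uses it, which is exactly the kind of blind spot the film highlights.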

Something I found very interesting about this film is that it explores facial recognition bias not only in the US but also in other countries like England and China. I found it very sad that some people in London are wrongly identified as criminals due to bias and errors in facial recognition cameras. I also found it interesting to learn that China uses AI and facial recognition programs to track its citizens and keep tabs on their actions and activity. I knew China had technology to track license plates, flag bad and illegal driving, and issue tickets accordingly, but I was not aware it also used facial recognition to track individuals and determine punishments based on their daily activity.
