Coded Bias, a review.

Eli Martinez
2 min read · Nov 28, 2023


[Image: a dystopian city under pervasive facial recognition surveillance.]

The documentary Coded Bias exposes the biases present in the algorithms and data behind machine learning and A.I. systems such as facial recognition. Joy Buolamwini, a computer scientist at MIT, discovers while working on a project that facial recognition often fails to identify people with darker skin. After this experience, she advocates for accountability in A.I., not only from those developing and training facial recognition systems, but also from those who create the algorithms that determine credit, housing, employment, healthcare, education, legal outcomes, and more, and that often affect minorities negatively.

I learned just how much people, specifically people of color, are affected by A.I. biases. Credit, housing, employment, education, the legal system: these are some of the most impactful decisions of our lives, and it's disturbing when they are handed over to algorithms that operate to these people's disadvantage. What's also disturbing is how hard it is to change these systems. The actual algorithms and training data are not transparent, and getting companies to admit they were wrong is not easy, which makes this issue especially frustrating to tackle. I've gained a whole new understanding of how deep this problem goes and how important it is to address these biases and strive for change.

One of the most interesting scenes in the film was watching the controversy over facial recognition unfold right before us. UK police parked a van on a busy public street with cameras on top, using facial recognition to try to identify criminals walking by. First off, it is scary that this is happening in plain sight; it poses enormous privacy risks when police or any other law enforcement can simply set up cameras and track and store people's faces. It doesn't seem acceptable, and it breaches your rights. It's even scarier that when one person realized what was going on and covered his face as he walked past the van, police immediately confronted him and he ended up being fined. There's no protecting your identity and privacy anymore. It was an eerie sight, and it leads me to wonder whether the same thing happens to me when I walk in public and I just don't know it because of the lack of transparency. Without many laws regarding facial recognition, are governments and law enforcement required to disclose that you are entering a public or private place that uses it? Are they able to use facial recognition legally at all? Will it eventually be normal to be constantly tracked and surveilled by these technologies even though their accuracy may never be 100%? Who decides whether facial recognition is accurate enough, and what amount of error, if any, is acceptable?
