Can You See Me Now? Facial Recognition, AI, & Racial Bias

Divercity, Inc.
The Bridge by Divercity
3 min read · Mar 2, 2023
Screenshot of a tweet from actor Simu Liu about the celebrity look-alike cam comparing his headshot to a fan in the stadium during the game.

The 2023 NBA All-Star Celebrity Game was held on February 17 in Salt Lake City, Utah. While Chinese-Canadian actor Simu Liu shined on the court, he was disappointed by part of the in-game entertainment. During a celebrity look-alike segment, the videotron displayed a picture of Liu next to a fan in the audience. However, aside from both being Asian, the two bore little resemblance to each other. After the game, the Marvel actor tweeted, “I had a great time but this wasn’t cool.”

He later clarified on Twitter:

To be perfectly clear, the entire org and ops team surrounding all-star have been nothing but stellar… this was just one person with a camera. And no disrespect to my man on the videotron either! He’s gorgeous we just don’t look alike.

But while Liu seemed able to shake it off, the incident brings attention to larger issues concerning facial recognition and racial bias.

The deeper issues at play

It’s possible that the NBA staff were exhibiting a cognitive bias. The cross-race effect refers to a person’s reduced ability to distinguish facial features and expressions among people of a different race. It occurs in any group toward another: members of the Black community may have difficulty telling Asian faces apart, just as Latinx people may perceive all white Europeans as looking alike. Research suggests it takes extended, positive contact between groups to undo this kind of cognitive bias.

As an alternative to a “person with a camera” operating in-game entertainment, some sports facilities have opted to use AI facial recognition to enhance fan experience. Given the unreliability of the human brain, would we be better off having AI do the recognition for us?

Probably not (at least, not yet).

The first step in developing facial recognition AI is teaching the computer what a face is. However, when certain groups of people are underrepresented in the training data, the system doesn’t learn to identify their faces as well.
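
To make this concrete, here is a minimal, hypothetical sketch in Python (not drawn from any real face-recognition system; the groups, sample counts, and “face” features are invented stand-ins) of how an imbalanced training set can bake in a per-group accuracy gap:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

def make_group(n, signal_dim):
    # Synthetic stand-in for face data: the correct label depends on a
    # different feature dimension for each group.
    X = rng.normal(size=(n, 5))
    y = (X[:, signal_dim] > 0).astype(int)
    return X, y

# Training data: group A is heavily overrepresented, group B barely present.
Xa, ya = make_group(5000, signal_dim=0)
Xb, yb = make_group(100, signal_dim=1)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Balanced held-out sets reveal the gap the training data baked in.
for name, dim in [("group A", 0), ("group B", 1)]:
    X_test, y_test = make_group(1000, signal_dim=dim)
    print(name, "accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Typically prints roughly 0.99 for group A and near-chance (about 0.5)
# for group B: the model never saw enough of group B to learn its pattern.

No one programmed this toy model to discriminate; the skew comes entirely from what it was (and wasn’t) shown during training.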

Illustration: cartoon heads on a blue background, evoking AI-generated look-alikes.

A 2018 study found that AI facial recognition software consistently failed to correctly identify young Black women. Further investigation found that the systems had been trained on databases containing a disproportionate number of white men.

Technology is susceptible to the biases of its human developers. When underrepresentation goes unchecked, systems end up reproducing racial discrimination. This is a clear example of why diversity matters in every facet of industry: not just on principle, but because it produces systems that actually work.

The good news

Organizations like the Algorithmic Justice League are working toward more diverse representation and equitable outcomes in AI systems. Facial recognition software is also being held accountable through practices like ethical auditing, which measures how a system performs across demographic groups.
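
What might such an audit look like? Here is a hypothetical sketch (the function, threshold, and data are invented for illustration, not taken from any real auditing framework) that disaggregates accuracy by demographic group and flags large gaps:

from collections import defaultdict

def audit_by_group(records, gap_threshold=0.05):
    # records: (group, prediction_was_correct) pairs from a test run.
    tallies = defaultdict(lambda: [0, 0])  # group -> [correct, total]
    for group, correct in records:
        tallies[group][0] += int(correct)
        tallies[group][1] += 1
    rates = {g: c / n for g, (c, n) in tallies.items()}
    flagged = max(rates.values()) - min(rates.values()) > gap_threshold
    return rates, flagged

# Toy run: group B's error rate is visibly higher, so the audit flags it.
rates, needs_review = audit_by_group([
    ("group A", True), ("group A", True), ("group A", True),
    ("group B", True), ("group B", False), ("group B", False),
])
print(rates, "| flag for review:", needs_review)

The key idea is simply to stop reporting one overall accuracy number and instead break results out by group, so disparities like the one in the 2018 study can’t hide in an average.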

As for the human brain — while cognitive bias is innate, it’s also reversible. Research indicates that simply being aware of the cross-race effect can help reduce the tendency to falsely identify facial matches.

All of us — including NBA employees — benefit from being more aware of our own biases and more sensitive to the multiracial world around us.

Prepared by: Cassidy Mayo
