
Facial Recognition, Racial Bias and African Law Enforcement

How the misidentification of black faces might already be affecting the civil rights of black people

Chiagoziem
Jun 28, 2020 · 6 min read


Many black people presume that the comment, “they all look alike to me”, is the way the rest of the world sees us.

But research has shown that, although this facial ambiguity is a problem, it’s not unique to black people.

The human brain de-individualizes faces belonging to groups that we don’t belong to. This behavior is known as the cross-race effect.

Now, if I didn’t know better, I’d have sworn that scientists were doing their best to all-lives-matter the conversation, but there is evidence that, to outside observers, members of another racial group really do look more alike.

That said, the well-documented shortcomings of modern facial recognition technologies are forcing scientists to consider a different, but ultimately, related set of questions.

At a basic level, the goal of artificial intelligence is to simulate the human brain — General AI, or even exceed its intelligence — Super AI.

But so far, what we’ve been able to do is simulate a subset of human functions — Narrow AI.

This field of AI operates within a pre-determined, pre-defined range, and replicates a specific human behaviour based on specific parameters and contexts.

While this is no doubt an impressive feat, the cross-race effect doesn’t translate neatly to these narrower forms of intelligence. What we’ve found, instead, is that the deep learning algorithms we use in narrow AI might have inherited some deep racial biases of their own.

Facial recognition technology has been around since the mid-’60s. But in recent years, the emergence of deep learning has accelerated its adoption in numerous use cases including law enforcement.

However, it’s still an imperfect technology.

US government tests found that, as recently as 2019, even top-performing facial recognition systems misidentified black faces at rates five to ten times higher than white faces.

Depending on the use case, the effects of misidentification can be anything from mildly irritating — like in 2009, when an HP webcam designed to track people’s faces tracked a white worker but not her black colleague — to deeply insulting, like in 2015, when Google Photos classified some black people as gorillas.

But in law enforcement, where the technology has helped fight crime in some locales, a single case of mistaken identity could be the difference between freedom and incarceration or, worse still, between life and death.

In what might be the first known case of a wrongful arrest caused by inaccurate facial recognition technology, Robert Julian-Borchak Williams was recently arrested in Detroit, after a facial recognition system falsely matched his photo with security footage of a shoplifter.

Robert Julian-Borchak Williams (Source: NY Times)

Williams is African-American.

His story comes, coincidentally, at the height of tensions between Black America and the police force.

The charged atmosphere has forced multiple western tech companies, including IBM, Microsoft, and Amazon, to announce that they’ll be pausing or stopping their facial recognition work for the police.

Even though there have been advancements in the space, the margin for error when identifying black faces is still far too high. And it was only a matter of time before an incident like this occurred.

The tech giants’ decision to put their programs on hold is the biggest indictment of current facial recognition systems. The reason the technology performs so differently for darker skin tones is still unclear, but there are at least two plausible explanations:

1. Black people are underrepresented

MIT researcher and digital activist Joy Buolamwini has made racial bias in facial recognition technology her life’s work. She has put forward the theory that black people are not properly represented in the datasets used to train and test facial recognition systems.

An AI system is only as good as its data. Respected AI researcher Robert Mercer famously said:

“There’s no data like more data.”

The easiest place to “harmlessly” harvest large numbers of photos of faces is the web. Because the largest contributors to the global internet economy are disproportionately male, white, and western, online content tends to skew the same way. It also doesn’t help that the same demographic is largely responsible for building western AI algorithms.
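The underrepresentation theory can be illustrated with a toy sketch. The snippet below (all numbers invented purely for illustration) "enrolls" a face template by averaging noisy one-dimensional embedding samples: a group with fewer training samples ends up with a noisier template, which is one simple mechanism by which sparse data can translate into higher misidentification rates.

```python
# Toy illustration: sparse training data yields a worse face "template".
# A recognition system often represents an identity as an average (centroid)
# of embedding samples; with fewer samples, noise cancels out less.
# All values here are made up for illustration, not real embeddings.

def centroid(samples):
    """Mean of 1-D embedding samples (stand-in for a face template)."""
    return sum(samples) / len(samples)

true_value = 0.0  # the identity's "true" embedding

# Fixed measurement noise around the true embedding (kept deterministic).
noise = [0.9, -0.8, 0.3, -0.2, 0.6, -0.5, 0.1, -0.4, 0.7, -0.6]

many = [true_value + n for n in noise]      # 10 enrollment photos
few = [true_value + n for n in noise[:2]]   # only 2 enrollment photos

err_many = abs(centroid(many) - true_value)
err_few = abs(centroid(few) - true_value)

print(f"template error with 10 samples: {err_many:.3f}")
print(f"template error with  2 samples: {err_few:.3f}")
assert err_few > err_many  # fewer samples -> noisier template
```

Real systems use high-dimensional embeddings and far more data, but the direction of the effect is the same: the group with less representation in the data gets the less reliable model.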

2. The black photogenicity deficit

There’s another argument that the lower accuracy on darker skin can be traced back to the beginnings of color film. Photographic technology has always been optimized for lighter skin, and the digital photography we use today is built on the same principles that shaped early film photography. According to this school of thought, narrow AI is having difficulty recognizing black faces simply because modern photography wasn’t designed with the facial features of black people in mind.

In western countries where blacks are in the minority, these built-in biases significantly impact the quality of facial recognition-assisted law enforcement. However, in the continent with the largest population of black people, the potential for harm is exponentially greater.

The US and China are locked in a war over AI dominance.

According to renowned AI researcher and investor Kai-Fu Lee, there are 7 giants of the AI age, namely:

  1. Google
  2. Facebook
  3. Microsoft
  4. Amazon
  5. Tencent
  6. Baidu
  7. Alibaba

Currently, it’s almost an even split between the US companies and the Chinese companies. Some analysts believe that Africa could be the final battleground.

If so, then it’s a battle that US businesses are currently losing.

There have been interesting one-off developments, like Google opening its first AI lab in Ghana last year, but the US has largely been lukewarm about exploring the continent’s AI and data potential.

This has handed China a significant advantage, particularly in facial recognition.

In recent years, the Chinese tech giant, Huawei, has been pushing its flagship public safety solution: Safe City. Built on CCTV and facial recognition technologies, the solution provides local authorities with modern tools for law enforcement.

According to the Center for Strategic and International Studies (CSIS), a US-based think tank, there are currently twelve Safe City programs operational in sub-Saharan Africa, including in Kenya, Uganda and South Africa.

Questions have been raised about privacy, data protection, and aiding and abetting authoritarian regimes. But, on the other hand, there have also been success stories. Like in Nairobi, where Huawei claims that the initiative led to a 46% reduction in the crime rate.

It’s instructive, however, that information about false positives and wrongful arrests has so far been hard to come by.

In 2018, the Chinese AI startup CloudWalk signed a deal with Zimbabwean President Emmerson Mnangagwa. Mnangagwa has shown a tendency to use digital tools and the power of the law to restrict civil liberties, but that’s not the only troubling thing about the CloudWalk deal.

As part of the agreement, Harare has been sending data on millions of black faces to the Chinese company, helping to train its technology on darker skin tones.

It’s a brazen data-for-dollars swap on a national level.

The CloudWalk-Zimbabwe agreement offers a glimpse into the deficit in global facial recognition technology that Chinese companies are trying to make up. These companies are benefiting from the general absence of laws that cover biometric data and cross-border flows of sensitive information.

As Chinese AI companies continue to support local law enforcement and do business with oppressive regimes, all while using black faces to train their models, there’s no telling how many Robert Julian-Borchak Williamses there were before there was Robert Julian-Borchak Williams.

Subscribe to the get.Africa newsletter, a weekly roundup of African tech in a language you’ll understand. A fresh email drops every Monday morning.


Chiagoziem

Solutions Architect | Subscribe to 📬 https://get.africa, my weekly newsletter on African tech