History and technology: The visibility of Black faces
Given that October is Black History Month in the UK, my reading this month was the book Race After Technology: Abolitionist Tools for the New Jim Code by Ruha Benjamin. The title is a play on words, referring to the Jim Crow era from the late 1800s to the 1970s. The Jim Crow persona was a racist and inaccurate theatrical depiction of Black people. This offensive character fuelled the Jim Crow laws in America, which enforced racial segregation and legitimised racism at a local and state level. These were the same sentiments and behaviours that invited Black Britons were met with on arrival in the UK from the Commonwealth between the 1940s and 1970s.
The author defines the New Jim Code as “the employment of new technologies that reflect and reproduce existing inequities but are promoted and perceived as more objective or progressive than the discriminatory systems of the previous era.”
Decades later, it seems that in today’s society we are more uncomfortable with the acknowledgement and confirmation of racism than with the incidents of racism themselves. Whether it’s a celebrity’s tweet or an algorithm, the word ‘racist’ is joined by its friends — the air quotes. Many question the intent of the individual, or deny that technology has any intent at all. It even makes it a challenge for someone like me to write a piece like this, living in an era and part of the world where, on the surface, I’m granted many freedoms and possibilities, seemingly to the same degree as anyone else. The New Jim Code describes how a person’s explicit intent to harm another is not the only way racism and bias exist. Here, I explore examples also addressed in the book of how existing and historic systemic biases find their way into new technologies and everyday life, through the lens of the Black British experience.
It seems that a new article on AI bias and the failings of tech companies, giant and small, on coded inequality is published every week. Subsequently, hysteria and virtual outrage momentarily trend until the next offender is highlighted.
“The invisibility of a person is also the visibility of race” - Skin, Feeling by Sofia Samatar
This paradox of being both invisible and hyper-visible is something I wanted to share with my colleagues at Onfido, as we are a company rooted in identity and an innovator in facial recognition. Thus I wrote this blog to explore these challenges across the analogue and digital planes, from recent history to the present.
Recognised & Hypervisible
We’re all familiar with the Big Brother narrative around surveillance. In an ever more privacy-conscious society, many people of all races are concerned about state-sanctioned surveillance that utilises facial recognition. Despite being proven wildly inaccurate for public surveillance, it is being deployed by law enforcement globally, including here in London (even after a poor trial performance).
Let’s for a second move to the physical plane of surveillance. The false-positive tracking and targeting of Black people is nothing new. In America, as a Black person, you are seven times more likely to be falsely convicted of murder than your White counterparts. The hit Netflix series When They See Us heartbreakingly details the cost of being visible at the wrong time as young Black men. Is that just America?
Unfortunately not. Shopping while Black is a well-documented phenomenon globally. As someone who has many a time been followed around while shopping for cosmetics, I’m acutely aware of what it feels like to be recognised. Despite data showing Black people are no more likely to shoplift, many reports and legal proceedings evidence the bias in profiling. The same recognition and assertion of criminality by overzealous security guards is rooted in the same bias that leads their uniformed cousins, the police, to stop and search Black people at an 11.8x higher rate in London (40% more likely across England and Wales). Stop and search, under different guises and names, is a mechanism that has been deployed on the Black community in Britain for many generations. In my parents’ era it was the now-discredited Sus laws (similar to Jim Crow) that unfairly targeted Black and other minority communities during the 1970s and 80s.
Against the backdrop of knife-crime spikes in the city, Sus laws, reborn as new stop and search powers (Section 60), are credited as a ‘vital tool’ for reducing crime. Findings published by the Home Office, however, show that these powers are disproportionately used to target Black people. As someone who is concerned about the safety of all young people in the city, I’m for anything that prevents violent and other criminal activity. But the report details that where an offence is detected after a stop and search, the rate of detection was lowest for Black people. So how effective is this vital tool, with so many false positives criminalising the most vulnerable?
To be Black is to be hyper-visible; this is why recognition accuracy is so important. In many countries, matching the profile or identity of a suspect in the eyes of a police officer can be fatal. For the lucky ones it could lead to arrest, and only then does the investigation of innocence (or guilt) begin. The compound effect of the over-surveillance of certain areas and groups of people disproportionately fills law enforcement databases with Black faces and names. These are the same databases being cross-checked in the virtual plane by ‘smarter’ recognition technology that cannot tell us apart. So the inaccuracy of facial recognition surveillance is not just a viral thread or a bug; it is built upon decades of unfair assertions of criminality against our race.
Recognising invisibility
MIT Media Lab researcher and Algorithmic Justice League founder Joy Buolamwini’s publications on the ‘coded gaze’ found that facial recognition software she was researching could not detect her face due to her darker skin, but could detect the faces of her lighter-skinned classmates.
When we discuss facial recognition algorithms’ inability to detect Black faces, we often describe a dystopian future where driverless cars can’t see us (again, more threat to our lives), or where access to travel or finance is denied with no human intervention. This directly contradicts the utopian futures of fairer, easier experiences promised by the same technologies.
Alas, it is nothing new. It is globally apparent but rarely discussed that White is the norm. Let’s pause there. It is rarely discussed because it is inherently deemed the standard. What shades of ‘nude’ or ‘flesh-toned’ do you typically see in stores? Usually a palette between peach and beige, I’m guessing. Like hosiery, plasters and lipstick, technology products are also built in ‘nude’ by default. The New Jim Code details a time before selfies. Between the 1940s and 1970s, Shirley cards were used by Kodak to standardise the exposure process when developing film. Today, comment threads on this topic argue that physics is physics, or that there was no bias, only technological limitations in film-processing chemistry. If I were the type to argue with strangers on the internet, I’d highlight that the absorbance and reflectance of light is indeed physics, but the reference Shirley cards used for tuning are themselves the empirical evidence that the processing was biased towards White skin. There was no scientific limitation, as Lorna Roth explores in Looking at Shirley, the Ultimate Norm: “Film emulsions could have been designed initially with more sensitivity to the continuum of yellow, brown, and reddish skin tones, but the design process would have had to be motivated by a recognition of the need for an extended dynamic range.” Simply, the commercial target market was Caucasian. In the 1970s, furniture and chocolate companies complained about the underexposure of their products, prompting Kodak to finally deem brown tones important; still, multiracial Shirley cards were only produced some 20 years later.

Over the same period, Polaroid used this new capability to capture darker tones in photos for the infamous passbooks of apartheid South Africa. During this period, Black people weren’t deemed citizens of their own country; to enter White areas to work, they had to carry these state-issued passbook documents. Refusal or forgetfulness could result in being beaten, banished, imprisoned, or even murdered. It took Polaroid employees in America — chemist Caroline Hunter and photographer Ken Williams — and international campaigners over seven years of lobbying before Polaroid finally pulled out of South Africa. Note that Polaroid dismissed Caroline for taking a stand.
To be Black and invisible — why inclusivity matters. Invisibility means our stories are never told; for example, Black History Month is largely ignored, if not diluted into an international or diversity day/month, reinforcing the idea of non-value. This leads to apathy towards Black users in design thinking, reinforced by a history in which imagery and the capture of our likeness have been used as tools of criminalisation and dehumanisation for centuries.
Recognition and representation
Representative data is typically held up as either the solution to or the cause of underperforming, biased technology. Algorithms are either tuned to ‘nude’ or trained on scraped public datasets that include disproportionate numbers of Black mugshots, along with other imagery that reinforces racist stereotypes. If hashtags like #OscarsSoWhite raise the flag on the invisibility of Black actors in Hollywood until the next slave or gangster movie is released, then who’s blowing the whistle on the number of algorithms that use celebrity data to train their machine learning models?
The falsehood that ‘minority’ means small margins has been kicked down by six Sub-Saharan African countries being listed among the 15 fastest-growing economies. Take record-breaking blockbusters like Black Panther, or the sales of Rihanna’s makeup foundation range, Fenty Beauty: the darker shades of its 50-shade range sold out instantly. Makeup companies have long ignored those of us with darker skin, and even those that didn’t limited their distribution. I once found myself at a Bobbi Brown counter at Gatwick airport listening to the sales assistant explain, through micro-aggressions, that while they do sell my shade, they don’t stock it at the airport. I explained, with a heavy sigh, that Black people travel too. Soon after the launch of Fenty Beauty’s foundation, many other companies wanted to jump on the #BlackGirlMagic gravy train by expanding their ranges. A year later, it’s now standard marketing to have various shades in beauty campaigns. Perhaps now we can find our right shade, and if not, we can always turn to the security guard hovering nearby to help us pick it out!
So it seems, as we have explored here, that Blackness can be recognised. The power of Black Twitter has certainly been recognised, with mainstream media platforms and marketing agencies mining and analysing Black Twitter for content and ideas, returning targeted and tailored advertising and rhetoric. Technology indeed recognises colour, and gathers the data needed, when it is lucrative.
Hence why I don’t buy the lack-of-data reasoning. It wasn’t a lack of data that caused Black people to be identified as gorillas by Google; it was the abundance of biased datasets that discriminate against and dehumanise Black people. It is the fact that the coded norm is nude. It is also the fact that we are underrepresented in the development of policy, marketing campaigns and algorithms.
Recognising history in innovation
I think back to what inspired me to study science and engineering growing up. I geeked out for hours at the ability to detect, image, and measure things at the nanoscale, to measure variations and perturbations even light-years away, and at the promise of big data as I ran Monte Carlo analyses on my experiments. Yet, years later, companies hailed as the frontier of innovation are struggling to recognise faces like mine.
Admittedly, I found the introductory chapters of The New Jim Code a little pessimistic. It’s a great read on the hard truths, highlighting the reality of the challenges we face today: technological manifestations of a long history of structural and systemic coded inequality. The optimism I took from the book and the other works cited here is that, throughout this unfair history, many stood up, spoke out and created change in the face of inequality, just as the employees at Polaroid pressured their company. I’m hopeful that the recognition of the importance of representation and diversity in STEM industries continues, so that those who are 35% more likely to be subject to surveillance while shopping, or 40% more likely to be recognised as a criminal while walking to said shop, are part of the design and development of new technologies, not just the victims of them.
At Onfido, we must continue the important work we are doing to address bias as we develop new products, while also recognising the coded inequality that already exists in our society, in order to achieve our vision of an open world where identity is the key to access.
