Race: The elephant in the room that never goes away
By Tracey Gyateng, Data Science Manager at DataKind UK
Biologically, race (the shared physical traits perceived by humans) does not exist within the human genome. It is something we observe, and it should be no more remarkable than observing someone's eye colour. And yet we are unable to discuss race as neutrally as eye colour. Bound up within race is the belief, held by some, that certain races are superior to others, a belief that has generated wealth for some and denied it to others. It is a belief that directly and indirectly manifests itself within our society, so it is not surprising that it surfaces within technology (or is it?). And if our technologies encode the biases of society, what are we, as data scientists, expected to do about it? DataKind UK's ethical principles state that we should at least discuss it, so on a grey, rainy night in London, and online, we discussed race and AI (see our reading list below).
It was not without caution that we decided to acknowledge the elephant in the room. Discussions focussed on race can be contentious. The opening paragraph of this blog is disputed by some (racists) who believe there are genetic differences between races (and my calling these groups racists would also be an area of contention!). And yet we had an open (and civil) discussion with over 30 attendees. The remainder of this blog touches on the ideas and topics that were discussed.
We discussed a range of cases on the negative impacts of tech on race (a recurring feature of our previous six book clubs) (1): Safiya Noble's Algorithms of Oppression, which explored how search engines can be sexist and racist; cases from Cathy O'Neil's Weapons of Math Destruction; and Facebook's targeting of job adverts along ethnicity and gender lines even when advertisers didn't specify this. These stories are not confined to the US. In the UK, the Home Office rolled out facial recognition for checking passports despite knowing that it had higher error rates for black people. Given that facial recognition is increasingly applied to surveillance, this may be one area in which unreliable technology is welcomed by over-policed groups! (See Ruha Benjamin for more discussion.)
By discussing these stories, the conversation naturally led to the question of what action is needed from the tech sector. One book clubber remarked, 'we don't yet have the vocabulary to talk about race, like we can talk about class or gender'. So can increasing tech workers' racial literacy contribute to less harmful technology? Some book clubbers liked the idea of racial literacy as proposed by Jessie Daniels, Mutale Nkonde, and Darakhshan Mir, but felt it needed to be taught everywhere, as part of school curricula, and to encompass the effects of colonialism. Others were not satisfied with a focus on children, arguing that more immediate work was needed: the composition of tech teams should be diversified to reflect society, and support should be provided to ensure that minorities and marginalised groups are empowered and listened to, and that action is taken when issues of unfairness are raised (and don't get us started on the definition of fairness!).
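As an aside on that last parenthesis: 'fairness' has several competing formal definitions, and a single model can satisfy one while violating another. Below is a minimal Python sketch on invented toy data (the groups, labels, and predictions are hypothetical, not drawn from any case discussed above) comparing two common metrics: demographic parity (equal selection rates) and true positive rate parity (one half of equalised odds).

```python
# A minimal sketch of two competing fairness definitions, using made-up
# toy data. The labels, predictions, and groups are illustrative only.
import numpy as np

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])  # actual outcomes
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0, 0, 0])  # model predictions
group = np.array(list("AABABBABAB"))                # group membership

def selection_rate(pred, mask):
    """Share of a group receiving the positive prediction."""
    return pred[mask].mean()

def true_positive_rate(true, pred, mask):
    """Share of a group's actual positives the model catches."""
    positives = mask & (true == 1)
    return pred[positives].mean()

for g in ["A", "B"]:
    m = group == g
    print(f"Group {g}: selection rate = {selection_rate(y_pred, m):.2f}, "
          f"TPR = {true_positive_rate(y_true, y_pred, m):.2f}")
```

On this toy data both groups receive positive predictions at the same rate (demographic parity holds), yet the model catches a far larger share of one group's actual positives (TPR parity fails). Choosing between such metrics is a value judgement, not a purely technical one.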
Lastly, the conversation looked at non-tech workers: the warehouse workers, delivery drivers, content moderators, and others who are often on short-term, precarious, and low-paid contracts. How can we improve their working conditions? Government regulation was the commonly agreed answer. It isn't sufficient for the tech industry to regulate itself, nor is it fair or sustainable that the income inequality gap continues to widen.
Get involved!
At DataKind UK, we hope that these conversations empower book clubbers to bring the discussion into their workplaces and to be part of the change that is needed. To find out more about our book club, keep an eye on our Eventbrite page. We hope to see you at our next #DKBookclub!
Our reading list
We provide a range of texts so that both time-poor and time-rich attendees can contribute to discussions. The following were the recommended texts for our race and AI discussion.
- Main text: Ruha Benjamin, Race After Technology.
- Journal article: Sebastian Benthall and Bruce D. Haynes, Racial Categories in Machine Learning
- Quick read: Jessie Daniels, Mutale Nkonde, Darakhshan Mir, Advancing Racial Literacy in Tech: Why Ethics, Diversity in Hiring and Implicit Bias Trainings Aren't Enough
- Quick read: Karen Hao, This is how AI bias really happens — and why it’s so hard to fix. MIT Technology Review
- Poetry: Poem performed by Joy Buolamwini, AI, Ain’t I A Woman?
- Talk: Safiya Noble, author of Algorithms of Oppression, provides a short discussion of her book
- Comedy sketch: Full Frontal with Samantha Bee. Correspondent Sasheer Zamata discusses bias
(1) For example, our second book club discussed facial recognition technology. Read about all our book clubs here.