3 ways your service could be racist

Caroline Butler · Published in Spotless Says · Sep 23, 2020 · 6 min read

Over the last few months, we’ve experienced global political upheaval that has made issues of race more prominent in the public’s awareness than ever before. I have seen friends, colleagues and celebrities struggle with the heavy burden of communicating and evidencing that racism takes many forms, both explicit and implicit.

We’ve been brought up to believe that racism is an event, not a system.

Mostly, the misunderstanding lies with us being brought up to believe that racism is an event, not a system. That being racist is to be consciously and intentionally mean to someone based on the colour of their skin or their ethnic background. That racism is done by ‘bad’ people. This understanding protects the system of racism and fails to acknowledge that it is the structures of society, its services, policies and products, made by ‘good’ people, that cause harm at scale, perpetuate racism and protect those who benefit from the status quo.

We need to go beyond the privilege of educating ourselves about racism and actually change the systems and services that maintain the status quo. Photo credit: James Eades

Giving tangible examples of systemic racism can be difficult. Not all racial inequality is equal. Racism is felt on many levels and not everyone’s experience is the same. Ethnicity, religion, skin colour, gender, class, education, income, place of residence, level of English, country of origin, the country’s history and visa status are contributing factors that can make one person’s experience radically different to another.

As designers and researchers, we need to be aware of the systems we work within and how we help organisations perpetuate them. We need to educate ourselves so we can consciously work towards reducing and eliminating the problem.

To begin this education for myself, and perhaps for you too, I’ve compiled three examples drawn from the experiences and use cases documented by others, and from my own experience of navigating services on behalf of my family.

1. Your service doesn’t listen

Being able to use a Google Home or Amazon Echo device would be a game-changer for my elderly mother. During lockdown and in more normal times, she needs an easy and accessible way to order essential items. However, she struggles with touch and voice-enabled devices. She has weak eyesight, suspected Parkinson’s and speaks English with an accent that reflects her Mauritian heritage.

Many people with a range of different global accents face barriers using voice-enabled interfaces. Some users can adapt their behaviour: finding somewhere without background noise, speaking more slowly and loudly, lowering their pitch and enunciating words to be more clearly understood. Those who cannot adapt become excluded from accessing useful functionality, information and essential goods.

Speech recognition is becoming more prevalent — but are the systems designed for all voices? Photo credit: Fabian Hurnaus

As speech recognition becomes more prevalent in cars, job hiring, immigration, law and healthcare, there is potential for users to face greater inequality or harm through not being understood. The Guardian reported a story where an Irish woman failed to convince a machine that her spoken English was good enough to meet the criteria to permanently live in Australia. A Stanford study reported that Amazon, Google, Apple, Microsoft and IBM’s speech recognition software made nearly 50% more errors when transcribing African American voices compared with white American voices of the same age and gender, speaking exactly the same words.

There has been a lack of diversity in the training data for speech recognition software. A.I. has failed the intelligence test by not learning the complexities of language, which include gender, age, accents, dialects, voice patterns, speech impediments and illnesses like colds and sinus problems. Professor Sharad Goel, who oversaw the Stanford research, recommends independent audits to regulate the tech giants so they create software systems and products that are accessible and safe for all.
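To make that recommendation concrete, here is a minimal sketch of the kind of per-group check such an audit might run: compare average word error rate (WER) across demographic groups and flag large gaps. The group labels and transcripts below are hypothetical placeholders, not data from the Stanford study.

```python
# A minimal sketch of a per-group speech recognition audit:
# compute word error rate (WER) separately for each speaker group.

def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# Hypothetical audit data: (speaker group, human reference, machine transcript)
samples = [
    ("group_a", "order more rice and lentils", "order more rice and lentils"),
    ("group_a", "call the pharmacy tomorrow", "call the pharmacy tomorrow"),
    ("group_b", "order more rice and lentils", "order more eyes and lentils"),
    ("group_b", "call the pharmacy tomorrow", "fall the pharmacy to borrow"),
]

# Average WER per group: a large gap between groups is the signal
# an independent audit would flag.
per_group: dict[str, list[float]] = {}
for group, ref, hyp in samples:
    per_group.setdefault(group, []).append(word_error_rate(ref, hyp))

for group, rates in per_group.items():
    print(f"{group}: mean WER = {sum(rates) / len(rates):.2f}")
```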

2. Your service judges the past

The ‘Windrush’ generation got caught up in a present-day system that judges people on their past. This generation arrived from the Caribbean between 1948 and 1973 and were deemed British subjects. Little or no documentation was required, and children travelled on their parents’ passports.

Jump to 2012, and the Home Office launched a policy aimed at making the UK hostile for undocumented migrants. The devastating outcome was that families who had resided in the UK for 40 to 60 years were torn apart by deportation and detention, and denied any legal rights.

Government services may be evolving to be joined up, but they still have a long way to go, as the Home Office policy demonstrates. Many straddle different eras and different government departments, with limited channels for completing a task.

An example of a multiple-era government service is the double joy of navigating inheritance tax and probate. Mostly this can be done online, if the eligibility criteria are met. If they are not, the user is pushed onto the unhappy path of the much longer, more complex, more taxing (excuse the pun) and time-consuming paper form, which requires tax expertise and a lot of patience to fill out.

You can get screened out of the shorter digital form by failing to meet this one criterion: the deceased and the deceased’s spouse or civil partner have not always been domiciled in the UK.

The question is designed to uncover the tax implications of having lived abroad, but there is no guidance on which countries this applies to, how long you need to have lived outside the UK, or what the cut-off period is. Does this criterion apply equally whether you’ve spent the last six months or the last 65 years working, paying taxes and raising your children in the UK?
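To illustrate the problem, here is a hypothetical simplification (not the actual HMRC logic) of how a single yes/no domicile question decides the route, ignoring all the context that might actually matter:

```python
# A sketch of how a blunt eligibility rule screens users out of the digital
# route. The rule and field names are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Estate:
    deceased_always_domiciled_in_uk: bool
    years_resident_in_uk: int  # context the screening question never considers

def digital_route_available(estate: Estate) -> bool:
    # The single yes/no answer decides everything: 65 years of UK residence
    # and six months are treated exactly the same.
    return estate.deceased_always_domiciled_in_uk

print(digital_route_available(Estate(False, 65)))  # False -> long paper form
print(digital_route_available(Estate(False, 1)))   # False -> long paper form
```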

3. Your service has been designed by default

Companies and institutions have been very slow to appreciate and celebrate that humans come in varying skin tones and with different hair textures. White skin has been the assumed default.

Tesco started selling plasters in different skin tones in February this year. In some public toilets, automatic taps and soap dispensers are unusable for those with darker skin tones because they fail to trigger the sensor.

The first emojis were released as part of an iPhone software update in Japan in 2008. In 2012, emojis became more mainstream with the release of iOS 6.0. People of colour emojis were launched in 2015.

The first emojis were released in Japan in 2008. People of colour emojis were launched in 2015.
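Technically, the 2015 change did not ship separate ‘people of colour’ emojis: Unicode 8.0 introduced five Fitzpatrick skin tone modifier characters (U+1F3FB to U+1F3FF) that are appended to a base emoji. A minimal sketch, with an assumed base emoji chosen purely for illustration:

```python
# How skin tone support works at the text level since Unicode 8.0 (2015):
# a Fitzpatrick modifier character follows a base emoji.
BASE_WAVE = "\U0001F44B"  # 👋 waving hand

FITZPATRICK_MODIFIERS = {
    "type-1-2": "\U0001F3FB",
    "type-3":   "\U0001F3FC",
    "type-4":   "\U0001F3FD",
    "type-5":   "\U0001F3FE",
    "type-6":   "\U0001F3FF",
}

for name, modifier in FITZPATRICK_MODIFIERS.items():
    # Renders as a skin-toned wave on platforms that support the modifiers.
    print(name, BASE_WAVE + modifier)
```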

The implications of not designing for all skin tones and appearances can manifest themselves in limiting education and career opportunities. Emma Dabiri, author of the book Don’t Touch My Hair, gives examples of black children being excluded from school because their hairstyles do not conform to school uniform policy. The alternative is spending hours grooming their hair to meet European hair standards.

The biggest challenge now is how to increase diversity not only in front of the camera but behind it.

Test cards created by Kodak featured white female models set against blocks of colour: grey, blue, red and yellow. These became known as ‘Shirley cards’ and were used to calibrate light, shade and skin tones, providing a way to ensure good results in photography, film and printing. The widespread practice of using these cards created a technical bias towards white skin, with black skin often under- or overexposed. Whole industries like fashion, film and TV have evolved around white models and actors, resulting in stories and imagery dominated by white culture.

Technical advances, adapted processes and make-up tricks mean that the technical obstacles can be removed and black skin can be shown in its best light. The biggest challenge now is how to increase diversity not only in front of the camera but behind it.

So where to begin?

As these three examples demonstrate, racism is deeply rooted in systems, services and products. We can prioritise a wider representation of users and have more diverse design teams, but most importantly we need to be comfortable having uncomfortable conversations to help eliminate our blind spots. Once we are aware of our ignorance, it’s what we do next that matters most. As in all good system and service design, it’s about error recovery. How we recover is how we make progress towards meaningful change.

👏 if you enjoyed the post!

Want more? The Spotless team shares more insights on the blog Spotless Says.

Caroline Butler · Spotless Says · Creator of meaningful interfaces, experiences and services