Mirror, mirror on the wall, who’s the fairest of them all? | Unequal Algorithms

Raahil Rai
Roti, Kapda aur Data
3 min read · Sep 18, 2020

Facebook takes a step towards algorithms without racial bias in the US. India should be next.

Image from https://fossbytes.com/amazon-scraps-ai-hiring-software-biased-results-women/

Six decades ago, Harper Lee’s ‘To Kill a Mockingbird’ was published. Through the eyes of a child, she held a mirror up to society and showed us one rife with racial bias. The novel remains a must-read, but today you don’t need a novel to show you that.

Let tech be your mirror. An investigation by ProPublica found that sentencing algorithms are more likely to recommend a higher punishment for black convicts than for white ones — not too dissimilar to Harper Lee’s courtroom. A group of researchers found that on Airbnb, comparable rentals are priced about 12% higher when listed by non-black hosts than by black hosts. And a group of Facebook employees alleged that black users on Instagram are 50 percent more likely to have their accounts disabled for violations.

We believe that people are fundamentally good. How’s that consistent with the data above, you say? Stereotypes. We all have them. Picture a nurse or receptionist. Did you see a woman? Now think of a surgeon.
And why is it that poorer characters in films have darkened faces? Think Hrithik Roshan in Super 30 and Ranveer Singh in Gully Boy. In one experiment, researchers sent CVs with exactly the same qualifications to potential employers, with one difference — one set carried typically white names and the other typically black names. Guess which set received more interview calls?

People stereotype without knowing it. Algorithms observe people’s behaviour. Algorithms inherit stereotypes.

Cue Facebook — they will set up teams to study whether their algorithms are biased. They will investigate whether African-Americans, Hispanics and other minority groups are systematically treated differently — do they see different ads and content suggestions? Do they have their content pulled down unfairly more frequently?

We encourage Facebook and others to wipe our mirrors clean in India too. We know little about whether many vulnerable groups in India have a fair experience online. All we know is that Dalits have the lowest presence on social media, regularly face harassment and slurs online, can be deliberately excluded from ads, and often complain that social media companies are not responsive to their complaints.

But why haven’t Big Tech firms addressed bias in their Indian markets already? Liz Carolan hypothesised that social media firms prioritise their commercial interests in higher-value US customers over their obligations to the far more numerous, but less well-off, Indian customers. That doesn’t need to be true. In fact, there is business value in building a brand that makes everyone’s experience on the internet equal.

In addition to Big Tech, we can all take steps forward. More researchers can study the experience of vulnerable groups online. Product managers and engineers can consult tools like the Ethical Explorer toolkit to avoid bias in building their products. And each one of us can recognise our once-invisible stereotypes. I’m starting with the man in the mirror.
