Exploring the Tech Risk Zones: AI Bias

Omidyar Network
Sep 29 · 4 min read

By Kacie Harold, Omidyar Network


Earlier this year, we released the Ethical Explorer Pack — a toolkit designed to help tech workers spark dialogue about the future impacts of their designs, identify early warning signs, and brainstorm positive solutions. This week, we are sharing additional, practical advice from leading experts on how to mitigate several of the tech risk zones (or potential downsides) and ensure technology is safe, fair, and compassionate.

Safiya Noble is an Associate Professor at UCLA who specializes in algorithmic discrimination and the ways in which digital technologies reinforce or compound oppression. She co-directs UCLA’s Center for Critical Internet Inquiry, and her book Algorithms of Oppression: How Search Engines Reinforce Racism has been invaluable for us in understanding how tech is implicated in a variety of civil and human rights issues.

Professor Noble, what advice would you give to technologists who are just starting to think about whether their AI systems might be perpetuating harmful biases?

Read. At the Center for Critical Internet Inquiry, we have a list of the top 15 books at the intersection of racial justice and technology. People can get educated on their own; they don’t need to go back to school. Managers can make these required readings and give people time at work to read.

Normalize these conversations, and give people common books and articles to talk through together. When you are just starting, it’s good to establish common vocabulary and points of reference because it’s difficult to learn when people are talking from different knowledge frames. The Ethical Explorer (Tech Risk Zone) cards are a great resource for this; teams can bring a different question each week to discuss at a brownbag lunch.

Bring in experts. We know that outside of the workplace, broadly in society, we do terribly with conversations about justice, race, gender, sexuality, power, and class. We are so unclear about what we are talking about when we have these conversations. Sometimes it’s also easier to hear these things from someone outside of your team. It is unfair to put the onus of leading these conversations on the only women or people of color on your team.

Get the C-suite connected, and signal that this is a priority. Don’t put it on the lower level managers; there has to be a commitment from the top.

Are there any common roadblocks where people or teams get stuck when talking about AI bias or ways their technology may perpetuate discrimination?

In my experience, it’s often non-programmers on the team, such as UX designers, who bring these issues forward and recommend solutions. However, on teams, people who do not do the actual coding are often subordinate to those who do.

As a manager you have to build teams that allow the best ideas to rise to the top, prioritizing collaboration and equal power of different kinds of expertise. People with graduate degrees in African American Studies, Ethnic Studies, Gender Studies and Sociology–people who are deep experts in society–should be on these teams and hired as equals so they are co-creating as equals and there is not always privileging of the programmers’ point of view. Establishing this kind of camaraderie helps us to let go of the limits of our work and be more open to improving it.

I think it’s hard for people to grasp the ways the technology they build and use may impact their lives in the future. How do you get people to remove themselves a little bit from the moment of excitement of “Oh, we can do this” to step back and ask, “Should we do this”?

Imagine what it is going to be like when everything you have ever done on the Internet gets run through a series of algorithms that decide your worthiness to participate in certain aspects of society. That is coming, and it is already happening. Banks are already assessing applicants and their credit worthiness based on their social networks. The difference will be that 20 years from now, children will be born into those systems right from the beginning. So if you are born into a caste system or born working class or poor, that is the social network you will inherit. This is frightening. We must acknowledge that building blocks for that future are being developed today.

Are there any promising developments that you are seeing around mitigating bias and discrimination caused by AI?

I think we are entering a new paradigm of concern about harm. A decade ago we weren’t in that place; now we have normalized these conversations, and so many people are invested in talking about the harms and dangers of technology. That in itself is really big to me.

It’s kind of like when a new moral code is introduced to a large enough dimension of society that you can create leverage for a different possibility. One thing we have to do is get a critical mass in the workforce, on all of the teams, who can talk about these issues.

We often think of change as something that happens when one great leader comes along to marshal it. But I don’t think that’s how change happens. I think we should be hopeful that we can make change and that every conversation we are in matters. It can be a product team of five or six people that brings something into the world in a very big way; let’s not underestimate the power of these teams.

You can find Safiya’s book, Algorithms of Oppression: How Search Engines Reinforce Racism, here. Follow her on Twitter at @safiyanoble.

A social change venture investing in the creation of more inclusive and equitable societies.
