Managing Data Bias in AI Technology

Sandeep S Kumar
Razorthink AI
Nov 30, 2018 · 3 min read

Artificial Intelligence (AI) is often considered a tool of the powerful and privileged, and with chatbots and voice assistants being almost exclusively “female”, they present as servile to human needs. They are becoming smarter by the day and more efficient by the hour: Google Assistant can book you a haircut by making a telephone call, and Siri is a fount of information. But as AI moves past the realm of the personal or virtual assistant and into the world of predictive analysis and decision making, the data we feed the machine becomes more important than ever.

Garbage in, garbage out is the motto, and bad or inaccurate data may seem an easy thing to filter. However, unacknowledged biases can affect how AIs evaluate and weight different data points, and this is nowhere more evident than in sectors controlled by specific demographics that may exclude or downgrade others.

Biases in workplaces and in the worlds of tech and finance can skew the data going in, and that skew is perpetuated in the results the AI spits out. With AI adoption in businesses having grown by 60 percent in the past year, identifying and mitigating this “Data Bias” must be a priority to maintain trust.
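A useful first step is simply to audit the training data before any model sees it. The sketch below is a minimal illustration in pandas, assuming hypothetical gender and approved columns in a loan-application table; a real audit would cover many more attributes and far more data.

```python
import pandas as pd

# Hypothetical loan-application data; column names and values are
# illustrative only, not from any real dataset.
df = pd.DataFrame({
    "gender":   ["F", "M", "M", "F", "M", "M", "M", "F"],
    "approved": [0,   1,   1,   0,   1,   0,   1,   1],
})

# Representation audit: how large is each group in the training data?
print(df["gender"].value_counts(normalize=True))

# Outcome audit: do the historical labels already favor one group?
print(df.groupby("gender")["approved"].mean())
```

If the historical approval rates already differ sharply between groups, a model trained on this data will faithfully learn that gap.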

Gender and racial biases are continually being addressed in workplaces across various sectors, and businesses are taking steps to remedy them. But the holders of those biases are still, in many cases, the ones programming the AIs, and even when it isn’t deliberate, bias still leaks through.

Technologist Kriti Sharma recently pointed out that the first wave of virtual assistants reinforced sexist gender roles: “personal assistant” bots such as Apple’s Siri and Amazon’s Alexa have female voices, while “problem-solving” bots like IBM’s Watson and Salesforce’s Einstein have male ones.

Bias doesn’t stop there:

Data is biased. The lack of women in senior positions in the tech industry is an obvious example: only around 5 percent of leadership positions in the sector are held by women, and women of color are even less likely to hold positions of power.

In the world of finance, bias can sneak in unnoticed and cause an AI to weight factors unfairly, leading to loan and mortgage applications being declined for certain types of workers, ethnicities, and age groups (a simple check for this kind of skew is sketched after these examples).

In the job market, AI image-recognition software deluged with gender stereotypes may assume from the images on offer that cleaning and cooking are women’s work, while executive or coaching positions are better served by male candidates. This can affect job-search and job-matching services.
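One common way to quantify the kind of skew described in the finance example is the “four-fifths rule” used in US employment-discrimination guidance: if one group’s approval rate is less than about 80 percent of another’s, the disparity is worth investigating. The sketch below uses made-up approval rates and is a heuristic red flag, not a compliance test.

```python
def disparate_impact(rate_protected: float, rate_reference: float) -> float:
    """Ratio of approval rates between two groups. Values below ~0.8
    trip the 'four-fifths rule' red flag."""
    return rate_protected / rate_reference

# Illustrative numbers only: 54% of one group approved vs. 78% of another.
ratio = disparate_impact(0.54, 0.78)
print(f"Disparate impact ratio: {ratio:.2f}")  # ~0.69 -> worth investigating
```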

Biases can be removed, but only if those doing the programming can be trusted to look for them. That makes roles for underrepresented demographics in programming more vital than ever.
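Removal is itself a technical exercise, not just a hiring one. One well-known pre-processing approach is reweighing (in the spirit of Kamiran and Calders): each training example gets a weight that makes the protected attribute and the outcome look statistically independent. The sketch below is a minimal illustration with hypothetical column names, not the only or best mitigation.

```python
import pandas as pd

def reweigh(df: pd.DataFrame, group_col: str, label_col: str) -> pd.Series:
    """Per-row weight = expected joint probability (if group and label
    were independent) divided by the observed joint probability."""
    p_group = df[group_col].value_counts(normalize=True)
    p_label = df[label_col].value_counts(normalize=True)
    p_joint = df.groupby([group_col, label_col]).size() / len(df)

    def weight(row):
        expected = p_group[row[group_col]] * p_label[row[label_col]]
        observed = p_joint[(row[group_col], row[label_col])]
        return expected / observed

    return df.apply(weight, axis=1)

# Usage (hypothetical columns): pass the weights to the model's fit() call.
# df["sample_weight"] = reweigh(df, "gender", "approved")
```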
