Why Non-Biased AI Doesn’t Exist

Therese Mannheimer
Grace Health Insights
Sep 10, 2020 · 5 min read

In the present moment, nothing is more apparent than the acute need to actively counteract persistent prejudices, false information and stigma wherever we see them. And in building for a new generation of internet users, we must seriously contemplate how our biases implicitly or explicitly shape the technology we leverage and what principles should guide us. Read the thoughts of Grace Health founder and CEO Thérèse Mannheimer below.

Automating manual tasks to enable scale is no small feat. We know this. But when dealing with health data it is an even more daunting task, and to make sure we are on top of our game, we often discuss and problematise how to manage the implications of transferring tasks from a human to a machine. Below is our approach at Grace Health. It’s not the truth, but an insight into our effort to keep the dialogue sober and realistic.

With a vision to improve women’s health across the world, we have our work cut out for us. By chatting with our automated health assistant, women can track and better understand their period, get friendly notifications and predictions about their cycle, and get answers to the most common health questions. Next up, we will also provide access to medical assistance in the privacy of her own phone and connect the dots, with products and services all the way to her doorstep (last month we launched pharmacy delivery for our users in Accra, Ghana). When reaching out to, and hoping to connect with, a global market of women, scaling is of the utmost importance. This is why utilising tech, in our case AI, to reach as many as possible is a no-brainer. Easy in theory, difficult in practice.

Now to our point.

People in the West have a tendency to view their perspective as the right one and the way everyone should live, which of course is ignorant and wrong. However, there are perspectives and ideologies that markets could usefully adopt from one another, and our stance is that a more liberal approach to education and rights around sexual and reproductive health is one of them.

Let’s get back to the title of this piece, “Why non-biased AI doesn’t exist”. So what is even ‘non-biased’? The term unbiased literally means not biased: in short, neutral, as in not taking sides, whereas non-biased means completely free from bias. To be non-biased, you have to be 100% fair; you can’t have a favourite or opinions that would colour or shape your judgment. Artificial intelligence (AI), on the other hand, is an area of computer science that emphasises the creation of intelligent machines that work and react like humans, replicating behaviour such as problem solving, reasoning, perception and planning. These are all traits that humans hone over time, largely shaped by our experiences and notions. Also known as bias.

Machines can be taught to act and react like humans only if they have abundant information about the world. Artificial intelligence models must be given access to objects, categories, properties and the relations between them in order to implement knowledge engineering. Here’s where I argue that the bias slips in.
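To make that concrete, here is a minimal, hypothetical sketch of what encoded knowledge can look like. It is not Grace Health’s actual data model; the topics, categories and rules below are invented purely for illustration. The point is that every key, label and relation is a human judgment call.

```python
# A minimal, hypothetical sketch of "knowledge engineering" for a health
# assistant. None of this is Grace Health's actual data model; the names,
# categories and cut-offs are invented to show where human judgment
# (i.e. bias) enters the encoded knowledge.

# Objects and their properties: someone chose these categories and rules.
TOPICS = {
    "irregular_cycle": {
        "category": "common_concern",   # judgment: "common" vs "serious"
        "advice": "Cycle length often varies; track it for a few months.",
        "escalate": False,              # judgment: when to refer to a clinic
    },
    "heavy_bleeding": {
        "category": "needs_follow_up",
        "advice": "Persistent heavy bleeding is worth discussing with a clinician.",
        "escalate": True,
    },
}

# Relations between objects: also a human decision about what relates to what.
RELATED = {
    "irregular_cycle": ["stress", "contraception"],
    "heavy_bleeding": ["anaemia"],
}

def answer(topic: str) -> str:
    """Return advice for a topic, applying the escalation rule encoded above."""
    entry = TOPICS.get(topic)
    if entry is None:
        return "I don't have information on that yet."
    suffix = " Please consider seeing a clinician." if entry["escalate"] else ""
    related = ", ".join(RELATED.get(topic, [])) or "none recorded"
    return f"{entry['advice']}{suffix} (Related factors: {related})"

if __name__ == "__main__":
    print(answer("heavy_bleeding"))
```

Swap a label, add or drop a relation, or change when escalation kicks in, and the assistant behaves differently. The bias sits in the structure of the knowledge itself, not only in how the machine processes it.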

IBM states on their website: “AI systems are only as good as the data we put into them. Bad data can contain implicit racial, gender, or ideological biases. Many AI systems will continue to be trained using bad data, making this an ongoing problem. But we believe that bias can be tamed and that the AI systems that will tackle bias will be the most successful.” We agree, with the emphasis on tamed.

When a service or machine is developed by a human, you automatically transfer not only biases and prejudice, but also your whole value base and perception of the world. What is right? To whom? When? Where? When you automate human behaviour, it is almost impossible not to teach the machine to mimic human behaviour and deduction principles, which in turn implies bias.

Well, is bias always detrimental? We don’t necessarily think so. Bias and predisposed opinions skew the way we make decisions, but they also give us a framework for how to make sense of the world. We’re not necessarily trying to answer the ethical question but rather shine a light on the complexity and potential of using AI to replicate humans, and why the discourse is needed from time to time.

We base our company and product on three key ideas:

Every person has the right to make informed decisions for herself

Every person has the right to love who they want to

Rape or violence is never ok. It’s criminal and should be reported

These are basically predisposed opinions on what is right and wrong; in other words, bias.

In the case of Grace Health, we want the woman, HER, to decide what will and will not happen to her. To choose who she trusts and who she doesn’t. This is, unfortunately, not the standard for millions of women around the world, and the sheer thought of it can be perceived as foreign, sometimes even uncomfortable or alien.

This is OUR paradigm. It is the foundation on which we make our decisions. It is the reason why we exist, a belief system, if you will. It informs and helps us decide what content to write and what features to create. This belief system is based on a set of “rights” and “wrongs” and, therefore, on biases. And since we are creating a service using a machine learning model that bases its assumptions on this truth, the service is inherently biased. The difference here is that the bias is chosen and conscious, not random or incidental.
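To illustrate the difference between incidental and chosen bias, here is another small, hypothetical sketch (again, not our actual implementation; the keywords and wording are invented) of how an explicitly stated value can be encoded as a deliberate rule that takes precedence over whatever a learned model would otherwise answer:

```python
# Hypothetical sketch: a deliberate, documented value encoded as a rule that
# takes precedence over any model-generated reply. The trigger words, the
# fixed message and the fallback are invented for illustration only.

DELIBERATE_RULES = [
    # (trigger keywords, fixed response): each entry is a conscious choice.
    (("rape", "assault", "violence"),
     "Violence is never ok. It is a crime and should be reported. "
     "Would you like information about support services?"),
]

def reply(user_message: str, model_reply: str) -> str:
    """Apply the chosen rules first; fall back to the learned model's reply."""
    text = user_message.lower()
    for keywords, fixed_response in DELIBERATE_RULES:
        if any(word in text for word in keywords):
            return fixed_response   # chosen, conscious bias
    return model_reply              # whatever the trained model produced
```

A rule like this is documented, auditable and easy to debate, which is exactly what we mean by a bias that is chosen and conscious rather than random or incidental.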

Uncomplicated? No. Inserting this bias brings along other types of negative and involuntary bias based on norms, e.g. the hetero norm, the couple norm, the freedom norm and so on, and this is a continuing part of our challenge. On top of this, we always need to question ourselves and the values around us.

Building the first digital women’s health clinic for the next billion users, Grace Health wants to enable access not only for the women of today but also for generations to come.

Interested in reading more about our work on bias and AI? We’ve been working with some of the experts in the field on a project funded by the Swedish Innovation Agency, Vinnova; read more here.


Delivering access to women’s health at scale as CEO and founder of Grace Health