Data Helps Remove Bias, But What About Algorithmic Bias?

Divya Gupta
Marketing in the Age of Digital
4 min read · Apr 19, 2020

As marketers, we learn the importance of data every day and are surrounded by it every waking hour. We are data-obsessed and rely on it heavily for our plans and results. I, for one, strongly believe that data is the ultimate game changer. From what a customer is looking for, to behavioural trends, to campaign engagement, to brand awareness, essentially everything can be decoded with the help of the right set of data.

Human biases are contagious. Can they be transferred to algorithms as well?

The harsh truth is that we humans are a basket of biases built up over our lives. Aren’t we?

But has anyone ever considered bias at the source? Can algorithms have biases encoded in them too?

For the longest time, algorithms were presumed to be neutral and objective. The more shocking discovery came when it emerged that biases can be framed into algorithms, making them discriminate in an automatic and hidden manner that is far harder to detect.

The problem doesn’t lie in the data; it lies in the algorithm!

As the name suggests, algorithmic bias is a set of systematic, repeatable errors in the underlying code that result in unfair outcomes, for instance privileging one arbitrary group of users over others. It has only recently been addressed in legal frameworks, such as the European Union’s General Data Protection Regulation, which came into force in 2018.
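
To make that concrete, here is a minimal, hypothetical sketch in Python (all names and numbers are invented, not any company’s real system): a model trained on historically skewed approval decisions simply learns to repeat that skew, even when both groups have the same financial background.

```python
# Hypothetical sketch: bias in historical decisions transfers to the model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Two made-up demographic groups with the same income distribution.
group = rng.integers(0, 2, n)
income = rng.normal(50, 10, n)

# Biased history: group 1 was approved less often at the same income level.
past_approval = (income + np.where(group == 1, -8, 0) + rng.normal(0, 5, n)) > 50

# Train a simple model on that biased history, with group as a feature.
X = np.column_stack([income, group])
model = LogisticRegression().fit(X, past_approval)

# The model now predicts lower approval rates for group 1, despite equal incomes.
for g in (0, 1):
    rate = model.predict(X[group == g]).mean()
    print(f"group {g}: predicted approval rate = {rate:.2f}")
```

Simply dropping the group column wouldn’t fully fix this either, since other features can act as proxies for it, which is exactly why these biases stay hidden.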

Humans’ inherent prejudice can easily seep into the way they frame and write an algorithm’s code. Thus, artificial intelligence software carries reflections of human bias within its own interfaces. It doesn’t seem so surprising now, does it?

“What algorithms are doing is giving you a look in the mirror. They reflect the inequalities of our society,” says Sandra Wachter, an associate professor in law and A.I. ethics at Oxford University.

One of the most worrying revelations has been about alleged biases built into the algorithm that Apple uses to calculate creditworthiness for its credit card, which reportedly gave higher limits to men than to women with the same financial background. The claims were serious enough that the New York State Department of Financial Services opened an investigation. Furthermore, AI services from both Amazon and Google have failed to recognize ‘hers’ as a pronoun while correctly spotting ‘his’.

It’s better to uncover algorithmic bias than to knowingly ignore it.

These biases shouldn’t exist in the first place, but then again, ‘to err is human’.

Thus, if and when you find inconsistencies in the data, they shouldn’t be ignored. Once uncovered, measures should be taken quickly to rectify these mistakes. Of course, it’s not that easy. Constant work is happening in this field to find and remove such bias faster, and to increase transparency and accountability in how companies use algorithms.
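
One simple place to start is an outcome audit. Below is a rough Python sketch (the column names and the 0.8 “four-fifths” threshold are assumptions for illustration, not a compliance standard): compare outcome rates across groups and flag large gaps for a closer look.

```python
# Rough sketch of an outcome audit: compare rates across groups and flag gaps.
import pandas as pd

def audit_outcomes(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.DataFrame:
    rates = df.groupby(group_col)[outcome_col].mean().rename("outcome_rate").to_frame()
    best = rates["outcome_rate"].max()
    # Ratio of each group's rate to the best-treated group's rate.
    rates["impact_ratio"] = rates["outcome_rate"] / best
    # Flag groups falling below the illustrative four-fifths threshold.
    rates["flagged"] = rates["impact_ratio"] < 0.8
    return rates

# Toy example: group B converts (or gets approved) far less often than group A.
df = pd.DataFrame({
    "group": ["A"] * 100 + ["B"] * 100,
    "approved": [1] * 60 + [0] * 40 + [1] * 35 + [0] * 65,
})
print(audit_outcomes(df, "group", "approved"))
```

A flag on its own is not proof of bias, but it tells you where to dig.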

Every dataset has drawbacks!

How we use the data is essential. My advice is to watch out for the gap between your data and the real market, and to keep evaluating for inaccuracies by comparing your data against its real-world repercussions. Keep updating your algorithms to account for existing flaws (if any) and for new technological advancements.

For instance, in 2015, Amazon faced allegations that its same-day delivery service discriminated against people of colour. Amazon was completely unaware of the situation and, in complete honesty, issued a statement: “We don’t know what our customers look like.” The company selects same-day delivery areas based on cost and benefit calculations. But what it failed to acknowledge was that the algorithm left certain ZIP codes out, effectively catering the service to more affluent neighbourhoods while leaving out many communities of colour.
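
If you run this kind of area-selection logic, a quick sanity check is to join your eligibility list against neighbourhood demographics. The sketch below is entirely hypothetical (invented ZIP codes and invented census figures, not Amazon’s actual system), but it shows the shape of the check:

```python
# Hypothetical coverage check: do the excluded areas skew towards certain communities?
import pandas as pd

# Placeholder service map and census-style statistics (all figures invented).
service = pd.DataFrame({"zip": ["10001", "10002", "10003", "10004"],
                        "same_day_eligible": [True, True, False, False]})
census = pd.DataFrame({"zip": ["10001", "10002", "10003", "10004"],
                       "pct_minority": [0.20, 0.25, 0.70, 0.65],
                       "median_income": [95000, 88000, 41000, 39000]})

merged = service.merge(census, on="zip")
# Average demographics of eligible vs. excluded ZIP codes; a stark gap is a red flag.
print(merged.groupby("same_day_eligible")[["pct_minority", "median_income"]].mean())
```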

The business world is actively adopting automated decision-making, with big data and artificial intelligence at its core. But we can’t blindly rely on just any set of data, because it can carry embedded prejudice or misrepresent behavioural and consumption trends. Thus, the first and foremost step is to do your homework and get educated about the industry.

As marketers, we need to use data mindfully: not as the sole factor in a decision, but as a means to guide us towards an informed and educated one. The mantra is to use the right set of data coupled with personal intuition, cultural nuance and marketplace insight. Period.

Until next week :)
