Bias Manifest

Y-vonne Hutchinson
Published in Project Include
Aug 24, 2016

A few weeks ago, I wrote this piece on bias in product design for MIT’s Technology Review. I opened it by sharing my experience with Snapchat — mainly that I was too black to use it. Since the time of writing, more controversy around the platform has emerged, and I decided to take a deeper dive into the ways in which bias manifests in product design and development.

Today, we recognize that the tech industry has a problem with unconscious and conscious bias. Since diversity numbers were first released in 2014, major tech companies have shown little improvement. Diversity advocates contend that this is because those companies have yet to make diversity and inclusion a priority.

The long-term business case for diversity is well established: diverse teams perform better over time. However, in the rapidly moving field of technology, we are also discovering the short-term case for diversity. Companies need people who can identify blind spots, and the more homogeneous the team, the bigger the blind spot. Companies that exclude are in danger of building products that exclude.

How does bias manifest?

As I see it, bias in products, platforms, and hardware manifests in several key ways:

Exclusion: A company can unintentionally exclude current and potential users from engaging with its product. This was my experience with Snapchat. But Snapchat is not the only tech company that has faced this challenge. Last April, Amazon came under fire when it was discovered that its same-day delivery service excluded black neighborhoods. Such exclusions have also started to emerge in the world of hardware. As virtual and augmented reality technology gains prominence, women have begun to notice that a surprising number of products don’t work for them, because producers have not accounted for differences in physiology.

Silencing: A company can disproportionately target users with certain identities in the way that it removes or shares their content. Social networks operate under the veil of neutrality, but where content is curated and sorted, there is a danger that bias can seep in. For example, Facebook has faced pushback on the content it deems offensive, such as posts supporting Black Lives Matter and breastfeeding, and the content it decides to suppress in its curated news feeds, such as stories from conservative sources. Twitter’s algorithmic timeline and tailored suggestions have also come under scrutiny for their lack of transparency.

Exacerbation: A product or platform can be leveraged by users to amplify or exacerbate pre-existing exclusion and societal injustice. Snapchat’s yellowface and blackface controversies are two such examples. However, on platforms where identification of the user is important and user-to-user interaction is high but accountability is low, discrimination can become especially problematic. Arguably, the most prominent example of this occurred just a few months ago, when Airbnb garnered national attention for its issues with discrimination. A recent study found that Airbnb guests with African-American names were 16% less likely to find lodging, and that hosts who were not African-American could charge 12% more for rentals. Discrimination on the platform isn’t limited to African Americans, however; transgender and Muslim users have also voiced concern.

Such bias also frequently manifests as sustained threats and harassment on online platforms. Twitter, in particular, has struggled to rein in abuse on its site. During 2014’s Gamergate, several women in the gaming industry fled their homes after receiving death threats and having their personal information tweeted publicly. A few weeks ago, the African-American actress Leslie Jones waged a campaign on Twitter to expose the volume of racist and sexist abuse aimed at her. And just this week, a black British woman residing in Ireland, who took over the @ireland Twitter account, was targeted in a campaign of sustained online harassment.

Even software related to the criminal justice system has been found to contain embedded bias. In 2013, a Harvard University professor found that, on Google, 92% of the ads placed alongside searches of black-identifying names suggested a criminal record. But that’s just the tip of the iceberg. In May of this year, investigative journalists found that COMPAS, a software tool used in several states to predict the risk of recidivism during nearly every stage of the criminal justice process, misclassified black defendants as high risk at nearly twice the rate of their white counterparts, while white defendants were nearly twice as likely as black defendants to be misclassified as low risk. Ultimately, when controlling for other major factors, black defendants were 77% more likely to be assigned higher risk scores than white defendants.

Where are some of tech’s blind spots?

Products are made by people, and bias can become embedded in a product or platform at any point in the development process. However, a few critical blind spots stand to impact products the most.

Thinking of the User: The tech industry often thinks of itself as neutral and, to a certain extent, thinks of its users the same way. However, the presumption of neutrality, particularly in the way that users are conceptualized, is false. Identity-neutral design approaches have a tendency to default to dominant influences and assumptions when considering the user. Recall VR: this is likely why companies design products that only work for men even though they are marketed to both men and women.

Insufficient misuse-case models: Those who do not commonly experience bias or discrimination may have a harder time predicting the ways in which it manifests, and the likelihood that it will. For example, Airbnb may have been genuinely shocked when the extent of discrimination on its platform emerged, if only because the majority of Airbnb employees would never have predicted it, having never encountered that kind of bias in their daily lives.

Problematic Data: Algorithms are not inherently biased, but the data they are trained on may be. For example, facial recognition algorithms trained on white faces have a harder time distinguishing features on Black or Asian ones. Famously, last year, the Google Photos app came under fire for mislabeling black people as gorillas.
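To make this concrete, here is a minimal sketch in Python of how that happens. It is a toy model with made-up data (not Google’s or anyone else’s actual pipeline): a standard learning algorithm, trained on data where one group vastly outnumbers another, ends up far less accurate for the underrepresented group.

```python
# A toy illustration (not any real company's system): a neutral learning
# algorithm trained on skewed data produces skewed error rates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, direction):
    """Generate n labeled points. The label depends on `direction`,
    which differs between groups -- a stand-in for, say, different
    facial features or physiology."""
    labels = rng.integers(0, 2, n)
    # Class centers sit along the group's own discriminative direction.
    X = rng.normal(0.0, 1.0, (n, 2)) + np.outer(labels * 2 - 1, direction)
    return X, labels

# Group A dominates the training data 95:5 -- the "blind spot".
Xa, ya = make_group(950, direction=np.array([1.5, 0.0]))
Xb, yb = make_group(50, direction=np.array([0.0, 1.5]))
model = LogisticRegression().fit(np.vstack([Xa, Xb]),
                                 np.concatenate([ya, yb]))

# Evaluate on fresh, equally sized samples from each group.
for name, direction in [("A", np.array([1.5, 0.0])),
                        ("B", np.array([0.0, 1.5]))]:
    X, y = make_group(2000, direction)
    err = float(np.mean(model.predict(X) != y))
    print(f"group {name} error rate: {err:.0%}")
# Typical result: group A errs around 7%, group B well over 30% --
# the same algorithm, applied identically, fails one group far more.
```

Note that the algorithm never sees a group label. The disparity in outcomes comes entirely from the disparity in the data it was given.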

Unconscious bias woven into the DNA of a firm can be passed down from a team to a product and, ultimately, to the user. These blind spots have consequences: low or absent adoption, premature market saturation, emboldened competitors, and broken products can all result when an organization’s blind spots are passed down to its products.

There is an argument to be made that diversity supports sustainability not just in the long term, but also in the short term. Nearly half of all new businesses survive their first five years. In contrast, almost 90% of tech startups fail within their first 18 months. An estimated 42% of tech startup failures can be attributed, at least in part, to a fundamental misunderstanding of the market. Though tech startups will always be riskier than other types of new businesses, perhaps some of those risks can be better mitigated, earlier on, through increased diversity.

A company that is able to build diverse teams capable of recognizing gaps in organizational thinking, and empowered to call them out, is much less likely to have a crippling blind spot than one with homogeneous teams. An organization that sees, understands, and is able to respond to its market is more likely to thrive than one that can’t.

Special thanks to the ladies at Project Include, without whom this piece would not have been possible. Their ideas, insights, and support continue to inspire.
