Are We Heading Toward the Digital Disenfranchisement of Indigenous Peoples?

Firoze Alam
5 min read · Aug 14, 2019


“We can judge our progress by the courage of our questions and the depth of our answers, our willingness to embrace what is true rather than what feels good.” ― Carl Sagan

June is a special month for celebrating diversity in Canada: it is when we mark both Pride and National Indigenous Peoples Day. I am a proud Torontonian, and I want to take this opportunity to think together with you, the reader, about diversity issues in the tech industry, focusing on Indigenous Peoples.

The age of the algorithm has made its way from Hollywood fiction into the machinery of decision making that significantly impacts our lives. These decisions range from the length of a prison sentence (COMPAS, a risk-assessment algorithm used by the state of Wisconsin) to the premiums we pay for health insurance. Simply put, we now live in an algorithm-assisted (and, in some cases, algorithm-dictated) world. It is critical for us to reflect on who codes the algorithms that can so significantly impact individuals. The central thesis of this article is that if the ubiquity of algorithms in our lives is a fact, then the people who are regularly pushed to the margins must be brought to the centre and given a say in designing these powerful algorithms.

In her book “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy,” Cathy O’Neil, mathematician and former Director of the Lede Program in Data Practices at Columbia University, shows how algorithms reinforce discrimination under the guise of machine neutrality. If an algorithm deems a Black woman too risky based on the postal code of her residence, she can be denied a much-needed loan.

O’Neil asserts that training data is one of the critical determinants of the decisions made by algorithmic models. If the training data is biased (as numerous studies have found), algorithms will further consolidate the discrimination that has shaped our history for centuries. As the saying often attributed to Winston Churchill goes, history (in this case, the data that trains algorithmic models) is written by the victors. I ask the reader a simple question: do the victors keep an accurate record of their victims (women, queer people, people with disabilities, people of colour, Indigenous Peoples, and other marginalized groups) to train their tools (algorithms) to ensure fairness? O’Neil argues that algorithms are opinions embedded in code. Diversity in tech can help ensure that code is written responsibly and that training data sets are improved; because tech lacks diversity, responsible coding and better training data are deprioritized, leading to discriminatory outcomes. A small sketch below illustrates the mechanism.
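To make this mechanism concrete, here is a minimal sketch with entirely synthetic data (the scenario, numbers, and variable names are my own illustration, not from O’Neil’s book): a model trained on historically biased loan decisions learns to penalize a postal code, even though the code itself says nothing about creditworthiness.

```python
# Minimal sketch, synthetic data: a model trained on biased history
# reproduces that bias under the guise of machine neutrality.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 10_000

# postal_code == 1 marks a hypothetical redlined neighbourhood.
postal_code = rng.integers(0, 2, n)
income = rng.normal(50, 15, n)  # income in thousands

# Historical approvals depended on income AND, unfairly, on postal code:
# half of the qualified applicants from the redlined area were denied.
qualified = income > 45
unfair_denial = (postal_code == 1) & (rng.random(n) < 0.5)
approved = (qualified & ~unfair_denial).astype(int)

# Train on the biased history.
X = np.column_stack([income, postal_code])
model = LogisticRegression(max_iter=1000).fit(X, approved)

# Two applicants with identical income, different postal codes:
applicants = np.array([[55, 0], [55, 1]])
print(model.predict_proba(applicants)[:, 1])  # approval odds drop for code 1
```

The point of the toy example is that postal code never needed to be a sensitive attribute for the model to discriminate; the bias rides in on the historical labels.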

The founder of Girls Who Code, Reshma Saujani, believes “Coding is the language of the future, and every girl should learn it.” In April 2019, US lawmakers introduced the Algorithmic Accountability Act, a bill that would require companies to examine their machine-learning-powered systems for embedded biases. Mutale Nkonde, an A.I. governance expert, has been one of the key contributors in shaping the bill. In a recent article, Nkonde shows the sheer absence of diversity in the A.I. field: Google’s machine intelligence team lists 893 people, only one of whom is a Black woman, and Facebook’s lists none. Marginalized groups need the capability to demand algorithmic accountability and to open up the black box of algorithmic decision making in order to ensure equity and fairness. One simple form such an audit could take is sketched after this paragraph.
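What might “examining for biases” look like in practice? One of the simplest audits, offered here purely as an illustration rather than anything the bill prescribes, is the “four-fifths rule” from US employment law: compare favourable-outcome rates across groups and flag ratios below 0.8. All group labels and decisions below are hypothetical.

```python
# Hedged illustration of one simple bias audit: the "four-fifths rule"
# compares favourable-outcome rates across groups. Data is hypothetical.
import numpy as np

def disparate_impact(decisions: np.ndarray, groups: np.ndarray,
                     protected: str, reference: str) -> float:
    """Ratio of favourable-outcome rates: protected group vs. reference."""
    rate_protected = decisions[groups == protected].mean()
    rate_reference = decisions[groups == reference].mean()
    return rate_protected / rate_reference

# 1 = favourable decision (e.g., a loan approved by some black-box model).
decisions = np.array([1, 0, 1, 1, 1, 0, 1, 1, 0, 1])
groups = np.array(["a", "b", "a", "a", "b", "b", "a", "a", "b", "a"])

ratio = disparate_impact(decisions, groups, protected="b", reference="a")
print(f"disparate impact ratio: {ratio:.2f}")  # < 0.80 flags adverse impact
```

An audit this simple requires no access to the model’s internals, which is exactly why accountability advocates can demand it even of a black box.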

Let me now turn to the representation of Indigenous Peoples in tech, and to my own ignorance about Indigenous Peoples. I knew very little about Indigenous Peoples when I came to Canada in 2014. As a Social Justice graduate student, I had the privilege of learning from many Indigenous scholars in Canada, and I was baffled by the depth of my ignorance. What I found equally baffling is that many of my Canadian friends and a large number of immigrants shared it. I learned about the history of Canada’s Residential Schools, designed to ‘kill the Indian in the child’: the violent history of the forced European assimilation of Indigenous Peoples. I had the privilege of attending conferences of Indigenous scholars in Canada. I met survivors of Residential Schools; we wept together, mourning the loss of Indigenous languages, cultures, and histories.

The lack of gender diversity in tech is well known, and when it comes to Indigenous Peoples, the statistics can get a little too painful to process. I wanted to understand what the diversity data says about Indigenous Peoples, so I dug into the diversity reports published on the websites of the following tech giants and found their Indigenous representation to be:

- Google (2019): 0.8%
- Apple (2018): 1%
- Amazon (2019): 0.7%
- Airbnb (2018): 0.24%
- Microsoft (2018): 0.2%

Indigenous Peoples are the least represented group in tech.

Indigenous women are further marginalized within this extreme minority. These numbers speak louder than any voice ever could: Facebook and Netflix could not publish any figure with even small numeric significance. In diversity discussions, the beauty of data is that it tells the story bluntly, without euphemism or sugarcoating. And while Indigenous populations might be small as a percentage, the data sets often used to train machine learning systems can end up erasing them completely; the main defence against this is having people from those communities visible in tech, as the sketch below illustrates.
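To see how a statistically small group can be “erased,” consider a minimal sketch with synthetic data (a deliberately naive baseline, not any specific company’s system): on a data set where the group of interest makes up 1% of examples, a model that optimizes only overall accuracy can ignore that group entirely and still look excellent on paper.

```python
# Minimal sketch, synthetic data: optimizing overall accuracy on
# imbalanced data can erase a small group while the metrics look great.
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(0)
n = 10_000
y = (rng.random(n) < 0.01).astype(int)  # the group of interest is ~1%
X = rng.normal(size=(n, 5))             # features are beside the point

# A baseline that always predicts the majority class.
clf = DummyClassifier(strategy="most_frequent").fit(X, y)
pred = clf.predict(X)

print(f"accuracy:        {accuracy_score(y, pred):.3f}")  # ~0.99
print(f"minority recall: {recall_score(y, pred):.3f}")    # 0.0 (erased)
```

A team with no one from the affected community has little reason to look past the 99% accuracy; visibility in the room is what gets the second metric checked at all.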

We are living in a world that is becoming digital at an exponential pace, and the digital divide disproportionately impacts Indigenous Peoples. Centuries of European settlement have brutally subjected Indigenous Peoples to all kinds of dehumanizing discrimination. While physical European colonialism has ended in most parts of the world, it is very much an ongoing project in places like North America. If the world we are creating and curating is becoming increasingly digital, it is crucial that we do not push Indigenous Peoples into ‘digital disenfranchisement’ once again, through discriminatory algorithms and lack of access to a growing and influential set of industries. Professor Donna Haraway, a feminist scholar in the field of science and technology studies, writes in her book Staying with the Trouble, “it matters what stories we tell to tell other stories with… It matters what stories make worlds, what worlds make stories.” I think it is high time we asked ourselves: are we conscious of Indigenous Peoples as we write (code) the stories (algorithms) that make our digital world?

In 2015, Justice Murray Sinclair said, “What took place in residential schools (in Canada) amounts to nothing short of cultural genocide.” The findings of the six-year-long Truth and Reconciliation Commission report, which included the testimonies of almost 7,000 witnesses, echo Justice Sinclair’s statement. I recognize that the wound is fresh and the shame almost insurmountable, but we must remember: if we do not actively voice our concerns about the painful absence of Indigenous Peoples in tech, we are on our way to repeating this history in the digital realm. In the words of Elie Wiesel, Nobel laureate writer and Holocaust survivor, “We must take sides. Neutrality helps the oppressor, never the victim. Silence encourages the tormentor, never the tormented.”


Firoze Alam

I write to think clearly! I prefer the discomfort and challenge of not knowing over seeking the comfort of easy answers.