Digital Decolonization: Native Inclusion in Technology

Victoria Powell
Published in b8125-fall2023
Nov 19, 2023

As an Indigenous person in business, I carry an enormous burden to represent and educate others on my culture. Often, I am the only Native person in the room, on a project, or at an entire company — and in any room I’m not in, Native issues often fall silent.

With Native Americans making up less than 1% of employees at many US companies, and less than 1% of directors in the S&P 500, allyship is incredibly valuable in filling that silence in most boardrooms. Through proactive inclusion in digital transformation initiatives in data and AI, companies can bring Indigenous perspectives into rooms even when Indigenous voices aren't actively present.

Native Allyship through AI

“If you can’t change them, absorb them until they simply disappear into the mainstream culture.” - former US Senator Ben Nighthorse Campbell (of the Northern Cheyenne), describing how Native people understood the intent of termination policies

Historically, Native Americans have battled erasure in many different forms, which has created great sensitivity around honoring appropriate representation. In the 20th century, Indian boarding schools that aimed to “Kill the Indian, save the man” caused the forced removal and deaths of many Native children. This “reeducation” was meant to create “civilized” and “Americanized” Native populations, erasing our culture. Indian termination policies, too, perpetuated the erasure of Native culture well into the 20th century.

And in the 21st century, we are fighting the next frontier: algorithmic erasure.

Artificial Intelligence is trained on the past, a past fraught with sexism, racism, and harmful stereotypes. Studies have already shown that AI systems exhibit these biases, potentially increasing the risk of racial profiling and other systemic harms. Native communities, in particular, did not control their own narratives online until recently, as many Native American reservations historically had extremely limited access to the Internet. Before the COVID-19 pandemic, NPR reported that Tribal lands were the “least connected” regions for high-speed internet, with only half of Natives on reservations having access. Meanwhile, Native fetishization and racist portrayals, such as those in many old Western stories, have saturated popular media. Thus, the very data that AI is trained on will underrepresent Indigenous perspectives and perpetuate harmful stereotypes and biases.

When ChatGPT is asked a very simple question, “What is a Native American?”, the very first paragraph of its answer evokes erasure, using exclusively the past tense and framing Native identity in the Eurocentric context of Christopher Columbus. A Native perspective, on the other hand, would center Indigenous Peoples in the present tense, speak to the cultural community, and frame tribes as sovereign nations.

Today, then, the data fed into AI training pipelines is not sufficient to represent the Native perspective. Yet AI also holds promise. An LLM trained on authentic Native voices has the power to relieve the burden of teaching non-Native peers about important Native issues. It can also help build knowledge within Indigenous communities, which often face limited access to higher education, in areas such as tribal governance and building Indigenous economies. And perhaps, one day, it might support cultural preservation, helping new generations learn Native arts, languages, and history.

Indigenous Implications in Data

The data that feeds LLMs and business decisions does not meet the standard required for ethical Indigenous inclusion. Data collection and representation of Native identities are often mishandled in ways that perpetuate structural racism. During the 2020 presidential election, CNN made headlines for reporting the Indigenous vote, a decisive factor in flipping a battleground state to the Democrats, as “something else,” prompting Indigenous outcry. All too often, Native Peoples are lumped into an “other” bucket, erasing the voices of Native consumers in business decisions.

These categories strip Indigenous data of its humanity and lead to the modern version of erasure that our communities face. By transforming data collection protocols and usage standards, companies can expand the availability of Indigenous insights.

Big data = Big (negative) impact: As company data collections grow, so does the need for dashboards and standardized analysis. In this great reshuffling, the aggregate numbers of larger identity groups make smaller groups like Native populations difficult to visualize on the same chart, or, within a standardized reporting period, the sample size of Native respondents may be deemed too small to report. As big data approaches are standardized, the temptation to sweep small groups into aggregation buckets grows. Retaining or re-introducing small data methodologies is paramount to re-centering Native voices, as the sketch below illustrates.
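
To make the aggregation problem concrete, here is a minimal sketch in Python with pandas. The survey numbers, satisfaction scores, and minimum-sample threshold are entirely hypothetical and do not reflect any real company's data; the point is only to show how a standard “bucket small groups into Other” step dilutes the Native signal, while a small-data alternative keeps it visible.

```python
import pandas as pd

# Hypothetical survey responses; group sizes and scores are illustrative only.
responses = pd.DataFrame({
    "race_ethnicity": (
        ["White"] * 620 + ["Black"] * 180 + ["Asian"] * 150
        + ["Hispanic/Latino"] * 210
        + ["American Indian/Alaska Native"] * 12
        + ["Native Hawaiian/Pacific Islander"] * 15
    ),
    "satisfaction": (
        [4] * 620 + [3] * 180 + [4] * 150 + [3] * 210 + [2] * 12 + [4] * 15
    ),
})

# Common "big data" step: collapse any group below a minimum sample size
# into an "Other" bucket so the standardized dashboard stays tidy.
MIN_N = 50
counts = responses["race_ethnicity"].value_counts()
small_groups = counts[counts < MIN_N].index
bucketed = responses["race_ethnicity"].where(
    ~responses["race_ethnicity"].isin(small_groups), "Other"
)
print(responses.groupby(bucketed)["satisfaction"].mean())
# The Native score of 2.0 is diluted to roughly 3.1 inside "Other" and
# never appears on the dashboard as its own line.

# Small-data alternative: report every group and flag low sample sizes
# instead of hiding them.
by_group = responses.groupby("race_ethnicity")["satisfaction"].agg(["mean", "count"])
by_group["low_sample"] = by_group["count"] < MIN_N
print(by_group)
```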

Identity diversity: Data interpretation often allows only one identity to be represented, listing those who check more than one box as “two or more races.” Yet among those categorized as “two or more races,” nearly 7 in 10 identify as mixed with American Indian. When an algorithm or analysis does not account for multiracial identities, it erases the mixed Indigenous identities created by the history of colonization.
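
As a companion sketch, again with hypothetical respondents and illustrative column names, the snippet below contrasts single-identity coding, which hides mixed Native respondents inside a “two or more races” code, with an “alone or in combination” count that keeps them visible.

```python
import pandas as pd

# Hypothetical respondents; the identity flag columns are illustrative assumptions.
df = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "is_white": [True, True, False, False],
    "is_american_indian": [True, False, True, False],
    "is_black": [False, False, True, True],
})

RACE_FLAGS = ["is_white", "is_american_indian", "is_black"]

# Single-identity coding: anyone with more than one flag becomes
# "two_or_more", and their Native identity disappears from Native counts.
def single_race_code(row):
    flags = [c for c in RACE_FLAGS if row[c]]
    return flags[0].removeprefix("is_") if len(flags) == 1 else "two_or_more"

df["single_race_code"] = df.apply(single_race_code, axis=1)
print(df["single_race_code"].value_counts())
# Respondents 1 and 3, both American Indian, are hidden in "two_or_more".

# Multi-label ("alone or in combination") coding: keep the flags and count
# every respondent who identifies as American Indian at all.
print("American Indian, alone or in combination:", int(df["is_american_indian"].sum()))
```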

Conclusion

Without appropriate consideration and precautions, digital frameworks risk entrenching structural racism. Through intentional Indigenous inclusion and ethical data protocols, however, there is great opportunity to generate widespread knowledge of Native topics and insights for inclusive business decisions.
