Search engines like Google have tremendous power in shaping our views on what is important, relevant, credible, popular, and accurate. According to a 2012 Pew study, 91% of search engine users say they always or most of the time find the information they are seeking when they use search engines, and 83% name Google as their preferred search engine.
Most people see search engines as digital libraries where we can find the most relevant and useful information in the least amount of time. All of us have experience with the usefulness of “googling it,” but to see Google as a public resource obscures its place as a multinational ad broker. Because of the contemporary ad-centered media business model, the design and outcomes of tech privilege dominant narratives about the world. And in a world where dominant narratives can — and often do — have oppressive effects on marginalized groups, digital tools like search engines can reinforce current and historical inequalities.
Digital tech provides new ways in which powerful individuals and corporations from elite populations can essentially “own” culture (i.e. the digital representation of a specific culture or its individual aspects) through search engine optimization. It is with these ideas in mind that we need to think about how search engines like Google can unknowingly promote colonial ways of ownership, or what human rights and tech lawyer Renata Avila calls “digital colonialism.”
To illustrate this point, look at the Google search results for “Ubuntu.”
Ubuntu is a Zulu word that roughly translates to the idea of universal connectivity — that our individual fates are bound to the collective and vice versa. It is an African way of understanding the world through communal power, accountability, and responsibility. Within the African diasporic community, Ubuntu is frequently translated as “I am because you are.”
After clearing my search engine history and cookies on my browser, I did a Google search for “Ubuntu.” The top result links to Ubuntu, an “open source software operating system that runs from the desktop, to the cloud, to all your internet connected things.” On Ubuntu’s site, the word is described as an “ancient African word meaning ‘humanity to others’.” The company describes its mission as both social and economic, stating that it aims to bring “the spirit of Ubuntu to the world of computers and software.”
When I looked at Google’s suggestions for the keyword “Ubuntu,” the search rendered phrases such as “Ubuntu download,” “Ubuntu server,” “Ubuntu 18.04,” “Ubuntu 17.10,” and “Ubuntu Linux.” The long-tail query “Ubuntu meaning” was the sixth suggestion.
If we accept the prevailing idea of Google Search as a fair, democratic, and objective platform that gives you the most important, relevant, credible, popular, and accurate information, we’d also have to accept a couple of conclusions from these search results:
- the most relevant thing about “Ubuntu” is that it is the name of an operating system.
- the origin and meaning of the word “Ubuntu” is the sixth most relevant thing about it.
Regardless of how you assess Ubuntu’s social and economic mission, the fact remains that this tech company ostensibly controls the digital representation of a concept fundamental to many African and African diasporic philosophical and spiritual traditions. And because search optimization is a key component of any tech company’s marketing strategy, it is in their economic interest to do just that.
We don’t have to ascribe nefarious intent to Ubuntu the company to acknowledge that guiding users to their site and products (as opposed to resources from African scholars explaining the concept of ubuntu) is in their business interest. We can also see how Google Search becomes a tool that facilitates and augments this process.
This is why it’s necessary to think about how search engines reinforce colonial ideas about ownership. Through a colonial gaze, resources, whether literal (land once held by indigenous peoples) or cultural (ideas or practices such as “Ubuntu”), can be claimed if they are not “owned.”
When arguments over claims of cultural appropriation arise, the most common rebuttals are questions about “who owns culture.” Without being skeptical of the intent behind them, these calls to acknowledge the complexity of debates on cultural appropriation hide the fact that we have pretty much decided who gets to own culture — individuals and corporations. The fact that a tech company has more power to shape the common user’s relationship to “Ubuntu” than the Africans and Afro-descendant people from whom it originates is not a static phenomenon, but the product of both contemporary and historical belief and value systems that make it so.
If Google Search is ultimately a democratic tool, who from the African diaspora can petition to control their cultural image on digital platforms? If they have a “legal” standing to make this petition, where would it be directed, to Google or Ubuntu? Can only those who “own” the word “Ubuntu” make this petition? How does “who owns culture” discourse arise in relation to cultures of people of color and indigenous peoples, particularly Black people, in ways it does not for white, Western culture? How can the individualist and libertarian impulse of the tech world continue a form of digital colonialism? And how does our increasingly tech-driven world muddy this discourse?
We need to think about how the digital representation of marginalized groups is increasingly being dictated by powerful and influential companies.
In her book Algorithms of Oppression: How Search Engines Reinforce Racism, USC professor Safiya Noble discusses how society’s biases about women and people of color are reproduced in search engine results. For example, in 2011, “Sugary Black Pussy” was the top hit when you searched “black girls.” This reflects both the porn industry’s imperative to influence search results associated with women and girls, and historical narratives painting Black women as hyper-sexual objects. UCLA professor Ramesh Srinivasan’s Whose Global Village? discusses how the prevailing idea of the internet as a utopian, global democracy blinds us to the ways it marginalizes indigenous communities (but also how these communities use tech to promote their culture and combat their marginalization).
Though seen as ubiquitous in places like the U.S., global internet connectivity is far from equal. Most people in the world do not have smartphones, and do not use the internet every day. So how do prevailing ideas about the democratic potential of the internet and search engines blind us to the ways these tools privilege the values, beliefs, ideologies, and ontologies of the Western world?
Though these are serious topics that deserve thoughtful study and research, they show how digital tools can reinforce oppressive social structures. Thinking critically about the search results for “Ubuntu” reveals how free market beliefs and values are encoded into search engines, and how users are, often unknowingly, accepting the value systems of corporate and colonial individualism and neoliberalism.