A screenshot of Google search with the query “is google oppressi” and the suggestion “what is oppression google scholar”
Don’t change the subject Google.

Noble’s “Algorithms of Oppression” indexes search engine bias

Amy J. Ko
Bits and Behavior

--

I spent this summer reading a lot about race and technology (McIlwain, Eubanks, Benjamin, Costanza-Chock, and more). Most of my reading has been broadly scoped, focusing on all of technology and its historical and present-day interaction with race. In some ways, this has been transformative, giving me an entirely new lens with which to think about the past and the present, and my role in it. But in other ways, this has been overwhelming, since it’s meant grappling with the entire history of technology and race in the United States, particularly computing.

To give myself a break—if I can call it that—recently I turned my attention to a more narrowly scoped book, Safiya Umoja Noble’s Algorithms of Oppression: How Search Engines Reinforce Racism. Dr. Noble is a professor at the UCLA Department of Information Studies, where she directs the Center for Critical Internet Inquiry. She is a colleague in one sense, as we’re both professors at information schools. But in other ways, we are worlds apart. Her background was in library and information studies; mine was computer science. She’s had eclectic academic appointments in African American Studies, Information Science, and Gender Studies across three institutions; I’ve only ever been at an information school. She identifies as Black; I as White and Asian. And yet, through the wonders of interdisciplinarity, I find myself reading her work, building upon it, and teaching it.

The book is focused in that it interrogates search engines in particular. But in its focus, it is also quite expansive, considering search from angles of race, gender, information, culture, history, and politics. The result is a book that is a bit of a maze: it moves swiftly from topic to topic, giving just enough depth to grasp her social critique before swiftly turning a corner into another thicket of sociopolitical complexities behind keywords, indices, and cataloging. I found each section of the book to be part social critique and part policy proposal, mixing examination with manifesto. And despite its unique structure, by the end, I left it with a strong image of the larger architecture of its social critique.

And that critique is as follows. First, the thesis: search engines, by virtue of indexing words, and creating a singular ubiquitous gateway to the meaning of those words, wield great power to shape meaning in society by associating those words with other ideas. In wielding this power, search engine companies have done little to resist or reframe associations between phrases such as “black girls” and the racist, sexist ideas that appear throughout society, especially those in pornography, that are associated with those phrases. Noble argues that instead of resisting these associations, through a combination of capitalist motives, ignorance, and disregard for harm, search engines have generally amplified these ideas, and claimed the power to further define them. Noble’s prescription is relatively simple: search engine giants should relinquish power back to oppressed groups, so that they may continue the already exigent work of fighting for racial and gender justice without the added burden of also having to fight big tech. In simpler words: Google, if you’re not going to help, get out of the way.

Reflecting back on landmarks in the book, three connections stuck out to me. The first was the intimate connections that Noble made between library and information science (LIS), Google, and search engines. As a computer scientist, I’ve come to expect someone with an LIS background to present LIS as a discipline disrupted and oppressed by computing, talking about how Google repurposed the citation analysis ideas from LIS in ways that perpetuated racism and sexism. But instead, Noble implicates LIS in addition to CS, showing how the sexist and racist associations that Google has amplified for these phrases are just as present in the core ideas of indexing, cataloging, and knowledge organization, embedded in classification systems used throughout history, and still present in our most prominent institutions, such as the Library of Congress. There is no room in this book for disciplinary tribalism.

A second connection, which I found fascinating, was Noble’s analysis of right to be forgotten laws, which are intended to empower individuals to remove information about themselves from the internet. Here, she does something exciting, which is position the role of Google in retrieving sexist and racist information alongside its role in archiving information. Here again is a connection to a core area of LIS, but it is the juxtaposition that is new: who gets to decide what information about us as individuals persists is intimately connected to our collective ideas about how we are described. Noble shows how Google’s role in this is monolithic.

The third connection that resonated with me—surprisingly but richly described in the conclusion—was a study of a Black small business owner and her interactions with Yelp. Noble provides an extended narrative about Yelp’s disintermediation between the business owner and their customers. The business owner described a long history of forming direct relationships with her patrons—she knew who they were and they knew who she was—and how much meaning that brought to running her business. When Yelp came to town, suddenly the front door to her business wasn’t her smiling face, but a faceless website and its reviewing regime. In this new world, she had no control over how her business was represented online—unless she paid, and even then payment bought only so much control. This story, more than anything in the book, spoke to the role that search engines are playing in society: they are a way of taking, centralizing, and privatizing control over what our identities, names, and ideas come to mean to the broader public.

Of course, the book covers far more ground than these three convey; these are just a few of dozens of sights in the book’s dizzying multifaceted social critique of search. But it does so by returning to the same theme of power throughout, each time demonstrating how some aspect of search engines and their creators take power and then use it in destructive and irresponsible ways that do little but enrich their creators. This is a familiar critique from the many other works on race and technology that I’ve been reading, but one richly described in the context of one specific and particularly important information technology.

While I’m inclined to agree with the critique, the computer scientist in me also wonders whether the critique of search engines was broad enough. Yes, search engines shape the meaning of words, destructively. And as Noble points out, so do the information systems that came before them, including classification systems and other forms of knowledge indices. But the very function of language itself is also complicit: when we create words together, we co-create their meaning, and so in a way, all information technology suffers this same original sin of language. Perhaps it is through their abstractions that we erase difference, make assumptions, create stereotypes, and encode bias. Google may just be the next stage of innovation in a long history of passive linguistic violence.

Does this mean Google should contritely slink away? If the recent Justice Department lawsuit against Google is any sign, it may be forced to, in some way, at least in the United States and European Union. After all, such a concentration of power is inevitably abused. The question is whether the United States, in clawing back this power, does so in an anti-racist way, or only a capitalist one.

--


Professor, University of Washington iSchool (she/her). Code, learning, design, justice. Trans, queer, parent, and lover of learning.