Google’s biased algorithm is more powerful than personalization

If you are a Euphoria fan, I have a question for you: how many episodes were in Season 2?

If you can think backward through each episode — the two-episode season finale built around the play, Rue’s police chase, tensions rising within the Cassie-Nate-Maddy love triangle over several episodes, and the New Year’s Eve party of the premiere — you’ll arrive at the correct count: 8. If you google it now, the search results will say 8. But as the episodes were airing, the answer Google gave me when I checked how many episodes I had left to savor was 10. This misinformation led to an embarrassing moment when the dramatic first part of Lexi’s play concluded and, despite the unanimous feeling that the show had reached its climactic stopping point, I chirped to my fellow viewers, “Good thing there’s still three more episodes!” Google — a source whose trustworthiness has been endorsed by society, educators, and peers — was blatantly, objectively wrong. It turns out that this mistake about a TV show’s episode count is only one instance of Google’s sketchy reporting. Google’s inaccuracy stems from bias in the design of its search algorithm, combined with personalization that steeps people in content agreeable to them — a combination whose consequences are far more severe and widespread than a moment of embarrassment.

In this blog, I argue that Google is not a trustworthy source of information because its search algorithm is designed to uphold the biases present in physical society, which in turn limits what personalization can do. To make Google’s search results more reliable, it is imperative that the algorithm’s designers consult a diversity of experts on each subject to compile a truthful account of that topic.

In “Algorithms of Oppression: How Search Engines Reinforce Racism,” Safiya Umoja Noble explains that Google’s search engine is biased by the racial hierarchies and power systems of the racialized physical world. The abhorrently pornographic responses to her search query for ‘black girls’ provide evidence of racism in the search algorithm, reflecting society’s stereotyping of black girls as the hypersexualized Jezebel caricature (Noble 92). Perpetuating these stereotypes is incredibly harmful to black women’s ability to construct their own identities: viewed through this stereotypical lens, they are forced to operate under “disabling distortions” that hinder their pursuit of social change (Noble 94).

Several scholars have noticed the threat that Google’s biased search algorithm poses to the pursuit of justice. University of Michigan professor Andre Brock said that the Internet “represents and maintains White, masculine, bourgeois, heterosexual and Christian culture through its content” (Noble 92). The white-washed nature of the internet suppresses the truths of marginalized groups by “prioritizing and normalizing” whiteness as “the presupposition for the allocation of resources and content” (Noble 92). Because Google’s search algorithm heavily favors biased viewpoints, it forces content creators to perpetuate racism and sexism in order to thrive. Search engine optimization tools are increasingly used by businesses to get noticed by the algorithm, which in turn shapes the content and offerings of those businesses (Noble 84). Google’s algorithmically selected keywords affect not only the allocation of valuable digital resources, such as advertising placement on the SERP, but also tangible resources, suppressing marginalized people’s businesses with an algorithm that is inept at rewarding people of color for their work.

Despite this large failure to understand and provide culturally relevant knowledge about marginalized people, Google is still positioned as the “gatekeeper” of information. It is the largest commercial search engine in the world, catering to 700 million people — approximately half of the world’s Internet users (Noble 21). The massive number of people depending on Google for truthful information makes the need to reverse its racist search algorithms urgent. Otherwise, the misrepresentation and oppression of marginalized groups will continue in the global digital realm.

On top of inherently biased algorithms, personalization — the customization of information to an individual user’s specific interests — contributes to biased search results and prevents Google from being a truthful source. At the same time, the mechanisms of personalization are forced to operate under the search algorithm, limiting their capability to combat racism on an individual basis. Taking Noble’s approach of testing Google by simply searching for girls of different races, I began my examination of personalization within the confines of the biased algorithm by interpreting my search results through the lens of my own knowledge and beliefs (which Google’s listening devices are probably aware of).

To briefly summarize, I found that my results for ‘white girls’ were sexual, critical, and mocking; my results for ‘black girls’ were celebratory and related to pop culture; and my results for ‘Asian girls’ were virtually blank. All of these results reveal the biases of the larger society at play.

For black girls, the SERP surfaced a song by female rappers, images that accurately portrayed a diversity of black girls, and a link to Black Women Who Code. My interests seemed accurately reflected: the SERP mirrored my taste for rap music, my admiration of black female artists for their confidence, and even my interest in data science. This SERP seemed highly personalized because it reflected black media that was palatable to me (or to Google’s perception of me as a white woman). However, I knew the page was not fully representative of the true experiences of black women. In fact, the seemingly positive content I was presented with could be a product of racist viewpoints. According to Dr. Ruha Benjamin, “hypervisibility of Black celebrities, athletes, and politicians can mask the widespread disenfranchisement of Black communities” (Benjamin 14). Therefore, my SERP for black girls may have only coincidentally aligned with my interests while being primarily a function of a racist algorithm. Another piece of evidence supporting the biased view of the results page was the alarming absence of a single link to a Black Lives Matter website, despite the movement being a prominent current event and social justice issue in the black community. These search results were personalized to my interests, yet they also reflected biases apparent in society.

Google search results for ‘black girls’ reflect society’s overindulgence in celebratory black media to avoid confronting black issues. © Google Screenshot — Madeline Trumbauer, 2022

Upon searching for ‘white girls,’ I interpreted the search results as sexualized, critical, and mocking. I noticed that ‘short-shorts’ appeared as a suggested refinement in the images tab, and so did ‘red,’ an indication of the sexualization and exotification of red-headed women. There were also hints of criticism of white-girl culture: several links surfaced for a non-fiction book titled ‘White Girls’ by Hilton Als that critiques the privilege of white women. And there were tones of mockery, as the film ‘White Chicks’ appeared in the results, along with images of white girls in sorority groups — which are notoriously mocked as ‘cult-like’ despite simply being a pillar of traditional, organized friendship among women.

Google search results for ‘white girls’ © Google Screenshot — Madeline Trumbauer, 2022

The search results I received for ‘white girls’ were insulting and inconsistent with my beliefs. As a white girl myself, I felt that aspects of my being were underrepresented by the SERP. Sure, it’s true that some white girls join sororities, have blonde hair, and smile nicely for the camera, but there are also white girls who are smart. There are white girls who are athletic, emotional, mean, gross, angry, talented — the list goes on. Google’s presentation of white girls was only as complete as the patriarchal standard of a ‘white girl.’ The only personalized thing about these results was that I’m aware of the sexist standard to which I am held in a patriarchal society. They seemed less born of personalization to my interests and more consistent with a sexist algorithm.

When I searched for ‘Asian girls,’ I received several results for stock images of Asian girls. This struck me as the most straightforward response to my search. The photos weren’t sexual or emotionally charged; they were neutral images of Asian women doing various things — smiling, jogging, working. The fact that Google had so little to say about Asian girls — or about my interest in Asian culture — was surprising. I participate in a lot of Asian culture: I cook and buy Asian food, I watch Korean films, and I’ve even watched plenty of Asian-starring US movies and TV shows, like Bling Empire. As with my search for ‘white girls,’ these results under-represented my conception of the cultural group at hand. They seemed oppressive on Google’s part because they highlighted no part of Asian culture, whether truthful or stereotypical. These bland Shutterstock photos of Asian women spoke volumes on Google’s behalf. It was as if the SERP were communicating that there was nothing interesting or noteworthy to report about Asian women — an absence of representation that reflects the real, white-washed world.

Google search results for ‘asian girls’ © Google Screenshot — Madeline Trumbauer, 2022

Reflecting on my search experiment, I concluded that modern-day Google searches are influenced more by our real-world racist and sexist society than by personalization. My results for black girls proved the most balanced mixture of personalization and majority biases, whereas my results for white girls and Asian girls correlated strongly with the biases held against those groups in society and aligned least with my own personal interests. Although both forces were apparent, my experiment forced me to confront the daunting truth that even personalization is not strong enough to drown out the biases of our racialized society.

One way to counteract the creation and preservation of bias through algorithms is simply to hire more diverse people to design them. White and Asian men have long dominated the tech industry: 75% of Google’s tech employees are men, as are 67% of Google’s employees overall (Distribution of Google employees worldwide in 2021, by gender and department). They maintain their standing by denying any role in the existing demographic disparity, taking a “colorblind” approach to racial issues (Costanza-Chock 73). Consequently, patriarchal privileges have become just as apparent in the digital sphere as in the physical realm. According to Noble, when men alone create algorithms, they do so “to the exclusion of all women, especially Black women” (Noble 84). Clearly, men alone have not succeeded at conveying truth on the web and should not be the only ones entrusted to keep trying. Having the voices of minority groups in the design space will help represent the truths of marginalized people.

Research shows that design and product teams composed of diverse workers better understand potential customers (Costanza-Chock 75). Although several methods exist to represent marginalized people in the design room, such as user personas, they are not sufficient replacements for directly involving those people. Attempts by ableist, hegemonic designers to imagine the needs of marginalized people are not credible because it is too difficult to fully understand “the lived experiences of others” (Costanza-Chock 84). It is imperative that marginalized designers lead conversations in the design process to ensure these groups are accurately represented.

In conclusion, Google is not an arbiter of truth because its search algorithm is influenced by a biased society. The built-in bias is more powerful than any personalization at play, because personalization occurs within the limits of a white-washed media landscape. A way to mitigate the bias, therefore, is to involve a more diverse group of designers in the creation of these algorithms.
