Google’s Racist Search Results
(Algorithms of Oppression, Safiya Noble, New York University Press, February 2018)
The politics of technology design
The internet is not the neutral, unbiased warehouse of all things. Search, for example, is loaded with prejudices ingrained in our culture and history. Safiya Noble, a black feminist, was incensed when she searched “black girls” on Google and came up with listing after listing of porn. All she wanted was some activity suggestions for her stepdaughter and her visiting cousins. She had to close the laptop lid before they saw the results.
As if it weren’t bad enough that only porn resulted, look at Google’s autocomplete suggestions for a search on “women”:
- Women cannot: drive, be bishops, be trusted, speak in church
- Women should not: have rights, work, vote, box
- Women should: stay at home, be slaves, be in the kitchen, not speak in church
- Women need to: be put in their places, know their place, be controlled, be disciplined
The result is Algorithms of Oppression, a six-year project to determine the extent of this poison, how it came to be, and what should be done about it. Noble found western society itself at the heart of it.
Google, incredibly, denies any responsibility. It says its algorithm operates on its own and cannot be trained. This is what we call a lie, as Google has managed to comply with all kinds of European directives against hate speech, the sale of Nazi products, and the right to be forgotten. And magically, the “black girls” search results have been evolving too, as Noble shows in her many screenshots.
We like to believe that what rises to the top in search is whatever is most popular and most relevant. But we fool ourselves. There are classification systems at work, and Noble says blacks have been “contained and constrained” by them. The search “beautiful” returns an endless page of photos of white women. Not Starry Night, Niagara Falls, or the Taj Mahal, but white women. A search for “professor” brings up photos of only white men. And a search for “unprofessional hairstyles for work” shows only women of color. As you might guess, “professional hairstyles for work” shows only white women.
And it’s not as if Google has customized the results according to Noble’s search history. She has spent several years using Google in her pursuit of a doctorate in black feminist studies. And this is how Google profiles her.
Basically, Google’s search algorithm represents the white male view of the world, she says, and brings up results to fulfill that need. Black community or society is simply not part of the equation, and therefore not part of the algorithm. Same goes for women.
Noble has a chapter on libraries, because librarians classify everything. They must, of course, in order for anyone to do any sort of in-depth research. Yet the very act of classification is discriminatory. Irish Catholic, Korean American, black feminist — all are problems looking for homes. Everyone becomes an “objectified symbol” to someone else. Leo Buscaglia spent his life ranting against this. Because of these labels, we think we know something about a person, he used to say, but we don’t at all. This same built-in bias shows up in online search. It is not in any way neutral.
As exhaustive as she has tried to be, Noble made no effort to test the obvious workarounds. Her screenshots do not also show results with Google’s SafeSearch filter on, and she never tried searching with negative terms to exclude the sex listings (“black girls -sex”). It’s almost certainly true that most people can’t be bothered with either of these tactics, but Noble should have included their results.
Unfortunately, she concludes that Google search should be federally regulated. This despite her entire book demonstrating the embedded, if not innate, bias throughout every aspect of western society. It’s not an especially hopeful ending, and it really just skirts the core issue.
We are nowhere near being postracial.