In a recent graduate seminar, an idea stumped all of us sitting in the room. I’ll get to that in a minute.
We were discussing the construction of search engines, primarily focused on Google, and how previous iterations of the service returned results that were racist and oppressive for several search items.
We were discussing the results published by Safiya Umoja Noble in her recently published book Algorithms of Oppression. She was inspired to begin researching the book when she searched the term “black girls”, among other similar phrases, and found that the results were largely sexualized and pornographic.
When the phrase “unprofessional hairstyles for work” was searched, the image results showed Black women with natural hair. The phrase “professional hairstyles for work” showed White women with straight, usually blonde, hair.
If you now search these phrases, the results are “cleaned up” in many regards. There are no longer pornographic results, indicating that the Google employees responsible for the search algorithm have perhaps listened to public complaints and adjusted the technology accordingly.
Misconceptions about Google
In many ways, these changes perpetuate the idea that Google, and similar companies, are “public services” or “public goods.” In other words, they exist to better society by bringing methods for connection and dispersing meaningful information.
While Google may have started out this way, the company has become more than a public service. In many ways, Google dictates how its users operate instead of the other way around.
In addition, Noble notes in her book that many Google users believe the results they see reflect what people search for the most. In reality, this is not true. Results are ranked by complicated algorithms, and companies invest in search engine optimization (SEO) to ensure their pages appear on page one of Google. The results are not necessarily listed in order of accuracy or relevance.
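To make this point concrete, here is a minimal toy sketch of ranking by a weighted score rather than by search popularity. Everything here is hypothetical — the page data, the signals, and the weights are invented for illustration, and real search engines use vastly more complex, proprietary systems — but it shows how a page with heavy SEO investment can outrank a page that matches the query better.

```python
# Toy illustration: results are ordered by a weighted score over several
# signals, not by how often people search for something. The signals and
# weights below are entirely hypothetical.

def rank_results(pages, query_terms):
    """Score each page with made-up signals and sort highest first."""
    def score(page):
        # Relevance signal: how many query terms appear in the page text
        matches = sum(term in page["text"].lower() for term in query_terms)
        # SEO-style signals: inbound links and optimization investment
        return 1.0 * matches + 0.5 * page["inbound_links"] + 2.0 * page["seo_investment"]
    return sorted(pages, key=score, reverse=True)

pages = [
    {"url": "a.example", "text": "professional hairstyles guide",
     "inbound_links": 3, "seo_investment": 0},
    {"url": "b.example", "text": "hairstyles blog",
     "inbound_links": 1, "seo_investment": 2},
]

ranked = rank_results(pages, ["professional", "hairstyles"])
print([p["url"] for p in ranked])  # ['b.example', 'a.example']
```

Note that `b.example` wins despite matching fewer query terms, simply because its SEO-related signals are weighted heavily — a small-scale version of why page-one results are not necessarily the most accurate or relevant ones.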
Within this context, the question that stumped us during class that day dealt with responsibility: who is responsible for the results we see on search engines (for the sake of simplicity, I will focus this discussion primarily on Google)?
People’s reliance on Google
Understanding this responsibility is important: according to Pew Research, in a report published in 2005, 84% of internet users surveyed had used a search engine, with 68% of users feeling that the information they received was fair and unbiased. This survey was conducted 14 years ago and, if redone, would likely yield different results. More internet users have likely used a search engine, and an increased number likely access these searching capabilities daily. Given the state of trust in information, I am curious how many Google users still feel that the information they receive is fair and unbiased.
This Pew study includes a handful of quotes from people who say they don’t use search engines because it is faster to call or contact the business or service they are interested in than to search for it. I am curious how many people would even mention this today.
Our information, about both small and large topics, comes from search engines. According to another Pew study conducted in 2011, many people use Google as a way to access other news sources. Therefore, the results Google News shows users have a real impact: they dictate what news and information searchers will see.
All of this together just further reinforces the complexities of Google’s services. It is a business, first and foremost, that makes money off of advertising. It provides information, good or bad. It can either encourage or discourage different forms of oppression.
The results people get on Google are meaningful.
So, who is responsible?
It is tempting to immediately respond that Google, as the provider of these results, is responsible. They are the ones who own the algorithm and continually tinker with it. But, why should they engage with such thorny issues as factual news and inclusive results? As a business, is there really financial incentive to tinker with the results if people are, for the most part, satisfied enough with the product to continue using it?
Are we responsible for the results? Is the burden of checking for factually accurate, relevant results on the searcher’s shoulders? Should Google users report results that are non-inclusive and inaccurate?
Is it the responsibility of the news organizations most often promoted by Google News to ensure their results remain at the top of Google News?
The answer to this question is remarkably complicated and I am not sure there actually is a concrete, finalized answer (yet). In many ways, it appears to be a shared responsibility of all of us, both those who work at and those who use Google. Google wants to make sure they can stay in business — and users want to make sure they can keep obtaining the most benefit out of whatever products they use.
In this way, I think that users have a large responsibility to critique the flaws built into these products, as Noble did in her book. Google can only stay in business if we continue to use their products, and therefore, we have a large amount of power. Google should then use these complaints and criticisms to further improve their product. Each set of improvements solidifies their position in the future of information-seeking.
According to Google’s website: “Since our search results reflect content and opinions that are already published on the web, in some instances they may surface content that contains biases, negative societal attitudes and practices, or offensive material. If the language of your search query matches very closely with the language used on a more controversial site, you may see that reflected in your results. Such content does not reflect Google’s own opinions, but our belief in open access to information means that we do not remove links to content simply because it contains views or information many people may disagree with.”
The same website lists information about what users can do if they feel that content should be removed because it violates various requirements for content.
In order to fully understand our role within this complex world of modern information-seeking, we first need to understand how the results we see get to our screens. This will provide us the power to effectively critique the system — so that we can help to shape the future of technology in a way that is accurate, effective, inclusive, and honest.
Noble, S.U. (2018). Algorithms of oppression: How search engines reinforce racism. New York, NY: New York University Press.