Top tech executives testified on Capitol Hill this week about foreign influence and alleged liberal bias in how their products operate. Absent from the hearings was a senior executive from Google, which was notable given the president’s recent claims that Google “rigs” its search engine, biasing results in favor of liberal outlets — what he describes as misinformation.
But based on a study I conducted on the sociological relationship between partisanship and news and information, I would argue that it’s profoundly problematic to lump together questions of foreign influence with an accusation that conservatism is being silenced. Doing so conflates a thwarted attack on the U.S. democratic process with anecdotal evidence that a political ideology is being coded out of the algorithmic design of Twitter, Facebook, and Google.
Such confusion highlights one of the central findings of my recent study: most of us don’t understand how Google works. We fail to realize how much keywords matter — what we type into Google shapes the results we receive.
For example, during my study, the president made remarks about the NFL protests, arguing that ratings had declined because fans objected to the players’ actions. If you Googled “NFL ratings down” on January 25, 2018, the top headlines indicated that NFL viewership had declined that season. Fox News and Washington Times headlines and teasers drew a direct connection between the decline in NFL ratings and the anthem protests. However, Googling “NFL ratings up” on that same day returned dramatically different results. These links claimed that despite Trump’s remarks, fans were still supporting the NFL.
Essentially, both “facts” were available via Google, depending on what you searched for to begin with.
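The mechanics are easy to sketch. In the toy retriever below — with invented headlines and a deliberately crude scoring rule, nothing like Google’s actual ranking system — opposite queries surface opposite “facts” simply because each query’s keywords match a different set of pages:

```python
# Toy keyword-matching retriever. Headlines and scoring are invented
# for illustration; real search engines weigh hundreds of signals.

HEADLINES = [
    "NFL ratings down for the season amid anthem controversy",
    "NFL television ratings down nearly 10 percent",
    "NFL ratings up in key markets despite protests",
    "Fans still watching: NFL ratings up against expectations",
]

def search(query):
    """Rank headlines by how many query keywords they contain."""
    terms = query.lower().split()
    scored = []
    for headline in HEADLINES:
        words = headline.lower().split()
        score = sum(1 for t in terms if t in words)
        if score:
            scored.append((score, headline))
    return [h for _, h in sorted(scored, reverse=True)]

# The same index answers both queries -- with opposite top results.
print(search("NFL ratings down")[0])  # a "ratings down" headline
print(search("NFL ratings up")[0])    # a "ratings up" headline
```

Neither answer is a lie; each is just the best keyword match for what the searcher typed.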
People routinely put their faith in Google to find out about central political issues and decide who to vote for. Often, they search on political topics to validate or question news from other sources. They believe Google is giving them unbiased and accurate results, weighing facts instead of rank-ordering results that match the entered keywords. But they’re wrong: Google doesn’t weigh political bias; it weighs factors such as what words appear in the article or headline, how many people link to it, and what words people used in their search.
Whether the reader returns to the results page and clicks on other links also matters. If more people click on the third link than the first or second, Google’s algorithm will shift accordingly. Google’s results are thus a kind of instant poll of public opinion about which news the public believes is most worthy of attention. That is not to say that this instant poll isn’t problematic; it’s simply indicative of the broader social structures that shape what we think we know about the world.
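That feedback loop can be modeled as a simple re-ranking: each click nudges a result’s score upward, so a heavily clicked third link can climb past the first two. This is a deliberately simplified model of the behavior described above, not Google’s actual algorithm:

```python
# Sketch of a click-feedback loop: accumulated clicks boost a result's
# score, reordering the list over time. Purely illustrative.

def rerank(results, clicks, boost=1.0):
    """Reorder (url, relevance) pairs by relevance plus click boosts."""
    def score(item):
        url, relevance = item
        return relevance + boost * clicks.get(url, 0)
    return sorted(results, key=score, reverse=True)

results = [("a.com", 3.0), ("b.com", 2.5), ("c.com", 2.0)]
clicks = {"c.com": 5, "a.com": 1}   # users favored the third link

print([url for url, _ in rerank(results, clicks)])
# c.com now ranks first: 2.0 + 5 > 3.0 + 1 > 2.5
```

In this sense the ranking really does behave like an instant poll: what people click is folded back into what the next searcher sees.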