Top tech executives testified on Capitol Hill this week regarding the role of foreign influence and liberal biases in how their products operate. Absent from the hearings was a senior executive from Google, which was notable given the president’s recent claims that Google “rigs” the results of its search engine, biasing results in favor of liberal outlets — what he describes as misinformation.
But based on a study I conducted on the sociological relationship between partisanship and news and information, I would argue that it’s profoundly problematic to lump together questions of foreign influence with an accusation that conservatism is being silenced. Doing so conflates a thwarted attack on the U.S. democratic process with anecdotal evidence that a political ideology is being coded out of the algorithmic design of Twitter, Facebook, and Google.
Such confusion highlights one of the central findings of my recent study: that most of us don’t understand how Google works. We fail to realize the importance of keywords and that what we type into Google shapes the kinds of results we will receive.
For example, during my study, the president made remarks about the NFL protests, arguing that ratings had declined due to fans protesting the players’ actions. If you Googled “NFL ratings down” on January 25, 2018, the top headlines indicated that NFL viewership had declined that season. Fox News and Washington Times headlines and teasers explicitly insinuated a connection between a decline in NFL ratings and the anthem protests. However, Googling “NFL ratings up” on that same day returned dramatically different results. These links claimed that despite Trump’s remarks, fans were still supporting the NFL.
Essentially, both “facts” were available via Google, depending on what you searched for to begin with.
People routinely put their faith in Google to find out about central political issues and decide who to vote for. Often, they search on political topics to validate or question news from other sources. They believe Google is giving them unbiased and accurate results, weighing facts instead of rank-ordering results that match the entered keywords. But they’re wrong: Google doesn’t weigh political bias; it weighs factors such as what words appear in the article or headline, how many people link to it, and what words people used in their search.
Whether the reader returns to the same search results page and clicks on other links also affects the rankings. If more people click on the third link than the first or second, Google’s algorithm will shift accordingly. Google’s results are thus a kind of instant poll of public opinion about which news the public believes is most worthy of attention. That is not to say that this instant poll isn’t problematic; it’s simply indicative of the broader social structures that shape what we think we know about the world.
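To make the dynamic concrete, here is a deliberately simplified sketch, in Python, of the three signals described above: keyword match, inbound links, and click feedback. This is a toy model for illustration only, not Google’s actual algorithm; the articles, weights, and field names are all hypothetical.

```python
# Toy ranking model illustrating the signals described in the text.
# NOT Google's actual algorithm: the weights and data are invented.

def score(article, query_terms, clicks):
    """Score an article for a query using three simplified signals."""
    text = (article["headline"] + " " + article["body"]).lower()
    # 1. Keyword match: how many query terms appear in the article text.
    keyword_hits = sum(term.lower() in text for term in query_terms)
    # 2. Popularity: how many other pages link to this article.
    link_count = article["inbound_links"]
    # 3. Click feedback: how often past searchers chose this result.
    click_share = clicks.get(article["url"], 0)
    return 3.0 * keyword_hits + 0.01 * link_count + 2.0 * click_share

def rank(articles, query_terms, clicks):
    """Return articles ordered from highest to lowest score."""
    return sorted(articles,
                  key=lambda a: score(a, query_terms, clicks),
                  reverse=True)

articles = [
    {"url": "a", "headline": "NFL ratings down this season",
     "body": "Viewership declined.", "inbound_links": 500},
    {"url": "b", "headline": "Fans still supporting NFL, ratings up",
     "body": "Fans keep watching.", "inbound_links": 450},
]

# The phrasing of the query, not any editorial judgment, decides the winner.
top_for_down = rank(articles, ["NFL", "ratings", "down"], {})[0]["url"]
top_for_up = rank(articles, ["NFL", "ratings", "up"], {})[0]["url"]
```

Even in this crude model, searching “NFL ratings down” and “NFL ratings up” surfaces different top results from the same pool of articles, and a burst of clicks on a lower-ranked link would push it upward — the “instant poll” effect described above.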
Phrasing matters. Take, for example, two very similar searches surrounding an advertisement paid for by Americans for Prosperity, a conservative political advocacy group funded by the Koch brothers. The ad that repeatedly aired on television and Facebook argued that the Democratic candidate for governor (Ralph Northam) was incompetent because he had “approved the spending of $1.4 million in taxpayer money to a fake Chinese company with a false address and a phony website.”
If you Googled “Northam fake Chinese company” on January 25, 2018, you were provided articles from the Richmond Times-Dispatch and the Washington Post that summarized the claims in the ad. But those more interested in fiscal responsibility might have focused on the monetary figure repeatedly used in the Republican candidate’s ads and rallies.
By simply adding “$1.4 million” to the search on the same day, Google returned dramatically different, conservative-leaning content. The top result was an opinion piece by the Republican Governors Association; the second link was an op-ed by a conservative politician. A few hits down was a direct link to the organization that paid for the ad, and following that was a link to Fairfax Underground — a forum that frequently claims the Democratic Party is trying to “break the back of white, middle-class America” by “importing millions of brown people to dilute white votes and remove Christianity from the public square.”
If anything, this search leans conservative and verges on racist misinformation, burying liberal perspectives.
My research demonstrates that Google can actually drive the public toward a silo of conservative thought. For example, users curious for more information on the connection between Nellie Ohr and the Department of Justice — a topic widely discussed both on the QAnon message board and Fox News — would have received predominantly conservative perspectives if they queried her name on August 6, 2018. The top result was a piece by the conservative magazine The American Spectator, the second and third links were from Fox News, followed by two more links from conservative news sites.
This is also true on YouTube and Facebook, which conservative organizations like PragerU have accused of “demonetizing” or “restricting” their content. To be sure, some of PragerU’s content has been removed by YouTube and Facebook, but not by executives; it was removed by contracted moderators. A majority of content moderation for various platforms is outsourced to workers living in countries around the globe, who have roughly four seconds to decide whether content should stay or go. So while PragerU might be under constant content moderation, that has less to do with an overall bias among executives and more to do with how the community of users flags its videos and determines they are inappropriate.
Once Facebook learned that PragerU content had been removed, the company unblocked it and issued a public apology. Ironically, if people turn to Google today to learn more about PragerU’s lawsuit, they will receive only content that favors PragerU’s position.
What this indicates is that conservative perspectives abound on Google, Twitter, and Facebook. If anything, my research demonstrates the complexity of these platforms and the ability for users to utilize these spaces to confirm any truth they wish.
Ultimately, Congress’s pursuit of testimony that top executives are silencing specific viewpoints only demonstrates the extent to which the public fails to understand how these algorithms work. If Google is returning only “negative stories” about the president, it’s not because an executive at Google programmed it that way. Instead, the results for “Trump news” are a reflection of what a majority of Americans consider “real news,” which is exactly what they were searching for.