On March 24, 2018, Mike Hughes launched himself in a self-made rocket to a height of over 500 meters to see if the Earth is flat. Not only did he survive, but he remained skeptical about the roundness of the Earth, concluding that he needed to go higher to be certain.
While his effort and dedication are admirable, one might ask why in the 21st century — over 2000 years after Aristotle proved that the Earth is round and almost 50 years after we landed on the moon — do some people still believe that the Earth is flat?
But it’s not just the flat Earth. Forty years have passed since the WHO declared the eradication of smallpox, and yet more and more people are refusing to vaccinate their children, claiming that vaccines are harmful. The ice caps are melting, acres of land are turning into desert, and hurricanes are striking more often, yet people still deny climate change.
And there are all the seemingly harmless conspiracy theories, such as UFOs, Reptilians, and the One World Government.
Has it always been like this or are we humans somehow evolving backward? Crawling back into the oceans of ignorance? How is it possible that in an era of almost unlimited access to information (at least for the privileged), we choose conspiracy fairy tales over scientific facts and research? Is this the end of civilization or some new stage in humanity’s development?
The Sad Reality
The Internet was supposed to fix, if not all of humanity’s problems, then at least a nice chunk of them. It was supposed to bring knowledge to remote villages, emancipate the uneducated, and eliminate barriers and social inequalities. We expected so much and ended up with so little.
Sure, we got cute cat videos! But we also got the anti-vaccine movement.
Actually, Lem never said that, though I’m sure he’d agree with this statement.
Humans have always been more prone to trust their “gut feeling” than scientific data: we burned Giordano Bruno at the stake for heresy, feared black cats, and invested in pyramid schemes.
What’s different now, compared to previous eras, is that conspiracy theories can spread like viruses thanks to new technologies. And the giant media corporations (I’m looking at you, Google and Facebook) are enabling them.
Why is Google to Blame for the Nonsense Epidemic?
We’re used to blaming big corporations for many things: environmental damage, social inequalities, moving jobs to developing countries, evading taxes. Depending on the corporation, these charges are more or less justified.
But why on Earth would you blame Google for the anti-vaccine movement? Just because they recently removed “Don’t be evil” from their code of conduct doesn’t mean they’re actually evil, right?
A study from 2011 shows that over 24% of the websites included in Google’s first page of search results for vaccine-related keywords were anti-vaccine websites. Even worse: 70% of the results of searches conducted in the US for the term “vaccination” were against vaccines.
2011 is almost ancient history, especially in terms of search. Just to remind you, we’re talking about the days when Google’s result pages looked very different from today’s.
In a more recent study, conducted in 2014, 13% of the top 10 search results for different vaccination-related keywords were websites that didn’t support vaccination. In a study from 2018, between 12% and 24% of the websites presented in SERPs for the phrase “autism vaccine” were anti-vaccine ones (results differed depending on the country’s version of Google).
These studies are separated by years, and Google’s algorithms have certainly changed significantly during that time. However, it’s interesting to see what methods the researchers used to obtain the most “general” (not personalized) results in Google.
The 2014 study’s searches were conducted with a cleared browsing history and without signing into an account (including a Google account).
In the 2018 study, the researchers used local versions of Google’s search engine.
In the 2011 study, there’s no mention of any measures taken to minimize Google’s filter bubble. It’s not surprising — after all, the study was conducted before Eli Pariser published his book on the filter bubble effect in 2011.
The Filter What?
As ridiculous and innocent as the term “bubble” might sound, it describes a dangerous phenomenon. The bubble is created by various algorithms that tailor the content we see in our newsfeeds and search results to meet our expectations, as determined by our browsing history, location, and other factors. Sounds good, right? Who wouldn’t like to see the results that are most relevant to them?
When we look a little closer, though, it turns out to be not as good as it seems. The algorithms are built to please us, so they show us the opinions and results we already agree with.
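To see how this self-reinforcing loop works, here’s a toy re-ranking sketch in Python. This is not Google’s actual algorithm; the documents, topics, scores, and boost factor are all invented purely for illustration:

```python
# Toy illustration of preference-based re-ranking: every past click on a
# topic boosts that topic's score, so the results drift toward whatever
# the user already clicks on. All data here is made up.

from collections import Counter

def rank(results, click_history, boost=0.5):
    """Re-rank results: each past click on a topic adds `boost` to its score."""
    clicks = Counter(click_history)
    return sorted(
        results,
        key=lambda r: r["relevance"] + boost * clicks[r["topic"]],
        reverse=True,
    )

results = [
    {"url": "cdc-vaccine-safety", "topic": "pro-vaccine", "relevance": 1.0},
    {"url": "vaccine-injury-blog", "topic": "anti-vaccine", "relevance": 0.8},
]

# A user with no history sees the more relevant page first...
neutral = rank(results, click_history=[])
# ...but a few clicks on anti-vaccine pages flip the order.
skeptic = rank(results, click_history=["anti-vaccine"] * 3)

print(neutral[0]["url"])  # cdc-vaccine-safety
print(skeptic[0]["url"])  # vaccine-injury-blog
```

The point of the sketch is the feedback loop: the boosted page gets clicked more, which boosts it further on the next search.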
While this is completely harmless in some cases (I hope Google knows by now that I prefer cat videos to dog videos!), in other cases it might threaten democracy (voters can’t research a candidate objectively when they’re presented only with articles that support their biases) or even have deadly consequences (when a parent decides not to vaccinate their child because of the anti-vaccine articles Google served them based on the activity of their friends).
The problem with Facebook’s filter bubble has already been widely discussed, and at least some people realize that their newsfeed is tailored to fit their needs and desires. But what about Google? Their personalized search feature was first introduced in 2004 (!) and has been applied to all searches since 2009. That’s 10 years! People have surely become accustomed to seeing personalized results.
The question is, are they aware that this personalization exists?
The feature itself has evolved since then. When it was introduced, users could see which personalization took place and on what basis, so more curious searchers might have discovered that their search history affected their new searches.
In subsequent years, Google retreated from such transparency. These days we only know that our results are personalized in some way, based mostly on the history of previous searches, all the pages the user has visited, the location of the device on which the search is conducted, and social signals.
There’s no clear statement of which data is used for personalization (or at least I couldn’t find one). In the Google Account settings, you can choose whether your “web activity” is used for personalization or not. But what exactly is this “web activity”?
Google’s description of this activity might seem detailed but is in fact vague. What are “other things”? The fact that Google uses information from the device “like recent apps and contact names” is also quite alarming.
What does it need my contacts for, especially when it comes to delivering more relevant search results? Is Google going to call my mother and ask her about my cat-versus-dog preferences?
The recent scandals involving Facebook revealing its users’ private messages to third parties (Spotify, Netflix, and Amazon) and sharing data with Cambridge Analytica prove that we should have very little trust, if any, in tech giants.
Call me a skeptic, but the fact that we haven’t heard anything like that from Google doesn’t mean they don’t do such things. It might just mean that they’re more cautious.
Now imagine someone who’s already skeptical about vaccinations searching for information on vaccines. They’ve visited some anti-vaccine pages in the past, they’re friends with parents who don’t vaccinate, and they share their concerns about vaccines in emails.
Guess which results they’ll get? Will they see even one result that’s pro-vaccine, or anything objective at all?
Google now claims that there’s very little personalization going on, focusing primarily on the user’s location and the context from their previous searches. That sounds exactly like something someone doing extensive personalization would say…
However, the reason only limited personalization takes place may simply be that it’s difficult for Google’s algorithms to guess the user’s intent correctly. It’s very possible that once this technical issue is solved, Google will once again push us toward heavily personalized results.
A recent study conducted by DuckDuckGo (a Google competitor focused on user privacy and transparency) shows that personalization not only exists but is also impossible to turn off. In the study, participants who searched for the same terms at the same time, in incognito mode, still didn’t see the same results.
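The study’s core idea is easy to illustrate: collect the result lists different users get for the same query and measure how much they overlap. A minimal sketch, using fabricated result lists rather than DuckDuckGo’s actual data:

```python
# Measure how similar two users' result lists are for the same query.
# The domain names below are invented for illustration.

def jaccard(a, b):
    """Jaccard similarity of two result lists: 1.0 means identical sets."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)

user_a = ["site1.com", "site2.com", "site3.com", "site4.com"]
user_b = ["site1.com", "site3.com", "site5.com", "site6.com"]

overlap = jaccard(user_a, user_b)
print(f"{overlap:.2f}")  # 0.33 -- far below the 1.0 you'd expect with no personalization
```

If search results were truly unpersonalized, simultaneous identical queries should score close to 1.0; anything much lower suggests the results were tailored per user.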
Moreover, personalization is not the only feature that might lead to bias. The way the search query is formed reflects the user’s assumptions and might affect the results shown.
After all, searching for “are vaccines good” is something different from searching for “are vaccines bad.” In each case, Google will happily provide websites that confirm the assumption built into the query.
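A crude term-matching “search engine” is enough to show the effect: each query retrieves only the pages that echo its own wording. Both documents below are invented for the demonstration:

```python
# Toy keyword matcher showing why query wording biases results:
# a query only retrieves documents that repeat its own framing.

docs = [
    "why vaccines are good for your child",  # invented pro-vaccine page
    "ten reasons vaccines are bad",          # invented anti-vaccine page
]

def search(query):
    """Return documents containing every word of the query."""
    words = set(query.lower().split())
    return [title for title in docs if words <= set(title.split())]

print(search("vaccines are good"))  # only the pro-vaccine page
print(search("vaccines are bad"))   # only the anti-vaccine page
```

Real ranking is far more sophisticated, but the underlying bias survives: the phrasing of the question pre-selects the answers.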
What’s the Harm?
The threat of the increasing popularity of the anti-vaccine movement is quite obvious: as more people refuse to vaccinate their children, vaccination rates will drop below the threshold needed for herd immunity, and epidemics will break out.
But what’s the harm in people believing in Reptilians and the flat Earth, and in Google handing them more and more articles that support those beliefs?
The issue is more complex than it might seem.
Personalization and the bias it causes are a threat to the very values our society relies on. Everyone can be an expert on anything. There’s no absolute truth anymore; there are many truths to choose from, in any color we like. We don’t even have to make the choice ourselves: Google will hand us the options we already like.
We can observe the effect of the bubbles right now. After the 2016 US presidential election, many Democrats found themselves asking, “How is it possible that Trump won while everybody I know voted for Clinton?” Of course, people have always gathered in groups and tribes of shared worldviews and beliefs, but the media kept them in touch with the rest of society.
Nowadays, it can indeed be very surprising that not everybody shares our opinion when we’re constantly flooded with news that not only confirms it but also creates the impression that other beliefs are nonexistent or marginal.
As we move forward, we might find that we’re living in a world that’s completely different from the world of our neighbors. As our biases grow and our bubbles get thicker, will we be able to communicate with those strangers, let alone agree on something and cooperate?
The Truest Truth
Personalization is very handy, but maybe the time has come to ask ourselves if the cost is really worth it. We might want to think about what lines we should draw and where we should draw them.
How far can tech giants reach into our lives to find our preferences they can then adjust? How far should personalization reach? Should we create a set of topics (such as politics and science) where personalization shouldn’t be allowed?
There’s also another question: is including non-scientific websites (such as anti-vaccine sites) in the Google index ethical?
Of course, you might say, we have freedom of speech, even though science agrees that vaccines are safe and save millions of lives. But we set freedom of speech aside when someone else’s freedom is at risk. There’s no child pornography and there are no websites selling drugs in the Google index. Isn’t securing a child’s freedom not to die from an easily preventable disease a good enough reason?
Unfortunately, I don’t have a clear answer, but it’s important to point out a problem this serious.
As the aforementioned DuckDuckGo study concludes, there’s no way to switch off Google’s filter bubble completely. As far as we know, the best defense against this effect is realizing that the problem exists. And when it comes to unscientific theories, our best and only weapon is science itself.
This also translates to the world of website optimization. Many self-proclaimed gurus claim to have the one and only answer that will send your website up the rankings, much like Mike Hughes’ rocket. The only difference is that your business might not be as lucky with the landing. Big words and big promises might be tempting and romantic, but when it comes to constructing a rocket, developing a vaccine, or building your business’s visibility, it’s better to trust science, with its boring research and data.
At Onely, we’ll do the boring part for you when it comes to research and data. We live in a world full of different truths and meanings, but only by abiding by research, data, and execution can we bring your website closer to the truest truth.