How Technology Subconsciously Manipulates Your Vote

You Might Be Influenced By The Very Device You’re Holding Right Now

What if your thoughts, actions, and vote were being influenced by the very device you’re reading this on right now?

As the midterms approach in the USA, I thought it would be interesting to delve into the way the technology we use shapes the decisions voters make. What I found out was terrifying.

The Search Engine Manipulation Effect

Would you believe that you would be more likely to vote for someone just because they ranked higher in a Google Search?

What we’re talking about here is a means of mind control on a massive scale that there is no precedent for in human history.

That’s a quote from Robert Epstein, a research psychologist at the American Institute for Behavioral Research and Technology. He ran a study that found a link between how politicians rank in search results and the likelihood that someone would be swayed to vote for them.

We’ve grown accustomed to trusting the first few search results; Google itself describes them as “the most relevant and useful”. Unfortunately, we carry that same trust over to politicians.


To test how the internet can affect our opinions, Epstein created a fake search engine called “Kadoodle” that was biased to rank one candidate’s results above the other’s. Unsurprisingly, people clicked on the top results more often.

You’d think that one quick search wouldn’t make much of a difference, but the biased results increased the number of undecided voters choosing the higher-ranked candidate by 48%.

This is huge: elections can be won on margins as low as 1%, so ranking higher in search can be the difference between winning and losing.
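To see why that 48% figure matters against a 1% margin, here is a rough back-of-the-envelope sketch. All the round numbers below are hypothetical, chosen only for illustration; they are not from Epstein’s study.

```python
# Toy calculation (hypothetical numbers): how a biased ranking that shifts
# undecided voters can flip a race decided by a ~1% margin.

electorate = 1_000_000       # total voters (hypothetical)
undecided_share = 0.10       # 10% of voters undecided (hypothetical)
baseline_split = 0.50        # undecideds would otherwise split evenly
shift = 0.48                 # 48% more undecideds pick the top-ranked candidate

undecided = electorate * undecided_share
before = undecided * baseline_split              # favored candidate's share, unbiased
after = undecided * baseline_split * (1 + shift) # same share under biased ranking
extra_votes = after - before

margin_needed = electorate * 0.01  # votes separating the candidates at a 1% margin
print(f"Extra votes from the biased ranking: {extra_votes:,.0f}")
print(f"Votes needed to close a 1% margin:   {margin_needed:,.0f}")
```

Even with only one voter in ten undecided, the shift produced by the ranking alone is more than double the 1% margin in this sketch.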

How does this affect current elections? Unfortunately, the politicians who rank higher in search aren’t usually the best politicians; they’re the most controversial ones. And that rank means we subconsciously trust them more.

Controversy can ignite intense discussion on online platforms which push the rankings of these politicians higher up, and therefore, give them an advantage when it comes to voters.

Think that’s bad? Unfortunately, it gets worse, because some companies can target you directly.

Facebook’s Vote Button

In the 2017 Icelandic elections, Facebook showed certain users a “vote button” with the caption “Find out where to vote, and share that you voted.” Beneath that, smaller print noted that 61 people had already voted.

The mere addition of this button did wonders for voter turnout. In a study published in 2017 in the journal PLOS ONE, Facebook researchers measured the button’s effect during the 2012 US presidential election: through its influence, roughly 270,000 additional votes were cast.

Although the button increased voter turnout by a seemingly tiny 0.24 percentage points, even that small amount can turn elections; Al Gore lost Florida to George W. Bush by roughly 0.01% in 2000.
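Taking the article’s figures at face value, a quick sanity check shows the scale involved. The Florida vote total below is a round illustrative figure, not an official count.

```python
# Back-of-the-envelope check on the figures above (as reported, not
# independently verified): if a 0.24 percentage-point turnout increase
# produced 270,000 extra votes, how many users saw the button?

extra_votes = 270_000
turnout_increase = 0.0024  # 0.24 percentage points
implied_users = extra_votes / turnout_increase
print(f"Implied number of users shown the button: {implied_users:,.0f}")

# For scale: a 0.01% margin in Florida 2000, out of roughly 6 million
# votes cast there (a round illustrative figure)
florida_votes = 6_000_000
margin_share = 0.0001  # 0.01%
print(f"A 0.01% margin at that size: {florida_votes * margin_share:,.0f} votes")
```

A fraction-of-a-percent nudge applied across a hundred-million-user platform dwarfs the margins that decide close races.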

The real danger lies in the fact that Facebook can choose exactly who sees these messages, which means the public has to trust the company to show the button to everyone impartially.

If a certain group of people sees it more than others, say, Democrats instead of Republicans, Facebook is giving one side an advantage, and therefore, undermining democracy.

Echo Chambers and Confirmation Bias

A study of over 2.7 billion tweets between 2009 and 2016 showed that Twitter has a big political echo chamber problem. Users very rarely shared or consumed political content that differed from their personal views.

This isn’t a phenomenon specific to Twitter, either. The internet is a powerful engine for delivering information, but our human wiring pushes us towards finding only the information we want. This phenomenon is called confirmation bias.

Confirmation Bias — When we believe something, we unconsciously begin seeking out information to reinforce that belief, often in the absence of facts.

Naturally, if you don’t agree with someone’s views, you’ll probably ignore them. But this can cause you to stay firm in your views even when they aren’t justified.

What do these echo chambers cause? To give one example, HPV vaccine uptake fell 15% amongst girls in Ireland, a decrease largely attributed to fake news.

As Aleks Krotoski explains in Untangling The Web:

Difference is inspiring, catalysing and progressive. Social psychological research over six decades has found that inward-looking groups, online or off, will have less tolerance for the other. They’re more antagonistic, confrontational and bigoted.

Fake news may be a symptom of online echo chambers, but it is serious enough to need a section of its own.

Fake News

A study of over 10 million tweets from 700,000 Twitter accounts, all linking to more than 600 misinformation and conspiracy news outlets, produced some revealing findings about misinformation online.

Many of the accounts that spread misinformation in 2016 are still active:

More than 80 percent of accounts that repeatedly spread misinformation during the 2016 election campaign are still active, and they continue to publish more than a million tweets on a typical day.

The fake news that circulates online can be absurd, with headlines like:

“White House cleaning staff find Obama’s secret stash of drugs”
“Trump enacts 90-day ban on childhood vaccinations”
“Illegal immigrants started the California fires”

And this isn’t something specific to Twitter. Platforms like YouTube host staggering amounts of disinformation that could be reaching you every day.

YouTube’s Extreme Algorithms

After the Stoneman Douglas shooting, a video reached YouTube’s trending page claiming that David Hogg, an outspoken gun-control advocate and shooting survivor, was a “crisis actor”.

Eventually, YouTube took the video down, but the damage was done: it had accumulated hundreds of thousands of views before its removal.

The platform’s recommendation algorithm doesn’t help either. Zeynep Tufekci, a prominent communications researcher, found that YouTube tended to push viewers towards extreme, misleading, or outright false content, no matter what they searched for.

Search for Donald Trump, and the algorithm eventually surfaces white supremacist rants. Search for Hillary Clinton and it recommends 9/11 conspiracy theories. Search for vegetarian food and it recommends veganism. Search for running and it recommends ultramarathons.
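YouTube’s real recommender is proprietary, so we can’t inspect it, but the feedback loop Tufekci describes can be sketched as a toy model: if engagement rises with how sensational a video is, then a greedy “recommend the most engaging related video” rule drifts steadily toward the extreme end of the catalog. Everything below is a stylized assumption, not YouTube’s actual algorithm.

```python
# Toy model of recommendation drift (NOT YouTube's real system).
# Each video gets an "intensity" score from 0 (mild) to 100 (extreme),
# and we assume, as critics argue, that engagement rises with intensity.
catalog = list(range(101))

def next_video(current, radius=15):
    """Recommend the most engaging video 'related' to the current one,
    i.e. within a small intensity radius of it."""
    related = [v for v in catalog if abs(v - current) <= radius]
    return max(related)  # engagement == intensity in this stylized model

video = 20  # the viewer starts at fairly mild content
for _ in range(10):
    video = next_video(video)  # autoplay follows the top recommendation
print(video)  # prints 100: the chain has drifted to the most extreme content
```

Each individual hop looks harmless, since every recommendation stays “related” to what came before; the drift only appears over the whole chain, which is exactly what makes it hard to notice as a viewer.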

Conclusion

Fake news, echo chambers, confirmation bias, search engine manipulation, and vote buttons: while it may seem overwhelming, these are only a few of the ways you can be influenced.

A democratic society will only function if its citizens are well informed. This means that every individual must be alert and vigilant to make sure that they aren’t being manipulated.

It’s important that in a time defined largely by technology, we don’t let technology define what we want.

Thanks for reading,

Sarvasv

This story is published in The Startup, Medium’s largest entrepreneurship publication followed by +385,976 people.
