100% OF FACEBOOK USERS ARE ANTI-TORY

Dan Preston
6 min read · Nov 16, 2015


How confident are you that your local politicians weren’t chosen by a billionaire in Palo Alto, USA?

Does Facebook have the potential to influence how you vote?

As I stepped into the polling station on the 7th of May 2015 I was confident I had made the right decision. Months of watching political debates, chatting with friends, and ignoring Nigel Farage had helped me formulate a fair and well-balanced viewpoint.

However, I’ve noticed something bizarre in the months since that day. I have a politically diverse bunch of friends, but whenever I log into Facebook I see nothing but Tory hate. It doesn’t seem to represent the views of those around me, and certainly not the views of a nation that surprisingly and overwhelmingly favoured the Conservatives in May.

Something isn’t right here. So I did what any normal person would do. I started counting the political status updates over a period of time and found:

  • 37 Anti-Tory statuses
  • 0 Pro-Tory statuses

Interesting. Does that small sample of 37 posts suggest that 100% of my Facebook friends are anti-Tory? And therefore does my humble friend list of 350 suggest that all 35 million Facebook users are too?
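Out of curiosity, here’s a quick sanity check on that extrapolation. Statisticians have a rough shortcut called the “rule of three”: if you observe zero events in n trials, a 95% upper bound on the true rate is about 3/n. A minimal Python sketch (the counts are from my tally above; the method is a standard statistical rule of thumb, not something from my sources):

```python
n = 37         # political statuses I counted
pro_tory = 0   # pro-Tory statuses among them

# Rule of three: with 0 events observed in n trials, an approximate
# 95% upper bound on the true underlying rate is 3/n.
upper_bound = 3 / n
print(f"95% upper bound on pro-Tory share: {upper_bound:.1%}")  # ~8.1%
```

So even my own small sample only tells me that up to roughly 8% of my friends’ political posts could be pro-Tory; it says nothing at all about 35 million users.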

Obviously not. So the more pressing questions are: “is Facebook selecting which of my friends’ political opinions it wants me to see?” and “is Mark Zuckerberg forcing some political ideology on his quest for world domination?”

Our journey for truth begins with a tale of conspiracy, in the depths of Facebook’s most controversial feature — The News Feed.

THE NEWS FEED CONSPIRACY

The Facebook News Feed is a fickle friend. Sucking you in, drawing you away from real life to feast on the humble-brags and highly edited lives of someone you once met in 2009.

Very few people are aware of the mechanics behind it. Facebook is addictive for a reason. It wants you coming back time after time, so it shows you more of what you like, and less of what you don’t. Under the bonnet of your News Feed are a number of algorithms that follow your every move, learn more about you, and tailor what you see.

If you like a status, you’ll hear more regularly from that person. If you view someone’s photos, it will show you more. Its intuition goes further — it gives preferential treatment to certain words in posts and comments (for example a comment containing “congratulations” appears on your feed more often, because it signals a major life event). The formulas behind the algorithm are determined by thousands of factors and updated several times a day. This has a huge impact on the information you absorb…
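To make the idea concrete, here’s a toy sketch of how a ranker like that might work. Every feature name and weight below is invented for illustration — Facebook’s real model uses thousands of private signals:

```python
# Toy News Feed ranker. The features and weights are made up for
# illustration; the real algorithm is proprietary and far more complex.

def score(post, viewer):
    s = 0.0
    # You regularly like this person's statuses -> boost their posts
    s += 3.0 * viewer["likes_from"].get(post["author"], 0)
    # You browse this person's photos -> boost their posts
    s += 2.0 * viewer["photo_views"].get(post["author"], 0)
    # Keyword signal for a major life event
    if "congratulations" in post["text"].lower():
        s += 5.0
    return s

def build_feed(posts, viewer, limit):
    # Show only the top-scoring slice; everything else is silently dropped.
    return sorted(posts, key=lambda p: score(p, viewer), reverse=True)[:limit]

viewer = {"likes_from": {"alice": 4}, "photo_views": {"bob": 2}}
posts = [
    {"author": "alice", "text": "Congratulations on the new job!"},
    {"author": "bob",   "text": "Holiday photos"},
    {"author": "carol", "text": "Thoughts on the election"},
]
feed = build_feed(posts, viewer, limit=2)
```

The point: carol’s political post isn’t censored, it simply never scores high enough to make the cut — which is exactly how a feed can quietly narrow what you see.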

The average Facebook user has access to 1,500 posts per day but sees only 300 of them. What’s included in the 80% of posts you don’t see? Your News Feed won’t tell you. In fact, 62% of people don’t even know their News Feed is being filtered.

Eli Pariser, internet activist and founder of Upworthy, coined the term “filter bubble” to describe the way we consume information online (watch his brilliant TED Talk here).

In pursuit of tailoring our digital experiences, internet services and their algorithms filter out the content we won’t be interested in. “If algorithms are going to curate the world for us,” Pariser said, “then… we need to make sure that they also show us things that are uncomfortable or challenging or important”.

So intentionally or not, the internet is narrowing our view of the world, not widening it, and the News Feed is churning out posts that tell us what we already like and agree with.

But what does that mean for politics?

DOES FACEBOOK INFLUENCE YOUR VOTE?

If Facebook’s algorithm is intentionally showing you content you already agree with, are you seeing a wide enough range of information to make an informed decision?

It’s unlikely that posts on Facebook have the power to convert someone from the Green Party to UKIP. However, with one study reporting that 50% of voters in the 2015 election were still undecided as late as March, there’s a significant risk that someone, or something, is intentionally curating the political content we see online.

Earlier this year Facebook published a study to try to prove its algorithms don’t mess with political outcomes. It chose a sample of 10 million US users who labelled themselves as Democrat or Republican and tracked their pre-election activity. First it left the algorithm running and counted the number of posts in an individual’s News Feed that contained opposing political views. Then it turned the algorithm off and counted again.

The results are surprisingly insignificant: with the algorithm active you’ll see only 6% fewer posts containing opposing political views. That’s just the odd post here and there, and certainly doesn’t suggest that a significant number are being suppressed.

So, what’s the cause of the Tory hate party on my News Feed?

Facebook blames me for living a terribly sheltered life and not making friends beyond my socio-demographic group: the same US study found that only 23% of people in your friends list have completely opposing political views to you.

All that said, something doesn’t feel right about accepting the conclusions of a Facebook-initiated study, undertaken by internal Facebook researchers, using data only Facebook has access to.

Can statistics from social media during the 2015 elections help us get to the bottom of this?

SOCIAL MEDIA AND THE 2015 ELECTION

The 2015 election was predicted to be the first election to really harness the power of campaigning on social media. It didn’t fulfil this prophecy, but there are a few helpful stats that shed some light and ultimately lend credibility to Facebook’s study:

1. Only 9% of Facebook users openly state their political allegiance. The other 91% of my ‘friends’ may be Tories who just don’t feel the need to post their political opinions online. That’s backed up by stat number 2…

2. Labour had more than two and a half times as many Facebook shares as the Conservatives (539,802 to 201,535). Labour also beat the Conservatives on YouTube, Twitter and Instagram engagement. Perhaps socialists are more likely to a) be on social media and b) share their political opinions there. Hence the unanimous Tory hate on my News Feed.

Conclusion

I guess the conclusion is that Facebook isn’t whitewashing any political opinions or forcing ideologies on you. Instead the unanimous Tory hate is probably a result of you not spreading your net wide enough when looking for Facebook friends, or befriending too many cowards afraid of voicing their opinions.

Unfortunately we haven’t uncovered the greatest scandal in modern politics, as I’d loosely hoped when first starting out. But along the way we have discovered that the political opinions and messages we consume online are dangerously unrepresentative of the political thoughts and movements of a nation. Now I’m off to send friend requests to some fascists and hippies.

***

I’ll leave you with two interesting, but useless, digital stats from the election. UKIP destroyed all other parties on the Facebook engagement front, with an average of 7,000 likes per post, compared to the Liberal Democrats average of just 250. And finally, in many marginal Tory areas, Nicola Sturgeon was the most Googled leader. Her name was a top search query in Theresa May’s Maidenhead, Leicestershire South, and Louth and Horncastle, among others. Fancy that.

Sources include:

Wired

Time

The Telegraph

Integrity Search
