Why Brexit feeds social media’s confirmation bias

Post-Brexit, Britain is divided. Young and old, London and the rest of the UK, middle and working classes: all seemingly holding unshakably opposed viewpoints. Even our political parties are busy tearing themselves apart into opposing factions. Everywhere you look, there’s disagreement. Except, perhaps, in your social media accounts.

On the afternoon of the Brexit result, several friends expressed puzzlement, with a fairly frequent refrain: they’d seen nobody on their Facebook feed express a preference for anything other than a Remain vote.

My feed was largely the same. As a relatively liberal London resident working in the marcomms industry, it was hardly a surprise that friends and colleagues were also backing Remain, even if I didn’t quite share their confidence in a pro-EU result. But there was also a small sub-group, based in my hometown, who I largely knew through football. A completely different demographic, nearly all of them were supporting Leave.

The polarisation problem

That they showed up in my newsfeed at all was something of a miracle, but I’m pleased that they did, as it shows Facebook’s algorithm hasn’t quite selected only like-minded content for my feed. But it is indicative of a wider issue that social media exacerbates: confirmation bias and groupthink. In the case of the Brexit vote, it’s led to a further polarising of the debate and, in my mind at least, some of the outright hostility between differing positions can be ascribed to this.

At its simplest, confirmation bias is the tendency to seek out and recall information that’s favourable to your viewpoint, while groupthink often occurs when the desire for conformity in a group outweighs the individual’s ability to think critically. It’s particularly prevalent in politics, unsurprisingly: Irving Janis, one of the leading researchers of groupthink, used the failed Bay of Pigs invasion of Cuba as one of his examples.

Ordinarily, you’d expect both of these to occur within subgroups or communities who naturally seek each other out in real life. But the web has given us a particular way of shaping our view of the world, especially Facebook, which is still the main hub for the vast majority of the UK’s population.

Community confirmation

There’s no need to seek out these groups. Facebook is already filtering for you as it continually looks to drive engagement in the news feed. Not interested in Syria or Brexit, but very interested in the Kardashians? You’re unlikely to see much of the former, but a lot of celebrity gossip. Sure, the odd piece may slip in based on your behaviour with other friends (it’s why I suspect the Leave articles appeared in my feed: they came from a group I engage with about football, so Facebook views them as relevant people), but ultimately you’re unlikely to be pushed too far outside this bubble.

As a liberal London-based Remain supporter working in marketing, my newsfeed will, rightly, look very different from that of a Conservative or Labour Leave supporter from the north east who is struggling to find work. Given the types of news articles I read and share, and the friends I regularly interact with publicly on Facebook, the social network has a pretty good idea of what I’m more likely to click on and what I’m not. UKIP or Leave arguments rarely made it into my feed, unless they were from a friend sharing a thorough debunking. Groupthink and confirmation bias are seeping into our world view, whether we like it or not. And if we don’t like what we see, we can simply defriend or mute. It’s why many Remain supporters had a nasty shock the day after the referendum.

I’ve picked on Facebook, but the same could easily apply to Twitter or Instagram, both of which have their own algorithms filtering what they deem relevant content to the top of the feed, while Google’s more personalised search results add to the skew. I may like to think of myself as open to different opinions and life experiences, but I’m unlikely to be following either UKIP supporters or fashion vloggers.

“This post has had enough of experts”

Beyond my immediate experiences of confirmation bias, there’s been plenty of research into the way we consume information. An Italian study by Michela Del Vicario et al. in 2015 found that if stories shared on Facebook fitted a user’s pre-existing mindset, they were more likely to believe the information, even when it was demonstrated to be false.

In his analysis of Del Vicario’s study, Harvard professor Cass Sunstein went even further, noting:

“Arriving at these judgments on your own, you might well hold them tentatively and with a fair degree of humility. But after you learn that a lot of people agree with you, you are likely to end up with much greater certainty — and perhaps real disdain for people who do not see things as you do.
“On the basis of all the clustering, that almost certainly happened on Facebook. Strong support for this conclusion comes from research from the same academic team, which finds that on Facebook, efforts to debunk false beliefs are typically ignored — and when people pay attention to them, they often strengthen their commitment to the debunked beliefs.”

This filter bubble is only likely to get worse. Studies increasingly show social media overtaking traditional mediums such as TV and print as a news source. Facebook’s recent announcement that its latest algorithm tweak will give pages lower reach is likely to make breaking through the bubble even harder, with publishers taking another hit as the social network upweights signals from friends.

In many respects, what Facebook is trying to do is laudable: cutting through the noise to bring you the updates that matter most to you. But if you’re part of a circle of Facebook friends with a particular political viewpoint, let’s say you’re a big Jeremy Corbyn supporter, then it’s natural the algorithm will prioritise this content. (And if you’re a brand with nothing to say trying to cut through: good luck.)

The concept of a filter bubble isn’t new. Eli Pariser’s TED Talk from 2011 touches on all of these points, while this marketing post on social media and groupthink from 2012 seems eerily prescient:

“Unfortunately, the parallels between groupthink and social media have created an environment in which it’s nearly impossible to conduct a civil conversation while also disagreeing with a community leader, whatever that community might be. Perhaps you have seen situations where a blogger “calls out” one person only to have that person’s community rush over and “hate” on the blogger because everyone is loyal to whomever their community leader might be. This creates some of the ugliest situations in the online world, but I think a lot of it can be traced to this concept that “dissenting opinions must be punished.””

Much of this is familiar to anyone who has worked with online communities. The Brexit debate, combined with social media confirmation bias, has simply pulled community groupthink into a larger part of our political discourse.

Validating your opinions

Those Facebook friends who voted Remain are busy sharing articles showing why we shouldn’t leave the EU. Those who voted Leave are sharing similar articles while calling Remain voters sore losers. The same is starting to occur with pro- and anti-Corbyn material, and pro- and anti-Boris content. That petition or meme you shared may get plenty of validation from your immediate community, but it’s unlikely to sway anybody with an opposing viewpoint.

Despite the best efforts of social platforms to introduce more editorial oversight, at the moment it’s difficult to see beyond the algorithm. As we enter a particularly uncertain period in Britain’s history, with strong beliefs on all sides, it’s hard to see a more conciliatory tone emerging as people take to social media to air their opinions.

This post first appeared on www.garyandrews.co.uk