Echo chambers threaten democratic life!

Giacomo Cacciapaglia
Published in 7 Star Circus
6 min read · Mar 27, 2021

Many inhabitants of this planet, like myself, still have a fresh memory of the days following the 2020 election in the US. On TV and on online news sites, charts were continuously shown, turning from blue to red and back to blue… The electorate was so polarised between the two candidates that the result hung by a thread for a long time. It was remarkably close: over 70 million votes apiece, with a difference of just a few million. No matter who finally collected more electoral votes, a clear divide is visible here.

This 50–50 situation has not been uncommon in democratic countries in recent times. The (in)famous Brexit vote is a well-known example: a life-changing decision hinging on decimal points. The idea of a united Europe was championed early on by one of the greatest British prime ministers, Winston Churchill, who dreamed of a United States of Europe… and the UK opted out by just a handful of votes. Some recent elections in Belgium, Israel and Italy (to name a few) basically ended in an even split between two or three parties. What is the reason for this divide in the electorate?

I’ve always been convinced that information is at the root of a healthy democracy: to vote and make a decision for the country, we need to know as much as possible about the matter at hand. Will Brexit raise or lower the standard of living in the UK? Will the prices of milk and dairy products go up or down? Will there be more jobs, or fewer? It may seem that we have access to much more information today than we had 30 years ago, thanks to social media and all the free sources accessible on the Internet. However, is this really beneficial for democracy?

It is not! The large flow of information we receive today from social media and internet channels is a huge threat to democracy!

One of the most shocking truths that came out of the Brexit vote is that the Brexiteers based the main narrative of their campaign on lies. The common, but largely unjustified, fear that immigrants were “stealing” jobs and impoverishing the locals was used to collect and unify masses of unhappy voters. From talking to people I know, I realised that this conviction was particularly fierce among immigrants who had themselves settled in the UK. Another lie, later admitted by the leader of the Brexit party, was the claim that a lot of money would be saved and reinvested in NHS hospitals. In reality, the EU used to invest heavily in Britain, especially in less developed areas and in scientific research. All in all, money was lost. How is it possible that so many people bought this false narrative?

It is possible because of echo chambers: groups on the internet where people exchange and reaffirm the same beliefs, echoing them to each other.

Echo chambers play a crucial role in polarising the population and propagating lies and fake news. Searching for information on echo chambers, I found contradicting opinions among scientists and journalists: some think that they do not exist, or that their effects are small. It is true that social media offer a much larger choice of news and information than the traditional printed press, or the telly. But how unbiased is the news fed to us?

I myself am a mild user of Facebook; I just use it to keep in touch with friends and colleagues. Have you ever noticed that FB will show advertisements for the exact company you just searched for on Google? You search for that company delivering food at home, and for the next two weeks you’ll see an ad for the same company popping up in your newsfeed. Every time I booked a hotel on booking.com, I’d see ads promoting that same hotel appearing on FB. It makes me feel that there is a Big Brother watching everything I do and search for, to feed me back exactly the same thing. And this also works for news and opinion sharing, as supported by scientific studies!

A recent paper in Scientific Reports (of the Nature Research group) [1] analysed over a million FB users and the patterns of their connections, in particular tracking users engaged with scientific and with conspiracy news. It turns out that these users were extremely polarised, with very limited exchange between the two groups. The mechanisms behind this may be challenge avoidance and confirmation bias: it feels better to avoid challenging (or being challenged by) users with very different opinions, and to read articles and posts that confirm one’s own opinion instead of the ones challenging it. These mechanisms tend to create very polarised groups of users which do not communicate with each other much.

Isn’t this just natural selection? The problem is that FB lets us connect with so many people that we have the luxury of selecting whom to listen to and whom to avoid. Think back to the old times, when people discussed politics and sports results at the local bar, or watched the news on a few TV channels. You could not really choose who was in the bar, and you ended up arguing both with the buddy who supports your team and with the annoying guy who supports the other one. Sure, a little bias was always possible: you could choose the bar where all the supporters of your team usually go, or avoid talking to the annoying guys. But the sources of news, and the opportunities to discuss them, were so limited that a little spicy discussion with the opposing view was always welcome.
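Confirmation bias and challenge avoidance are easy to see in action with a toy simulation. The sketch below is purely my own illustration (a simple bounded-confidence model, not the methodology of the paper): each user holds an opinion between −1 and 1, and only moves towards posts that are already close to their own view, ignoring everything else.

```python
import random

random.seed(42)

# Toy model: opinions live in [-1, 1].
N_USERS = 200
STEPS = 20000
TOLERANCE = 0.3   # posts farther away than this are simply ignored
LEARNING = 0.1    # how far a reader moves towards an accepted post

opinions = [random.uniform(-1, 1) for _ in range(N_USERS)]

for _ in range(STEPS):
    reader, author = random.sample(range(N_USERS), 2)
    gap = opinions[author] - opinions[reader]
    if abs(gap) < TOLERANCE:
        # Confirmation bias: only similar views are accepted...
        opinions[reader] += LEARNING * gap
    # ...challenge avoidance: distant views are scrolled past.

print(f"opinions now span {min(opinions):+.2f} to {max(opinions):+.2f}")
```

Run it and the initially smooth spread of opinions collapses into a few internally uniform clusters that no longer talk to each other: echo chambers in miniature.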

Of course, I’m not a supporter of the good ol’ times! I lived these social dynamics only briefly when I was a kid, and I would not give up the much wider access to information we enjoy today! However, the sheer amount of information is both the great advantage and the problem. We can access so much, from so many sources, that it is impossible to read it all. Even for a single event there are countless articles, videos, forums, blogs, tweets, and so on, to read and watch and digest! One way to cope is to let an algorithm decide for us what to read. Really, it’s not us who decide, but the social media we use. Researchers have found that feed algorithms strongly influence the degree of polarisation among a platform’s users [2]. By studying more than 100 million feeds on controversial topics, they found that platforms with aggressive feed algorithms, like Facebook, produce higher degrees of segregation and polarisation than platforms with milder ones, like Reddit.
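The core of such a feed algorithm can be caricatured in a few lines. This is a hypothetical sketch, not any platform’s actual ranking code, and all the names in it are made up: instead of showing posts in chronological order, it pushes topics the user already engaged with to the top, burying dissenting content below.

```python
def rank_feed(candidate_posts, user_history):
    """Caricature of an engagement-driven feed: familiar topics first."""
    seen_topics = {post["topic"] for post in user_history}
    # sorted() is stable and puts False before True, so posts whose
    # topic the user already engaged with float to the top.
    return sorted(candidate_posts, key=lambda p: p["topic"] not in seen_topics)

history = [{"topic": "team A"}, {"topic": "team A"}]
posts = [{"topic": "team B"}, {"topic": "team A"}, {"topic": "team B"}]
print([p["topic"] for p in rank_feed(posts, history)])
# → ['team A', 'team B', 'team B']
```

Even this crude rule reinforces itself: whatever you clicked on yesterday determines what you see, and therefore click on, today.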

Convincing, huh? But then, maybe it’s my own echo chamber that makes me read and trust scientific research, being a researcher myself. In the end, it’s our own judgement that we need to trust, as long as we keep our minds open… and keep challenging our own convictions. After all, if we are so sure, a little questioning should do no damage!

P.S. Further reading:

[1] E. Brignoli, M. Cinelli, W. Quattrociocchi & A. Scala, “Recursive patterns in online echo chambers”, Sci. Rep. 9 (2019) 20118.

[2] M. Cinelli, G. Morales, A. Galeazzi, W. Quattrociocchi & M. Starnini, “The echo chamber effect on social media”, PNAS 118 (2021) no. 9.

[3] It may also be interesting to watch the 2020 Netflix documentary “The Social Dilemma”, which contains interviews with some founders of popular social media about feed algorithms.


Senior Researcher at CNRS, France. I work on theoretical physics and its applications to epidemiology.