The Dangers of Filter Bubbles and Echo Chambers

Darius Mees
Writing in the Media
6 min read · Mar 16, 2021
© Magnus Müller via pexels.com

Joseph, 37 years old, was born into a white family with several siblings. He has a wife and two kids and lives in rural Alabama, in an area with barely any non-white population. He works a low-wage job and regularly struggles to make ends meet. He only watches FOX News, because all the other channels are run by ‘communist lib-tards’. He blames Obama and immigrants for his misfortune, supports the NRA, and thinks abortion is a sin. In his neighbourhood, he is well liked: going out for a drink with his friends, talking about politics and sports. They all share the same ideas, interests, and political stances.

One day, he found himself wondering:

“Everybody here hates immigrants. Why are the Democrats in power?”

Nobody in his area supports the Democrats. A colleague told him about the time a Mexican stole his job. And that all Asians transmit the coronavirus. And still, FOX News reports Joe Biden’s victory.

“Everybody is stupid but me.”

Joseph is living in a real-life echo chamber. He only interacts with people who share his way of thinking and only consumes news stories from his one ‘trusted’ source.

The same can be applied to the virtual world.

The rise of Web 2.0 has brought us many platforms on which we can socially engage with others. Facebook, YouTube, Instagram, Twitter. The list goes on and on, and everybody has their own favourites.

© Caio via pexels.com

Do you want to play a game in front of your friends?

Try Twitch.

Do you want to become a part of a community for a specific niche topic?

There definitely is a subreddit for that.

Do you just wanna share your thoughts and inspirational photos?

Tumblr might be the right thing for you.

Social media platforms are great for finding a place you fit in. A place where you can talk about stuff with people who share your interests. You won’t just add your real-life friends to your list, you’ll also make new friends online. Surrounding yourself with like-minded people can help you find confidence or feel validated. But getting in too deep and building up barriers against everything outside can be very hurtful, and dangerous!

© ThisIsEngineering via pexels.com

That’s where filter bubbles and echo chambers come into play. In 2011, Internet activist Eli Pariser coined the term ‘filter bubble’ to describe how Internet companies and services use personalised algorithms, shaped by each user’s preferences, to tailor that user’s experience.

Algorithms can be compared to virtual brain cells. They learn from our behaviour. They can adapt to and complete our search requests. They also learn what we prefer. A person with a left-leaning political stance is more likely to receive search results that include left-oriented news sources. Vice versa, a far-right person would get articles from populist and far-right sources.
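
To make that idea concrete, here is a minimal Python sketch of preference-weighted ranking. Everything in it is invented for illustration, the sources, scores, and leanings included; no real platform works exactly like this, but the principle is the same: the closer an article sits to your learned leaning, the higher it ranks.

```python
# Toy sketch of preference-weighted ranking (all data below is invented).
articles = [
    {"title": "Tax cuts work", "source": "right_outlet", "leaning": +1.0},
    {"title": "Universal healthcare now", "source": "left_outlet", "leaning": -1.0},
    {"title": "Local election results", "source": "wire_service", "leaning": 0.0},
]

def rank_for_user(articles, user_leaning):
    """Sort articles so those closest to the user's leaning come first.

    user_leaning: -1.0 (far left) ... +1.0 (far right), learned from past clicks.
    """
    return sorted(articles, key=lambda a: abs(a["leaning"] - user_leaning))

# The same pool of articles, two very different front pages.
print([a["source"] for a in rank_for_user(articles, user_leaning=-0.8)])
print([a["source"] for a in rank_for_user(articles, user_leaning=+0.8)])
```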

Filter bubbles are also built from popular search terms and users’ interests. Because the algorithm is just a human-created function crunching data, it can’t distinguish who is actually sitting in front of the screen.

For example: if you read up on ‘pregnancy’ for a friend of yours, Google and co. will link this search to your virtual profile. Sooner or later, you will receive recommendations for baby products without being pregnant yourself.
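
Here is a deliberately over-simplified Python sketch of that effect. Every name and ad in it is made up, and real ad systems are far more sophisticated, but the gist holds: a single search feeds the profile, no matter who actually typed it.

```python
# Toy sketch: searches accumulate into an interest profile (invented example,
# not how Google or any real service actually works).
from collections import Counter

profile = Counter()  # interest keyword -> weight

def record_search(profile, query):
    """Add each search term to the user's interest profile."""
    for term in query.lower().split():
        profile[term] += 1

def recommend(profile, ads):
    """Pick the ad whose keyword the profile weights most heavily."""
    return max(ads, key=lambda ad: profile[ad["keyword"]])

record_search(profile, "pregnancy symptoms week 6")  # searched for a friend

ads = [{"keyword": "pregnancy", "ad": "baby stroller"},
       {"keyword": "football", "ad": "season tickets"}]
print(recommend(profile, ads))  # the stroller ad wins, pregnant or not
```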

While the latter is rather harmless and would at most put you in some awkward situations, political filter bubbles and echo chambers can be dangerous for our society. Although Pariser uses the two terms interchangeably, a simple distinction can be made:

Filter bubbles are the result of computer-based filtering that removes the news we don’t like or are less interested in. Important information can be left out, and our knowledge is narrowed down.

Echo chambers, on the other hand, are the product of being flooded with news we personally like or agree with. Our perception of the real world is distorted because we never learn about the other sides of a story. Muting certain news outlets on your Twitter feed is one of many ways to create such a chamber.

So, filter bubbles are a result of automated filtering, whilst echo chambers can be created through other processes, like voluntary isolation.
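
A tiny Python sketch can show the difference. The outlets and scores are invented, and this is only an illustration under those assumptions: in one case the platform silently drops what it predicts you won’t like, in the other you mute outlets yourself. The narrowing effect is the same.

```python
# Two toy mechanisms, one narrowed feed (all names below are made up).
feed = [
    {"outlet": "LeftDaily", "topic": "election"},
    {"outlet": "RightWire", "topic": "election"},
    {"outlet": "NeutralNews", "topic": "election"},
]

def filter_bubble(feed, predicted_interest):
    """Automated filtering: the platform drops items it predicts the user won't engage with."""
    return [item for item in feed if predicted_interest(item) > 0.5]

def echo_chamber(feed, muted_outlets):
    """Voluntary isolation: the user mutes outlets they dislike."""
    return [item for item in feed if item["outlet"] not in muted_outlets]

likes_right = lambda item: 0.9 if item["outlet"] == "RightWire" else 0.1
print(filter_bubble(feed, likes_right))                  # the algorithm decided
print(echo_chamber(feed, {"LeftDaily", "NeutralNews"}))  # the user decided
```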

© JJ Whitley via pexels.com

This is especially dangerous with alternative ideologies and conspiracy theories.

Going back to Joseph: he decides to create a Twitter account, ‘Pr0ud_boy88’. He immediately follows Donald Trump, QAnon accounts, and ‘Alabama4Guns’. He visits the FOX News page and leaves a comment.

“Somebody should teach those traitors a lesson!”

The most recent case of isolated online communities having an impact on real-life events is the 2021 storming of the United States Capitol. Stirred up by Donald Trump’s and his party’s claims that the 2020 election was rigged and that Trump never lost, hundreds of far-right Trump supporters and conspiracy theorists attacked the Capitol.

Trump frequently used Twitter during his campaign and after being elected president, spreading fake news to his millions of followers online. During the final stages of the election, he pushed the baseless claim that the vote had been tampered with and demanded recounts.

Many of his supporters, chanting the very words of their president and associating themselves with well-known conspiracy movements like QAnon, believed everything he was saying, or tweeting. If they never see critical media coverage, or outright reject it, there is no way to persuade them otherwise.

While much of this can be explained by deep-rooted racist structures in the US, we can’t ignore the influence of filter bubbles and echo chambers.

© Andrew Neel via pexels.com

Once again, Joseph is checking his updates. Hundreds of likes and supportive reactions to his comment. Everybody on the Internet seems to share the same opinion. The election is a fraud. Democrats have conspired against America. Only white supremacists can save their nation.

Because Joseph only saw the things he wanted to see, articles supporting his opinion, he lost his grip on reality. Every time he typed ‘US election’ into the Google search bar, he received suggestions like ‘rigged’, ‘fraud’, and ‘conspiracy’. When I use the same search term, Google comes up with ‘results’ and ‘Georgia’.

The algorithm differs from person to person — digital embodiment to digital embodiment.

Joseph only distinguishes between black and white. Bad and good. Them versus us. People who don’t share his opinion are his enemies.

That’s why we have to keep the downsides of the Internet in mind. Automated algorithms can influence our perception of reality and how we form our opinions.

We always have to fact-check and look for other sources as well.

Further reading on this topic:

Kristen Allred wrote a similar article on Medium a few years earlier; it goes into more detail and is worth a read!

News agency Reuters on filter bubbles and echo chambers.
