The Echo Chamber

Cristobal Bozdogan · Published in urbes · Nov 27, 2016

Our own little world.

In the real world we are constantly bombarded with media and news from across the political spectrum. But when it comes to our friends, we tend to pick them on the basis of their character and their values, and those same characteristics underpin their political ideologies. When having a drink with friends, people tend to discuss politics and find that they share the same political ideology. Think about it: how many friends do you have who voted for someone on the other side of the political aisle?

This is dangerous. It means our views are amplified and tend to become more and more extreme, because we have nobody around us to challenge them. If everyone agrees with what you say, why not go a bit further?

In the past, we consumed media and news from relatively trustworthy sources that had differing viewpoints. Our views were frequently challenged. Having just one or two TV channels meant that we were exposed to differing points of view.

Today we tend to consume news through social media. We devour the stories that our friends share online. We like them, we share them, and we don't bother to check whether they are factual. We let ourselves be completely immersed in a bubble of affirming media that confirms our viewpoints. We are right. They are wrong.

This isn't just a feeling. A recent study has shown that friends tend to share news from sources that confirm their own point of view. We also tend to have friends who share the same viewpoint. On top of that, the Facebook algorithm favours the posts that you tend to click on. So on the rare occasion that a friend shares a post or article from an opposing ideology, the algorithm tends to skip that news and you don't see it. And when you do see it, you tend not to click on it.

If your newsfeed contained content from random Facebook users, just under half of it would come from an opposing viewpoint. However, the share of opposing content drops at each step: shared by friends, shown by the algorithm, and clicked on by the user. Only 20% of the content users actually click on comes from an opposing ideology.
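To make that funnel concrete, here is a minimal sketch in Python. The first and last figures are the ones cited above (just under half, and 20%); the two intermediate values are purely illustrative assumptions, not the study's exact numbers.

```python
# Rough sketch of the exposure funnel described above. The "random" and
# "clicked" figures come from the article; the two intermediate values are
# illustrative assumptions only.
funnel = {
    "random users' posts": 0.45,      # share of content from an opposing ideology
    "shared by friends": 0.27,        # assumed for illustration
    "shown by the algorithm": 0.25,   # assumed for illustration
    "clicked on by the user": 0.20,   # figure cited in the article
}

previous = None
for stage, share in funnel.items():
    drop = "" if previous is None else f" (down {previous - share:.0%} from the previous stage)"
    print(f"{stage}: {share:.0%} opposing content{drop}")
    previous = share
```

With these assumed numbers, the largest single drop happens at the "shared by friends" stage, which is exactly the point the next paragraph makes.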

While it would be easy to blame the algorithm for this drop in exposure, it's not that simple. The biggest drop comes from the "shared by friends" step. As mentioned above, our circles are ideologically defined. We like to have friends who think like us. Facebook has no control over whom we pick as friends.

However, the study should be taken with quite a pinch of salt. You see, it was conducted by Facebook itself. What better way to absolve itself of this thorny issue than to blame our own friends? Facebook knows that this is an issue and that it is becoming increasingly worrisome.

Nevertheless, Facebook isn't doing enough. There is no good reason why it can't tune the algorithm to increase exposure to opposing content. And why not add a simple newsfeed option that lets users see what is popular across all of Facebook, not just within their circle of friends?

As social media becomes an increasingly large part of our lives, it's important for large social media companies to take responsibility for the content they host. A few years ago Facebook set out to become the largest source of information in the world. It has succeeded. Now it needs to own up to it.
