How Facebook makes us build our own golden cage

Guillaume Koechlin
Social Media Writings
4 min read · Sep 26, 2019

Do you know the sound effect called "echo," where you shout into a cavity with hard walls and your voice comes back to you several times over?

Well, that is pretty much what happens when you interact with a public post on Facebook.

Our Facebook feed is the room, and our clicks, likes, and comments on posts are our voice. The signal we send to the Facebook algorithm comes back even louder.

This concept is called the "filter bubble." What is it exactly?

To select the content that is put forward in a user's Facebook feed, Facebook's algorithm uses the browsing data that user generates on the feed. It looks at the kind of content the user has interacted with and, from that, offers "relevant" content the user is more likely to interact with. Want to hear more about football news? Just click, like, and comment on every football post in your feed for a week and you will be drowned in football news for quite a while.
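To make the mechanism concrete, here is a toy sketch of an engagement-weighted ranker in Python. Facebook's real ranking system is proprietary and far more sophisticated; the class name, interaction weights, and topics below are all invented for illustration.

```python
from collections import defaultdict

# Toy engagement-weighted feed ranker (illustration only; Facebook's real
# ranking system is proprietary and far more complex than this).

# Hypothetical weights for each kind of interaction.
INTERACTION_WEIGHTS = {"click": 1.0, "like": 2.0, "comment": 3.0}

class ToyFeedRanker:
    def __init__(self):
        # Affinity score per topic, built only from the user's own actions.
        self.affinity = defaultdict(float)

    def record_interaction(self, topic, kind):
        """Every click, like, or comment raises the affinity for that topic."""
        self.affinity[topic] += INTERACTION_WEIGHTS[kind]

    def rank(self, candidate_posts):
        """Sort candidate posts so high-affinity topics float to the top."""
        return sorted(candidate_posts,
                      key=lambda post: self.affinity[post["topic"]],
                      reverse=True)

ranker = ToyFeedRanker()
# One week of clicking, liking, and commenting on football posts...
for _ in range(7):
    for kind in ("click", "like", "comment"):
        ranker.record_interaction("football", kind)

candidates = [{"id": 1, "topic": "politics"},
              {"id": 2, "topic": "football"},
              {"id": 3, "topic": "cooking"}]
print(ranker.rank(candidates))  # football now sits at the top of the feed
```

The important property is the loop: the more a topic gets clicked, the higher it ranks, and the higher it ranks, the more it gets clicked.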

So is it just a simple and innocuous recommendation engine, like Spotify's or Netflix's?

In principle, yes. In practice, it is quite different, because Facebook is a news medium whereas Spotify and Netflix are entertainment platforms. This is not about what people like; it is about what people think.

We spend on average one hour a day on Facebook, and most of that time goes into scrolling the feed and occasionally interacting with content. What makes it different from a standard news platform is that the news comes to us instead of us going to the news. A feed has no structure; it is built purely on a ranking. The user cannot browse categories and has no "feed search bar." All he can do is scroll down and receive the information as it comes. We are free to choose which content to focus on, but we are far more likely to spend time on the most accessible content, which is whatever sits at the top of the feed. Facebook can therefore show us whatever it wants, which means, in a way, that we are not free to choose the content we go through.
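"Whatever sits at the top" can be pictured with a hypothetical position-bias curve: the chance that a post is seen at all drops sharply with its rank in the feed. The decay factor below is an assumption chosen for illustration, not a measured Facebook figure.

```python
# Hypothetical position-bias curve (the 0.7 decay factor is an assumption
# for illustration, not a measured Facebook number).
VIEW_PROBABILITY_DECAY = 0.7  # each extra scroll position loses ~30% of viewers

def view_probability(rank):
    """Rank 0 is the top of the feed; lower positions are rarely reached."""
    return VIEW_PROBABILITY_DECAY ** rank

for rank in range(10):
    print(f"position {rank}: ~{view_probability(rank):.0%} chance of being seen")
# Whatever the ranking puts first effectively decides what gets read.
```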

But do we feel bad about it? Not at all. We like it. And that is why we spend so much time on our Facebook feed. Facebook is really good at knowing what thrills us and what bores us.

But the problem with watching the same kind of content all the time, content we already like, is that we tend to become narrow-minded. Being exposed only to content that fits our ideas and interests weakens our critical judgement and open-mindedness. We find items we like or agree with more relevant and more reliable, and we believe our initial thoughts even more strongly when another source confirms them. This is called confirmation bias.

Confirmation bias is "the tendency people have to embrace information that supports their beliefs and reject information that contradicts them."

Applied to Facebook, it looks frightening: people interact with content they like or agree with, which first reinforces their ideas and second leads the Facebook algorithm to push similar content that reinforces those ideas even more. The result is that we get a narrower information flow, develop fixed ideas, and become excessively distrustful of information that differs from our own. This is more or less radicalization. And the biggest problem is that we don't realize what is going on, because Facebook calls the shots. At a newspaper kiosk, all the newspapers are presented to you in the same way and you consciously choose one. When you open your Facebook feed, you just take whatever comes first, without really caring about the source, as long as the content looks interesting.
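This reinforcement loop can be simulated in a few lines. In the assumed model below (not real Facebook data), the feed shows topics in proportion to past engagement, and the user is only slightly more likely to click on content they agree with; even that small bias compounds until agreeable content fills most of the feed.

```python
import random

# Minimal feedback-loop simulation (assumed model, not real Facebook data).
random.seed(0)

engagement = {"agrees_with_me": 1.0, "challenges_me": 1.0}          # start balanced
CLICK_PROBABILITY = {"agrees_with_me": 0.6, "challenges_me": 0.4}   # slight user bias

def show_post():
    """The feed picks a topic with probability proportional to past engagement."""
    total = sum(engagement.values())
    r = random.uniform(0, total)
    cumulative = 0.0
    for topic, score in engagement.items():
        cumulative += score
        if r <= cumulative:
            return topic
    return topic  # fallback for floating-point edge cases

for _ in range(5000):
    topic = show_post()
    if random.random() < CLICK_PROBABILITY[topic]:
        engagement[topic] += 1.0  # the interaction feeds back into the ranking

total = sum(engagement.values())
for topic, score in engagement.items():
    print(f"{topic}: {score / total:.1%} of the feed")
# The slight preference for agreeable posts snowballs over time.
```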

What do people think about this? Well, if you ask someone used to scrolling his Facebook feed every day, he may not give much credit to this theory. If you ask someone from the outside, he may never create a Facebook account after becoming aware of it. In fact, if you are inside the bubble, you see nothing except its nice inner surface, which reflects your own ideas and interests. Seen from outside, it looks more like a jail than a fancy world. Filter bubbles become filter prisons. It is just a matter of perspective.

What should we do about this? Being aware is already a great achievement. Be careful when reading news on the Facebook feed, since it is not a neutral sample of news reporting. Be careful, too, about what you interact with: clicking on buzz content that is likely to carry distorted news will only generate more of it.

But a moderate, reflective use of the feed is actually not unhealthy! We still find interesting and enriching content on the Facebook feed. The key is simply to understand why we are so addicted to our feed, how we fuel that addiction, and what effect it has on our minds.
