“This Is What I Call The Filter Bubble”

The impact of algorithms

Nicolette De Weerd
Dialogue & Discourse
4 min read · Aug 2, 2018


📷: Unsplash

One of my best friends works as a journalist for a famous Dutch broadcaster. On a daily basis, she dives deep into a variety of topics, trying to get to the bottom of cases. She makes documentaries. Interviews important speakers and political leaders. She brings us the news, about so many different, relevant topics. But the fact is, she isn’t the only one providing us the news. We live in a world of information overload. We get bombarded by all kinds of “news,” and we need to find our way through it because we simply cannot consume it all.

“Luckily,” companies such as Facebook and Google knew news consumption would become an issue for us. So they found a solution to our issue by creating our own online news gatekeepers. Little robots that make sure we only see the news we are interested in. Personalised news. How cool is that, you would think.

Last week I had a conversation on this famous ‘Algorithm’ topic. Facebook and Google are examples of companies using algorithms as gatekeepers. I was also advised to watch the TED talk by Eli Pariser. When you hear his story on the impact of algorithms, you would be surprised that this talk was filmed in 2011, seven years ago. It seems little has changed since then. It may even have gotten worse. Or could we say we finally seem to realize the huge impact it has? On our perception. On our democracy. On our lives.

“This is what I call the filter bubble” —Eli Pariser’s famous words explain what happens to us.

Thanks to the algorithm gatekeepers in our online world, Eli tells us, we live in our own personal universe of information. And everyone has a unique version of this universe. Because apparently what I get to see in my Google search results or in my newsfeed is different from what you would get to see. Eli Pariser explains that the information you get shown depends on (1) who you are and (2) what you do. But the biggest issue is this: neither you nor I can really decide what information or news we get to see. And maybe even worse: we don’t know what information or news we don’t get to see. What gets filtered out (by those gatekeepers). What we miss out on. We have no idea.
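To make those two ingredients concrete, here is a deliberately simplified toy sketch — not any real platform’s code, just an illustration of the idea — where a feed is ranked using a profile of your interests (who you are) and the topics you recently clicked (what you do). All names and numbers are made up for the example:

```python
def personalised_feed(stories, interests, clicks, top_n=2):
    """Rank stories by a simple interest score and return the top few.

    stories:   list of (headline, topic) pairs
    interests: dict mapping topic -> baseline weight ("who you are")
    clicks:    list of topics recently clicked ("what you do")
    """
    def score(story):
        _, topic = story
        base = interests.get(topic, 0)   # who you are
        behaviour = clicks.count(topic)  # what you do
        return base + behaviour

    ranked = sorted(stories, key=score, reverse=True)
    return [headline for headline, _ in ranked[:top_n]]
```

Two users looking at the exact same pool of stories would get different feeds, and neither ever sees the stories that fall below the cut-off — which is the invisible filtering the article describes.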

What happens is that most of the time we simply get to see just one side of a story. One perspective. Which makes it extremely hard for us to properly understand what’s going on. Or to judge certain topics. Or to vote for the right candidates. Recent events such as the election of Trump and the Brexit vote show the danger of hearing one single story or perspective. Getting one particular view portrayed. And assuming this is the truth. This is only one example of how people become extremely biased.

Algorithms have a huge impact on our perception. On our democracy. On our lives.

As said, my friend is a journalist. She works in an industry that originated to make sure we all get a good flow of information. About all that’s going on in the world. And in our society. Journalism has always acted as a filter too, but one with a certain civic responsibility: making sure we all knew what was going on. Her job is now basically also done online, by these robots. Although my journalist friend and the algorithm robots are both news gatekeepers showing us news, the biggest difference is perhaps transparency. Where my friend will try to make sure you get exposed to both sides of a story, these algorithms aren’t capable of doing so. They are encoded somehow. But we don’t know the program or the rules behind this filter of information. What are the rules that define what gets into our bubble and what stays out?

Pretty annoying when you think about it, right? But also quite interesting, knowing that the internet (or, beautifully, the World Wide Web) is meant to make the world connected and transparent. Yet, as Eli mentions, we sometimes seem to be more isolated in our own bubble than we might realize…

Thanks, Eli Pariser :-)
