Popping the Filter Bubble with AsapTHOUGHT

Megan Elias
RTA902 (Social Media)
3 min read · Jan 24, 2017

RESPONSE TO BLOG PROMPT #3: SHOULD SOCIAL MEDIA COMPANIES BURST FILTER BUBBLES?

Last week’s lecture centred on the algorithms social media platforms use to create a “filter bubble”, a phrase that describes the invisible algorithmic editing of the web. The filter bubble works by taking the content you have previously interacted with on social media (photos, tweets, links, ideas) and serving you more of the same, while eliminating content that is contrary and likely undesirable. For this reason, the news articles that appear on your Facebook timeline may look very different from those that appear on your neighbour’s.

Social media companies use this algorithm to tailor each individual’s feed according to their interests, opinions, and political views. They analyze what you like, and give you more similar content so that you keep going back. The filter bubble can be a great tool in providing users with favourable content, but it also has a dark side.
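The real ranking systems are proprietary, but the feedback loop described above can be sketched as a toy content-based ranker. (Everything here — the `rank_feed` function, the topics, the sample posts — is illustrative, not the actual Facebook algorithm.)

```python
from collections import Counter

def rank_feed(candidate_posts, click_history):
    """Toy content-based ranking: posts on topics the user has
    clicked most often float to the top, pushing contrary content
    to the bottom of the feed."""
    interest = Counter(post["topic"] for post in click_history)
    return sorted(candidate_posts,
                  key=lambda post: interest[post["topic"]],
                  reverse=True)

# A user who has clicked mostly anti-Trump content...
history = [{"topic": "anti-trump"}] * 5 + [{"topic": "sports"}]
candidates = [
    {"id": 1, "topic": "pro-trump"},
    {"id": 2, "topic": "anti-trump"},
    {"id": 3, "topic": "sports"},
]

feed = rank_feed(candidates, history)
# ...sees their dominant topic first, and the opposing view last.
```

Run the loop a few times — clicks on the top-ranked posts feed back into `click_history` — and the contrary topic effectively disappears from view, which is the bubble.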

Mitch and Greg, the Torontonian duo behind AsapSCIENCE and AsapTHOUGHT, uploaded a YouTube video a few weeks ago called “Did Facebook Elect Trump?”. The video discusses the reasons for the huge spike in fake news articles that circulated on Facebook in the months leading up to the US election, and how this fake news may have influenced the outcome. It explains how fake news articles reinforce the filter bubble, causing people to think that the news they see is the only news, while remaining blind to all the information on the web that expresses opposing views.

Here is a snippet from 2:25–2:40 of the video:

“When we go on Facebook, we’re seeing the same types of content that’s catered to the way that we already think. It doesn’t allow us to be challenged and understand diverse minds… this is really dangerous when it comes to deciding what our ideologies are” (Greg Brown, AsapTHOUGHT).

In these few sentences, Greg perfectly explains the downsides of the filter bubble algorithm, and why it’s important to be aware of this filter so that we don’t become ignorant of those whose ideas conflict with our own.

In the weeks leading up to the election, I had no doubt that Trump would lose. Everyone in my community and social circle was anti-Trump; all the news articles, statuses, memes, and viral videos were anti-Trump. It didn’t seem fathomable to me that there was a whole other world in which Trump was seen as a favourable candidate for president of the United States. This was because my web browsing sessions were being tailored to me. I clicked on anti-Trump news articles (many of which were probably fake), so anti-Trump content was shown in my newsfeed. My friends had the same opinions as me, because, well, they’re my friends. I was completely unaware that there were so many people in the US who loved Trump, because I simply wasn’t being exposed to them.

To answer the original question, I would have to say no: I do not think social media companies have a moral obligation to burst filter bubbles. Instead, I think the public has a moral obligation to be aware of these filters and to be wary of the fact that so much content out there is intentionally being hidden from us. Social media companies are private; they are not affiliated with the government, at least not officially. Their main goal is to make money, and if they choose to do so by capitalizing on our ignorance, that’s our own fault.

On a more positive note, knowing that the filter bubble exists will be extremely helpful the next time the world is seemingly split in two. If more and more people are made aware of this algorithm, we can start being more open-minded to the reality that it is not in Facebook’s, Twitter’s, or even Google’s best interest to give us the full, unbiased truth. With this knowledge, we can actively search for content contrary to our own ideologies and find material that would not otherwise have been shown to us. And if we do, maybe the 2020 election results will have a more favourable outcome.
