I can tell how you’ll vote based on your social media feed
Bubble, bubble, toil and trouble. It seems the world is becoming increasingly politically polarised. Is the social media ‘filter bubble’ to blame?
The changing digital landscape has meant a change in many aspects of our lives. A major part of this is the evolution of the internet, as we discussed in Topic A. Amongst the changes that were highlighted during the session was the move from traditional TV and music consumption to streaming services, as well as the changing space of social media. One aspect we only lightly touched upon was the way we receive and digest our news and information. This article highlighted how access to the wider world has become a lot easier since the internet was conceived.
Around 55% of adults now access news online rather than through old-fashioned printed papers. News is becoming more instantaneous and more tailored; we have essentially become our own editors, choosing what we click on and what we see. Or so we think. In 2011, Eli Pariser wrote a book about digital algorithms and a term he coined: ‘the filter bubble’. He also delivered a TED talk about it:
Pariser accuses Facebook and Google of an “invisible algorithmic editing of the web”, showing us what we want, and not what we need. Does this ‘filter bubble’ really exist? If so, does it have an impact outside of the digital world?
Blown Out of Proportion
Okay, so social media platforms show us information and articles that we like. It’s only for our benefit, right? What harm can that do? Well, according to this wired.com article, it can do a whole lot. It suggests that the political media we’re exposed to online translates into the offline world in our votes. This is not unreasonable: the attitudes we form, no matter how and where they’re formed, influence our behaviours. With the general election coming up (regardless of how much we don’t want it), it’s an important issue to discuss.
But it seems that the magnitude of the effect of these ‘filter bubbles’ is under debate, and some disagree over their very existence. A study by the National Bureau of Economic Research revealed data showing that social media does not increase the likelihood of polarised political views. In fact, the demographic least likely to use social media, those aged 65 and over (and especially 75 and over), was the group with the most polarised views. At face value, this could be taken to mean that social media clearly has little effect on our political views.
Pariser, the brain behind the ‘filter bubble effect’, argues against such findings in the above article. He points to another study arguing that individual choices and friends matter more than the algorithms. Even Pariser admits that the effect size of the ‘filter bubble’ phenomenon is smaller than first impressions suggest, but it is statistically significant all the same. Putting aside the fact that Facebook was behind that study, it’s true our initial choices are important; algorithms need to learn from somewhere. But the ‘bubble effect’ is the perpetuation of those choices: it pushes us towards making the same choices again because we’re only ever shown select options.
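That feedback loop can be sketched in a few lines of code. To be clear, this is a toy simulation of the general idea, not any real platform’s ranking system: a hypothetical ranker starts with equal weights for five made-up content categories, and every click nudges the clicked category’s weight upward, so the feed drifts towards whatever the reader already engages with.

```python
import random

random.seed(42)  # fixed seed so the toy run is reproducible

CATEGORIES = ["left", "right", "sport", "science", "culture"]

def recommend(weights, k=5):
    """Sample a feed of k articles, in proportion to the learned weights."""
    cats = list(weights)
    return random.choices(cats, weights=[weights[c] for c in cats], k=k)

def simulate(rounds=50, preferred="left"):
    """A reader who only clicks one category; the ranker upweights every click."""
    weights = {c: 1.0 for c in CATEGORIES}   # start with no preference at all
    for _ in range(rounds):
        for article in recommend(weights):
            if article == preferred:         # the only articles they click...
                weights[article] += 1.0      # ...are the ones the ranker reinforces
    return weights

w = simulate()
share = w["left"] / sum(w.values())  # fraction of the feed's weight now on 'left'
print(f"share of preferred category after 50 rounds: {share:.2f}")
```

Even though every category starts with an equal weight, the clicked one ends up dominating the feed: the reader’s own early choices are amplified round after round until the alternatives all but disappear, which is exactly the perpetuation described above.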
Behind the Bubble
Social media ‘bubbles’ are an example of what psychologists like to call confirmation bias. Here’s a little more information about it:
Imagine that you have tried to reach a friend (with whom you have an ambivalent relationship) by phone (or email)…www.psychologytoday.com
It’s that human tendency to pick and choose the information you take in based on whether it fits your already-formed picture of a given topic. It is no new concept, and its offline consequences have been widely studied. What is relatively new is the development of digital ways of communicating, which make it even easier to pick and choose what we see. On Facebook, if you don’t like someone’s opinions, you click ‘unfollow’. Isolation is just a few clicks away. Perhaps this is the reason there is an impression that social media has enhanced the polarisation of political views.
A writer returns to his hometown — once isolated, now connected — to conduct an experiment and answer the unanswerablebackchannel.com
Going back to Topic A of the Digital Society module: the internet is a new thing, but society is not. Sorgatz’s exploration (above) of his isolated hometown turned up fewer differences than he expected. I think this reflects the fact that behind all the ones and zeroes are humans. A digital society is not so different from the one we had before the internet; it’s just faster, bigger and more powerful because of it. This inevitably comes with new responsibilities and the need to adapt the way we navigate such a society. This is why I think an awareness of the ‘filter bubble’ is needed, but also caution in how much responsibility we place on it. As Pariser also highlights, an awareness of how algorithms work allows us to be more conscious of our online interactions and attitudes.
Bursting the Bubble
In light of this, how do we burst the bubble? This TED talk playlist is a brilliant place to start. The article below also gives some tips on how we can improve our social media usage and broaden our horizons.
In the wake of the US election, concerns are surfacing over the filter bubbles that mediate the information people see…www.newscientist.com
It’s our own responsibility as individuals of the internet age to use social media with an awareness of its pitfalls. We need to be aware that what we read and do online affects the world offline. That effect can be both negative and positive, and how we use the internet determines which it will be.
“Empathy and respect”
Robb Willer (below), a social psychologist, carried out research to figure out how opposing political sides can converse. He describes the different moral underpinnings of each side, and how the language we use in conversations and persuasive pieces can affect people’s perspectives on an issue. Exposing ourselves to the other side doesn’t have to be aggravating and painful if we all remember to do so with empathy and respect: empathy for the underlying morals that lead to others’ beliefs, and respect for their choices.
My digisoc Story
I chose this module because I wanted to learn more about the growing digital world, where businesses seem to be focusing more and more of their attention. Another thing that excited me about the module was using Medium. It’s something I was already fairly familiar with, but also something I wished I used more. I saw this module as an opportunity to build my profile further.
I came into the first session of Digital Society with a pretty optimistic view of technology and the digital world. Blame it on having two software engineers and tech enthusiasts as parents. Many aspects of this course definitely added to that optimism. For example, the Experiencing New Technology session and the visit to DigiLab, with its array of smart technology, were quite inspiring. It led me to think about how we can use virtual reality technology not just in games but also to aid human life. This is something I discussed further in my digisoc2 presentation: can virtual reality help those struggling with mental health difficulties?
This is just one example of how this module has broadened my knowledge of what a ‘digital society’ is, and could mean, in the future. This particular assessment also broadened my communication skills. It was the first time I had come across the Pecha Kucha style of presenting. Explaining a fully formed idea in such a concise way is a difficult skill which I’m glad I got to practice.
A more challenging experience of this module came along when discussing Topic E: “The Individual”. As I’ve highlighted through my analysis of social media filters and bubbles, confronting ideas and thoughts opposed to your own can be difficult and uncomfortable. I found this week challenging for that reason. Although it was a very interesting session, it also made me think in ways I didn’t expect to. I didn’t want to acknowledge the dangers of ‘too much’ technology and the dark side of the internet because, in all honesty, I rely on it too much.
We all ideally want the internet to be a safe space with freedom of opinion, but striking the balance between freedom and monitoring is a challenge we all have to work towards as a digital society. By the end of the session, however, I was thankful for the experience. It served as a reminder of how powerful the digital world can be, for both good and evil. I think this awareness is the first step in finding that balance.
Ultimately, I think I achieved the goals I set out before the module. It has definitely opened up ways of thinking about the digital society that I had not previously considered. In doing so, I’ve also developed my communication skills and critical analysis. As well as encouraging me to think critically about the future of the digital society, the module has made me want to dive deeper and learn more. I’d like to learn how to code, and to continue blogging on Medium.