The Facebook echo chamber is real…
On Facebook, you are never more than just a few clicks away from silencing views you disagree with. During the gritty US election campaign, more often than not, ignorance was bliss.
Click that mute button, and an algorithm amplifies the effect in an instant. Press a few more times and offending individuals and disagreeable posts are largely banished from your News Feed altogether.
As a YouTube vlogger with a keen eye on my viewer statistics, I am well aware of how its algorithmic ecosystem benefits me as a content creator. A very large proportion of my viewers arrive at my channel because YouTube has suggested it to them.
The more optimised content I produce, the more my videos are suggested to other like-minded people by YouTube’s algorithms. My vlog content is rarely political and mostly uncontroversial, so, for the most part, that’s probably OK. Most people won’t mind having my channel suggested to them because they are probably interested in the same things I am, making this kind of echo chamber pretty healthy.
A lot of online and social media companies use similar algorithms for suggested content, and they serve two distinct purposes. The first is to keep attention on-platform, and suggesting content that people will like is the most effective way to do this. YouTube wants to keep people on-platform by getting them to watch more of the videos they want to watch, as I said above. Right here on Medium, too, the algorithms are designed to get you to spend more time reading articles. In retail, Amazon is set up to show you the products you are most likely to buy (so it makes more sales). Whether it’s to get you to consume content or to make purchases, online and social media brands want you to stay with them for longer.
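To see why this kind of suggestion loop narrows what you are shown, here is a deliberately tiny Python sketch. It is purely illustrative: the video names, topic weights, and update rule are all invented for this example, and no real platform works exactly like this.

```python
# Toy model of a "suggested content" feedback loop.
# Purely illustrative -- not any real platform's algorithm.

def similarity(a, b):
    """Dot-product similarity between two topic-interest vectors."""
    return sum(x * y for x, y in zip(a, b))

# Hypothetical catalogue: each video tagged with (travel, politics) weights.
videos = {
    "iceland_vlog":  (0.9, 0.0),
    "packing_tips":  (0.8, 0.1),
    "election_rant": (0.0, 0.9),
    "debate_recap":  (0.1, 0.8),
}

# The user starts slightly more interested in travel than politics.
profile = [0.6, 0.4]

picks = []
for _ in range(3):
    # Suggest the video that best matches the current profile.
    pick = max(videos, key=lambda v: similarity(videos[v], profile))
    picks.append(pick)
    # Watching it nudges the profile toward that video's topics,
    # which makes similar videos even more likely next round.
    profile = [0.7 * p + 0.3 * t for p, t in zip(profile, videos[pick])]

print(picks)
print([round(p, 2) for p in profile])
```

Run it and the same travel video wins every round while the politics weight in the profile steadily decays: a small initial preference, reinforced by each interaction, crowds out everything else. Swap travel for political leanings and you have the echo chamber in miniature.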
The second reason they suggest content is genuine user experience. Suggesting relevant content is not always designed simply to make more money from you, but to give you a better experience on a platform by surfacing content you will enjoy. Again, this is no bad thing. For digital marketers and content creators, it gives us a pitch to play on.
None of this is necessarily bad; it’s just modern-day business. If people are interested in travel, for example, YouTube may well suggest my six-part video series on the marvels of adventuring to Iceland to them. Harmless stuff.
Skip forward to the 2016 US presidential election, and this seemingly happy relationship between social networks’ algorithms and users’ attention comes sharply into focus. Remember one important point: a large share of US adults now get much of their news from Facebook. By default, that extends to political and election coverage. So we now have a context in which algorithms decide what political news people will see.
As I’ve noticed with members of my own family, the political echo chamber can be a very deep, dark rabbit hole to fall down. In a matter of hours, with a few liked posts and a few mutes, the vitriol becomes fiercer and more one-sided. I’ve watched how quickly people get sucked into one-sided, angry tirades at their laptops. I don’t want to exonerate people from responsibility, because individuals believe the things they want to believe, but what social media platforms give them is a place where, if that’s what they want, no opposing views will come to challenge their picture of the world. All it takes is a few clicks.
Mark Zuckerberg raced out to defend Facebook following suggestions that his platform was to blame for the election result. Of course it wasn’t, but I don’t like the way he deflected the issue by saying “fake” news didn’t win the election. That misses the point. Fake news from both sides certainly circulated freely during the campaign, but it is the algorithmic design that shows people what they want to see that should now be debated in an honest way.
Even me. I am the proud owner of a cleansed and perfect (from my point of view) Facebook News Feed. I mute, block and delete abhorrent views mercilessly. Oh, and there have been many opportunities over the last year. Mute, block, delete. Click, click, click.
My feed now shows a perfect world because I have instructed Facebook that it’s the world I want to see. At our own individual levels, this can indeed be described as bliss, but the bigger picture is more frightening.
Blocking out the other side, which, as if we needed the election results to show it, exists in abundance, is perhaps unhealthy when we consider that social media is the new BBC, CNN, Fox, CNBC, New York Times and the rest. Ignorance is not the answer.