The Dark Side of Facebook?
You’ve probably seen lots of talk about the Trending Topics box on Facebook over the past week or so. This is the little box in the top right of your Facebook feed — very valuable screen real estate that is displayed prominently every time you log into Facebook. Here’s why it’s been in the news and what the implications are.
The root of the issue lies in how exactly these ‘trending topics’ are deemed to be trending. You might have presumed they were selected by an algorithm — most people did — but our understanding is that it is actually done by a team of in-house “news curators” who hold the power over which stories make the cut. These in-house journalists essentially look at what is trending on Facebook organically, judge what deserves to be a ‘Trending Topic’, and write a headline for each. Crucially, these curators have the ability to blacklist certain pieces of news from ‘trending’.
Obviously these people therefore hold a lot of power — they get to choose what news is served to Facebook users in this highly prominent section of the site. They get to set the agenda.
This all flared up last week with a Gizmodo report in which a former ‘news curator’ alleged that the team routinely suppressed conservative news. The report claims that the curation team would often prevent right-wing stories from appearing in the section, even though they were organically trending amongst Facebook users. Other curators interviewed disputed this claim, but all agreed that they did have the power to insert stories into the trending list even if they were not actually trending organically.
This revelation has really hit home with lots of people — why should a ‘trending’ box be curated by humans, who will inevitably have their own biases, especially when those decisions might have been influenced by a political agenda?
A Human Touch
This controversy matters so much because we presumed that Trending Topics were selected by an unbiased algorithm, but we now know that human judgement is involved. The issue is that people view Facebook as neutral infrastructure, and these allegations suggest that it has an editorial agenda.
Remember, it isn’t just Facebook that have moved into this realm of news curation. Twitter have done it with Twitter Moments. Snapchat with Discover. Apple with Apple News. (Not to mention the thousands of newspapers and media companies who have made these sorts of editorial decisions since time immemorial…) These are all attempts to add a human touch to the decision about what constitutes news and what doesn’t.
So here lies big question #1:
how can a human team of news curators possibly remain unbiased?
Surely everyone is biased in some way because of who they are. Everyone naturally interprets ‘news’ differently.
This story has prompted people to question a lot more about social media and its potential bias — and whether this is actually an important issue at all.
The Echo Chamber
Take this debate up one level, and look at the theory of social media as an ‘echo chamber’. This suggests that social media just echoes your existing perceptions about the world. This is because you are naturally going to connect with people who align themselves with how you think. You’re going to follow journalists that align themselves with how you interpret news. You’re going to interact with brands that align themselves with your lifestyle.
All of this combined means that the content you see on your feed is going to reinforce your existing beliefs as opposed to challenging them with new ones. We know that the news feed algorithm shows you things it believes are relevant to you, based on what you have interacted with in the past — merely reinforcing this concept further.
We have arrived at big question #2:
Is it a problem if social media reinforces our pre-existing beliefs?
The echo chamber theory suggests that it’s not just humans that are biased — algorithms are also inherently biased because they show us what we love to see.
This could be seen as a counter-argument to the backlash against curated news. It might actually be a good thing that humans choose which news to serve you, because this disrupts the echo chamber cycle.
There is going to be a lot of pressure on Facebook to offer more clarity and to ensure neutrality as far as possible in the Trending Topics section. They have already launched an internal investigation, and Zuckerberg has scheduled meetings with leading conservatives to hear their views on the issue. This is such a massive deal for Facebook because a perception of bias is incredibly bad for business, and they have to be seen to be taking it very seriously.
We’ll see how they deal with this over the next couple of months. If anything, a wider debate about the merits of human curation as opposed to automatic algorithms will be an interesting one to watch unfold.