Facebook, Stop Feeding Us What We Want to Read.

Peggy Choi
Published in The Coffeelicious
4 min read · Nov 11, 2016

Never before has my Facebook been as heated and dogmatic as during Trump v. Hillary.

On election night, Stephen Colbert queried, “How did our politics get so poisonous?” Is it, as Colbert claims, because we “overdosed” on the poison this year, out of some strange enjoyment of that “gentle high” that comes along with condemning the opposing side?

A growing body of statistics shows that we are increasingly turning our backs on traditional newspapers and relying instead on social media for our news. A 2016 Pew Research Center study found that 62% of American adults now get news through social media, regardless of the potential bias of the algorithms behind it.

For those who are unfamiliar, Facebook's newsfeed algorithm weighs a number of variables, such as:

  • How often you interact with the friend, Page, or public figure who posted the story.
  • The number of ‘likes’, shares and comments a post has received from the world at large, and from your friends in particular.
  • How much you have interacted with this type of post (both format and content) in the past.

This algorithm scans everything posted in the past week by each of your friends, everyone you follow, each group you belong to, and every Facebook page you have ever liked. The result of this automated process is your newsfeed, i.e. the place where, according to Pew, 44% of Americans now get their news. Incredibly, this newsfeed shapes the social lives and reading habits of at least one fifth of the world’s adult population.
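The signals listed above can be pictured as a toy relevance score. To be clear, Facebook's actual algorithm is proprietary and uses far more signals; every function name, field and weight below is a hypothetical illustration of the general idea, not the real formula:

```python
def rank_feed(posts, weights=(2.0, 0.1, 1.0)):
    """Order candidate posts by a toy 'relevance' score built from the
    three signals described above: affinity with the poster, overall
    engagement, and your history with this type of post."""
    w_affinity, w_engagement, w_history = weights

    def score(post):
        affinity = post["interactions_with_author"]   # how often you engage with the poster
        engagement = post["likes"] + post["shares"] + post["comments"]
        history = post["similar_posts_clicked"]       # your past interest in this kind of post
        return w_affinity * affinity + w_engagement * engagement + w_history * history

    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "friend-op-ed", "interactions_with_author": 9, "likes": 40,
     "shares": 5, "comments": 12, "similar_posts_clicked": 8},
    {"id": "outside-view", "interactions_with_author": 0, "likes": 200,
     "shares": 30, "comments": 50, "similar_posts_clicked": 1},
]
ranked = rank_feed(posts)
```

Notice what even this crude sketch does: a like-minded friend's post can outrank a far more popular post from outside your circle, which is the basic mechanics of an echo chamber.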

However, your social media feed is likely to be an echo chamber of stories shared by like-minded individuals with similar backgrounds, tastes and opinions, often serving only to further insulate you in an online ‘filter bubble’.

My newsfeed during the run-up to Brexit, for example, was predominantly made up of members of the ‘Bremain’ camp (as I recall, only one or two people in my network publicly supported Brexit). My friends’ opinions on why we should ‘obviously’ stay were echoed back to them by their various school friends, colleagues and extended family members, thus reinforcing their individual belief systems.

Those who argued back tended to be peripheral friends-of-friends, who were quickly denounced as either ‘stupid’ or actively ‘trolling’ (starting arguments with the deliberate intention of provoking readers into an emotional response).

Statistics show, however, that professionals with university degrees were far more likely to vote Remain, whilst working-class voters (UKIP’s main supporters) staunchly advocated leaving the EU.

Now, I’ll freely admit that my network largely belongs to the former group, with many of my English friends hailing from London and the Southeast, so its allegiance to the ‘Bremain’ camp is hardly surprising.

Newsfeeds, as they stand today, therefore, aren’t so much a discussion as a one-sided discourse. These ‘filter bubbles’ simply bolster our present world-views; world-views which, for lack of challenging viewpoints, appear more universally accepted than they actually are. The fact that my newsfeed is almost entirely pro-Hillary and anti-Trump says more about my network than it does about the electorate. Clearly, my friend’s dad in Ohio sees a very different feed than I do, polar opposite in fact, and every article and post he sees further reinforces his political stance, just as my newsfeed does mine.

Facebook, and even Google, actively ‘curate’ the feeds they serve us. As a result, the extremes become more extreme, as people at either end read and consume only thoughts and opinions that support their own.

Remember that “poison” Colbert spoke of? That hate? Well, that hate is heightened by the ways in which many sites are currently curating their newsfeeds.

The key question I want to ask, therefore, is —
are we doing “curation” right?

For information and knowledge, shouldn’t curation showcase multiple viewpoints from different sides, instead of just feeding us the most “relevant” information, the information we want to hear? Aren’t knowledge and wisdom all about exposure to different ideas, thoughts, and notions, so that one can form a truly balanced view, and actually develop empathy?
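One way to picture what viewpoint-aware curation could look like is a simple re-ranker that interleaves the best posts from each side, rather than letting one side fill the feed. This is purely a hypothetical sketch: the stance labels, field names and round-robin strategy are all my own assumptions, not anything a platform actually does.

```python
from collections import defaultdict
from itertools import zip_longest

def diversify(ranked_posts, top_n=4):
    """Re-rank a relevance-ordered list of posts so that every stance
    present gets a turn near the top, round-robin style, instead of one
    viewpoint dominating the whole feed."""
    buckets = defaultdict(list)
    for post in ranked_posts:          # posts arrive ordered by relevance
        buckets[post["stance"]].append(post)
    # Interleave stances: best post of stance A, best of stance B, second
    # best of A, and so on; drop the padding zip_longest inserts.
    interleaved = [p for group in zip_longest(*buckets.values())
                   for p in group if p is not None]
    return interleaved[:top_n]

feed = [
    {"id": "r1", "stance": "remain"},
    {"id": "r2", "stance": "remain"},
    {"id": "r3", "stance": "remain"},
    {"id": "l1", "stance": "leave"},
]
balanced = diversify(feed, top_n=3)
```

In this toy example, a feed that was three-to-one ‘Remain’ would now show a ‘Leave’ post in second place, exactly the kind of challenge to our certitude that today’s relevance-only ranking filters out.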

If our newsfeeds displayed a greater variety of opinions, would anyone be as staunchly pro-Trump or pro-Hillary? Would we continue to denounce the other side with one hundred per cent conviction, or would a more rounded distribution of information plant a seed of doubt in our certitude? Shouldn’t we always be critical of our opinions and beliefs? Shouldn’t we constantly strive to challenge our ‘truths’ by uncovering assumptions, and distinguishing what we ‘know’ from what we don’t?

People are more complicated than mathematical models and algorithms. Facebook, Google and their peers need to deeply question their approaches to content curation — they are great companies with both the privilege and the position to shape how people think and act, and to help people understand each other — to empathize.

Shouldn’t this be their duty to the world?


Peggy Choi

Founder & CEO of Lynk, an intelligent curation platform of experts http://lynk.global