Facebook: Error 404 — Trustworthy News Not Found

Benjamin Barteder
Published in Content Mines · 6 min read · Feb 8, 2018

So, Facebook is overhauling its news feed again. One cornerstone: making trusted news sources more visible. Unfortunately, with the wrong approach.

Last Tuesday, I was at uni with my fellow students, participating in a discussion on how Facebook is reworking its news feed. While I believe that a lot of Facebook’s arguments make sense in the statement by Adam Mosseri (head of news feed), one part of it bothered me a lot: the quest to ensure that the news we consume on Facebook comes from so-called “trusted sources”.

Quantity Is Key

Let’s face it: Facebook has changed the way we consume news and gather information. We scroll through our feed, clicking on an article here, watching a funny video there, commenting or liking this and that post … We’re meandering through our feed, spending more and more time scrolling, taking in lots of different thoughts and ideas. Facebook values quantity, as Tara Hunt points out. That means Facebook wants you to spend more time, give more likes, leave more comments, share more often … you name it.

So logically, Facebook will always optimize towards exactly these metrics to make sure its advertising platform stays profitable. And that would be all fine and well, if only the world were a merry place where everything we see is what we get, true and meant to do no harm.

The News Is Fake (Sometimes)

The reality is that algorithms that value quantity can be gamed. As long as you trigger enough engagement, you can and will get your message across on Facebook, no matter how inflammatory your posts are, how hateful your language is, or whether you’re straight up lying. The articles on this problem are already out there; I am not going to rehash how Russia meddled in the 2016 U.S. election through ads on Facebook.

This post is about how Facebook wants to tackle all of this with an international survey that was announced at the beginning of January 2018.

Facebook’s Survey Is Reductive

To recap, here’s what kind of news Facebook wants to prioritize:

  • News from publications that the community rates as trustworthy
  • News that people find informative
  • News that is relevant to people’s local community

I’m going to zoom in on the term trustworthy. Essentially, Facebook will run a survey among users to determine this trustworthiness. The problem is, the survey does not ask about trustworthiness, but about subjective trust. As it so happens, BuzzFeed published the survey just recently:

Do you recognize the following websites?

  • Yes
  • No

How much do you trust each of these domains?

  • Entirely
  • A lot
  • Somewhat
  • Barely
  • Not at all

Two questions. That’s it. Trust a domain to do what? Why are you not asking me why I trust this domain? Also, you’re not asking about trustworthiness, you’re asking about my subjective feeling of trust; why is that? I have so many questions.

Why does Facebook ask that? Well, outlets that rank well on this survey will most likely be more visible in your feed, while the ones that don’t will probably see a decline in their reach numbers. Combine this with professional journalists as fact checkers (announced in 2016) and ta-da, the problem is fixed, right?

No. I’m sorry, but if Facebook hopes to fix the spread of fake news on their platform with this, the company could not be more wrong. Firstly, because the fact-checking effort is failing. Secondly, because this survey misses the mark completely.

Trust Is Comfortable

Why, you may ask? Because frankly, it does not matter if one person reads the New York Times and thinks he or she can trust this source, when at the same time another person thinks Breitbart is a great source for balanced political reporting.

“People trust other people and things for all sorts of bad reasons to do all sorts of bad things.” — Russell Hardin

Trusting someone is subjective; it might be based on personal experience, relationships, or your own motives. Simply trusting a news outlet does not make its articles true, and it does not prove that the research was thorough. “Trust” is what someone feels, to quote Adam Rogers here.


Repeat that: trust is what someone feels. Adam Rogers explains it quite well in his article: people may disbelieve an expert but trust a person close to them. We might trust someone because it underscores our confirmation bias. Because it’s comfortable. Does that mean that simply because we trust, we got the facts straight? Rather not.

Trust ≠ Trustworthiness

What separates accurate reporting from wrongful reporting is how trustworthy a news outlet is. Insightful reporting should aim to investigate, challenge, be transparent about the writer’s point of view, and give context. Journalists need to be uncomfortable and maybe even anger people sometimes to present facts.

“… journalism isn’t supposed to reaffirm world views.” — Adam Rogers

Journalism is not about whether Tom or Karen trust a certain news outlet based on their very own experience. It is not about whether or not they trust because they are familiar with a certain source because — oh snap — they have seen posts of that outlet more frequently due to Facebook’s algorithm adjusting to their specific behavior. Ever heard of echo chambers? Facebook, I thought we’d been through this already?


Journalism is about facts. About considering more than one side of a story. About finding out the truth. About being objectively trustworthy, because conclusions are based on thorough research. About being honest and correcting mistakes when they happen. About explaining the work journalists do. And maybe this last part is something that needs to be improved to earn the trust that Facebook seems to be looking for.

Facebook May Not Be Looking For Trustworthy News

Or if it is, this survey won’t cut it. It will not eliminate the problem of fake news being spread on the platform. In the best case it may accomplish nothing; in the worst case, disseminators of “alternative facts” (I hate to use this term here) may benefit from it because they are more widely known in their respective echo chambers.

Why so pessimistic? According to the 2018 Edelman Trust Barometer Global Report, media is the least trusted institution worldwide; 63 percent of the people surveyed state they cannot even tell the difference between good journalism and wrongful reporting. The report even characterizes 2018 as the year of “the battle for truth”. So why would Facebook throw the task of deciding what’s “good and bad journalism” at the community with a two-question survey?

Even Facebook is considered part of the media sphere today. As such, users may not even believe Facebook to always tell the truth. And so the social media giant emphasizes again: “First and foremost, we’re a tech company, not a media company.”

Truth is, Facebook itself needs to regain the users’ trust. That won’t work, though, with fake news on their own platform. And it will harm their business model, starting for example with advertisers getting concerned about brand safety. And maybe ending with people getting reliable information elsewhere.

At the same time, Facebook wants to be impartial and avoid being accused of favoring one news organization over another. With this survey, Facebook places the question of how to establish trust in and trustworthiness of news firmly in the community’s hands, so that it can adjust the algorithm accordingly. And it forgets that — since we are on social media — with this set of questions, the survey might turn into a popularity contest after all.

Benjamin Barteder

digital marketing guy | content strategy student | GIF freak