Facebook Needs A Trust Button For News

This week, Buzzfeed reported on the quality of news being shared on Facebook. It is a great read, and you should run over to Buzzfeed.com and give it a look if you haven’t already.

One of the biggest findings was the high engagement rate of stories that have no factual content.

https://www.buzzfeed.com/craigsilverman/partisan-fb-pages-analysis?utm_term=.igw5yjvGGd#.ajmxBGDOOp

While this is troubling in its own right, it is even more worrisome when you realize that nearly 50% of online adults consume news through the Facebook newsfeed.

Given these two pieces of data, it is likely that a significant portion of Americans are seeing fake stories framed as news.

Within Facebook, engagement drives reach. This means that even now, if a Facebook user were to comment on a post saying the story was false, that act of engagement would continue to fuel the reach of the offending post through the Facebook ecosystem.

The Facebook newsfeed is only as good as the stories being shared in it. Right now, the quality of many “news” posts is suspect. So, how does Facebook fix this?

I am wary of Facebook’s human editors introducing bias and of algorithms misidentifying stories. That is why I present to you the…

Trust Button

Facebook has already introduced a number of “reaction” buttons, so why not include a button that communicates trust? It would allow users of the platform to endorse the accuracy and reporting of a post, while preventing them from accidentally aiding the spread of fake news by engaging with it.

What do you think? Good idea? Bad idea? Comment and let me know.

— Matt Karolian