
Facebook Can Address Fake News and Raise Media Literacy

Many Americans get their news through Facebook, some of it is fake, and it influenced the outcome of the presidential election. Fake news is more popular than real news. The motive behind fake news is disinformation or profit, and realistically both.

Much has been written about how this happened and what it means. Real news outlets will write about this issue to no end, and the issue is a real threat to Facebook, not just our democracy. Facebook did take a laudable step by banning fake news sites from its ad network. But that doesn't solve the problem.

The 1% can’t minimize this down to 1%. Source: Buzzfeed

The moment Facebook filtered the newsfeed, it was doomed to fail. There will be filter failure, whether the filter is algorithmic or human-edited. Some approaches scale better than others, and scale matters: there is no limit to potential news sources, and the audience is already almost everyone. False positives that filter out the wrong stories limit our freedoms. Bias isn't the problem; the problem is the lack of disclosure behind sources and stories.

When the social web was new, blogging let anyone publish alternative news. In blogging, the story shifted from what went through an editorial process before publication to what happened after publication. The story was a web of posts and links. Facts could be corrected by anyone in comments, posts, trackbacks, and everything else that is now a spam hole or worse. And with some social media literacy, you could grasp the story as true as it could be.

Today media is fragmented, platforms centralized, bots are rampant and confirmation bias is a business model. But part of the solution needs to work the way the internet works.

Facebook should continue to let people share news that could be fake, but it should proactively have the product inform people about the risk of fake — and raise media literacy.

If a story is dubious, serve a media literacy widget as the first related story

One potential solution is a widget, the size of a "People Also Shared" story or "Suggested Videos" module, attached to the story in question. At a glance, a person should be able to tell whether the story is dubious. Clicking through would teach them about the source and context of the story.

This is a disclosure statement, similar to how Facebook discloses what it knows about you and shares with advertisers. Facebook already has a mechanism on hand to filter out fake news; it could repurpose that logic in a way that is hard to game and easy to understand, so people can be informed.
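To make the idea concrete, here is a minimal sketch of how such disclosure logic might work: take the signals a platform already collects about a story's source and turn them into a widget payload instead of a silent filter decision. Every signal name, threshold, and field here is invented for illustration; Facebook's actual systems are not public.

```python
# Hypothetical sketch: convert known credibility signals into a disclosure
# widget payload rather than silently filtering the story. All names and
# thresholds are assumptions made up for this example.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Story:
    url: str
    domain_age_days: int      # newly registered domains are a common fake-news signal
    flagged_by_users: int     # number of reader reports
    fact_check_disputes: int  # disputes filed by fact-checking partners

def disclosure(story: Story) -> Optional[dict]:
    """Return a disclosure payload for the widget, or None if the story
    raises no signals and no widget is needed."""
    reasons = []
    if story.domain_age_days < 90:
        reasons.append("Source domain registered recently")
    if story.flagged_by_users >= 100:
        reasons.append("Flagged by many readers")
    if story.fact_check_disputes > 0:
        reasons.append("Disputed by fact-checkers")
    if not reasons:
        return None
    # More than one signal escalates the label; a single signal stays soft.
    label = "Dubious" if len(reasons) > 1 else "Unverified"
    return {"label": label, "reasons": reasons}
```

The key design point is that the function never hides the story: it only decides what to disclose alongside it, which keeps the approach informative rather than censorial and is harder to frame as a false positive.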

— 30 —



Written by

CEO & Co-founder of Pingpad. Previously LinkedIn, SlideShare, Socialtext, RateXchange. Husband of Leila Al-Shamari, and Father.
