“Truthrank” can’t be transparent enough for this job.
44% of Americans get news from Facebook. They aren't going to be satisfied with a proprietary algorithm deciding what counts as misinformation. Facebook's best intentions won't save it, even if the company listens, as Zuckerberg has promised it will.
There’s no way they can satisfy everyone. The best they can do is agree on some rules and share them. When stories are found to break them, the reasoning should be made public, too.
This requires humans. And those humans can’t work for Facebook.
I’m thinking it’s a non-profit, call it Truth.org. Facebook would send them the top suspicious stories, along with user reports of falsehood, etc. Truth.org evaluates, then responds with a letter grade and a link to the reasoning.
Truth.org needs a board and staff reflecting diverse political perspectives, obviously. And they'll need good technologists, too. The back-and-forth with Facebook (and other platforms) should be done via an API. Online volunteers will be needed to help the professional fact checkers scale (flagging duplicate versions of a false story, for instance).
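The exchange described above could be sketched as a simple API contract. Everything here is hypothetical: the field names, the `evaluate` function, and the `truth.org/verdicts` URL scheme are all assumptions for illustration, and in a real system a flagged story would be queued for human fact-checkers rather than auto-graded.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical payload Facebook might send to Truth.org:
# a flagged story plus aggregate user reports of falsehood.
@dataclass
class FlaggedStory:
    story_id: str
    url: str
    headline: str
    user_reports: int  # how many users flagged it as false

# Hypothetical response from Truth.org: a letter grade and a
# link to the published reasoning behind the verdict.
@dataclass
class Verdict:
    story_id: str
    grade: str
    reasoning_url: str

def evaluate(story: FlaggedStory) -> Verdict:
    # Stand-in for the human fact-checkers' judgment; a real
    # system would route the story to reviewers, not auto-grade.
    grade = "F" if story.user_reports > 1000 else "C"
    return Verdict(
        story_id=story.story_id,
        grade=grade,
        reasoning_url=f"https://truth.org/verdicts/{story.story_id}",
    )

# Round-trip the exchange as JSON, as an API would.
flagged = FlaggedStory("abc123", "https://example.com/story",
                       "Shocking headline here", user_reports=4821)
response = json.dumps(asdict(evaluate(flagged)))
print(response)
```

The point of the public `reasoning_url` is the transparency argument above: every verdict ships with a link anyone can read, so the grade is never just an opaque score.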