Media Literacy: Crowdsourced Fact-Checking

YS Chng
Dec 18, 2020


Photo by Stefan Lehner on Unsplash

Two years ago, I wrote about how the wisdom of the crowd can be useful in determining true values.

In the case of estimating the weight of an ox, the wisdom of the crowd works through a mechanism known as bracketing: estimates fall on both sides of the true value, so the deviations cancel one another out and the resulting average has a small absolute deviation from the true value.

Averaging removes random errors from over- and underestimations.
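
To see bracketing in action, here is a minimal Python sketch. The numbers are made up for illustration (they are not data from the ox-weighing experiment): guesses scatter symmetrically around an assumed true weight, so the error of the crowd’s average comes out much smaller than the typical individual’s error.

```python
import random

random.seed(42)

TRUE_WEIGHT = 543  # assumed ox weight in kg, for illustration only

# Each guess misses the truth by a random error, landing on both
# sides of the true value (the bracketing effect).
guesses = [TRUE_WEIGHT + random.gauss(0, 50) for _ in range(800)]

crowd_average = sum(guesses) / len(guesses)
mean_individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)

print(f"Mean individual error:      {mean_individual_error:.1f} kg")
print(f"Error of the crowd average: {abs(crowd_average - TRUE_WEIGHT):.1f} kg")
```

Running this, the individual guesses are typically off by tens of kilograms, while the crowd average lands within a kilogram or two of the truth.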

This appears to work for movie reviews as well, where the average rating from a general audience is often a better representation of a movie’s reception than the ratings of movie critics, who may have biases not shared by the average movie-goer.

The question is, will this same crowdsourcing mechanism work for fact-checking?

MIT Research

To answer that question, some good news recently came out of a working paper (October 2020) by researchers at MIT.

In their study, the researchers compared the fact-checking accuracy of two different groups:

  1. Three professional fact-checkers, who researched the full articles before making their verdicts
  2. 1,128 Americans recruited through Amazon Mechanical Turk (MTurk), who read only the articles’ headlines and lede sentences

Using a set of news articles that had been flagged by Facebook’s internal algorithm, the study found that the average rating of a politically-balanced crowd of 10 laypeople performs about as well as the individual judgments of the three professional fact-checkers.

https://twitter.com/DG_Rand/status/1314212784536604676
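
The study’s actual analysis is more involved, but the core aggregation idea can be sketched in a few lines of Python. Everything below is illustrative: the 1-to-7 accuracy scale, the field names, and the ratings themselves are my assumptions, not data from the paper.

```python
import random

random.seed(0)

# Hypothetical layperson ratings of one headline on an assumed
# 1-to-7 accuracy scale, tagged by self-reported political leaning.
ratings = (
    [{"leaning": "D", "score": random.randint(1, 7)} for _ in range(500)]
    + [{"leaning": "R", "score": random.randint(1, 7)} for _ in range(500)]
)

def balanced_crowd_rating(ratings, crowd_size=10):
    """Average the scores of a politically-balanced crowd:
    equal numbers of raters drawn from each side."""
    dem_scores = [r["score"] for r in ratings if r["leaning"] == "D"]
    rep_scores = [r["score"] for r in ratings if r["leaning"] == "R"]
    half = crowd_size // 2
    crowd = random.sample(dem_scores, half) + random.sample(rep_scores, half)
    return sum(crowd) / len(crowd)

print(f"Balanced crowd rating: {balanced_crowd_rating(ratings):.2f}")
```

The balance constraint matters: averaging over a crowd drawn equally from both sides prevents one political leaning from dominating the verdict.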

The researchers were careful to caveat that while crowds of laypeople seem to do well, individuals remain susceptible to misinformation. Their point is that if crowdsourced fact-checking is good enough, it can be far more sustainable than professional fact-checking, which is hard to scale.

What’s also key to the observed results is that the crowd of laypeople is “politically-balanced”. As seen in the case of the movie critics, professional fact-checkers may still be accused of bias, which does not inspire confidence in their credibility. As the researchers aptly put it,

“truth” is often not a simple black and white classification problem.

VeriFact SG

Based on the same philosophy, the Singapore non-profit “tech for good” collective better.sg has been developing a crowdsourced fact-checking platform called VeriFact SG. The platform lets users query rumours and potential misinformation, which are then addressed by other users who must substantiate their responses with evidence.

Responding to questions with substantiation.

To streamline the fact-checking process, questions will be structured in a format that allows a straightforward answer of “True”, “False” or “Neither”. But besides answering the question, responses will also need to include a short explanation and a citation as evidence, to ensure that the user has done proper fact-checking before submitting a response.
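
As a rough sketch of how such a response might be structured (the actual VeriFact SG schema is not described here, so the class and field names below are my own assumptions):

```python
from dataclasses import dataclass
from enum import Enum

class Verdict(Enum):
    TRUE = "True"
    FALSE = "False"
    NEITHER = "Neither"

@dataclass
class Response:
    verdict: Verdict   # the straightforward answer to the question
    explanation: str   # short explanation of the reasoning
    citation: str      # link or reference offered as evidence

    def is_complete(self) -> bool:
        # A response only counts as a proper fact-check if it comes
        # with both an explanation and a citation.
        return bool(self.explanation.strip()) and bool(self.citation.strip())

r = Response(Verdict.FALSE, "The quoted statistic misreads the report.",
             "https://example.org/report")
print(r.is_complete())  # True
```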

Acting as a second layer of fact-checking, the submitted response is also subjected to a poll by other users on whether the evidence is “Credible” or “Not Credible” in addressing the query. Visitors to the page can then see the range of responses to a query and their perceived level of credibility.

Voting on the credibility of responses to questions.
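
The credibility poll itself boils down to a simple tally. A minimal sketch, again with hypothetical names and made-up numbers:

```python
def credibility_score(credible_votes: int, not_credible_votes: int) -> float:
    """Fraction of voters who rated the evidence 'Credible'."""
    total = credible_votes + not_credible_votes
    return credible_votes / total if total else 0.0

# Example: 18 of 24 voters found a response's evidence credible.
print(f"{credibility_score(18, 6):.0%} rated this evidence credible")
```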

These primary features aim to nudge users toward providing better-quality fact-checks, which should in turn encourage greater media literacy in society. Beyond that, the better.sg team also plans to add secondary features targeting different types of audiences, and to boost the social-media shareability of VeriFact SG’s content.

Conclusion

As supported by the MIT research, the benefits of crowdsourced fact-checking are numerous:

  • Fact-checking that is crowdsourced is arguably the least biased.
  • Platforms that use crowdsourced fact-checking are technically not arbiters of truth.
  • Crowdsourced fact-checking can be scaled by harnessing the wisdom of the online crowd.
  • Users of the platform will be able to see a range of responses instead of just a single “truth”.
  • Using the crowdsourced fact-checking platform promotes greater media literacy in society.

If we’re talking about the long-term fight against misinformation, there is a limit to what professional fact-checkers can do, even with the help of machine-learning tools. In an age where the form of misinformation keeps evolving, the surest way to keep people safe is to arm them with critical-thinking skills, and crowdsourced fact-checking is a step in that direction.

If you would like to find out more about better.sg and the projects that they are working on to improve media literacy, feel free to check out their project Notion page, or take a look at this other article of theirs: https://better.sg/blog/2020/09/08/make-it-better-misinformation-in-the-media.

Disclaimer: The author of this article is a member of better.sg and its Media Literacy Group.


YS Chng

A curious learner sharing knowledge on science, social science and data science. (learncuriously.wordpress.com)