What we’re reading: Solutions Edition
A look at research, proposals & tools to combat misinformation
Following some bad press this week, Elon Musk blew up parts of the Twitterverse by declaring he may start a crowd-sourced service to determine news bias and credibility. (Not a new idea, as Alexios Mantzarlis, director of the International Fact-Checking Network, quickly pointed out, and one that, done sloppily, can do real harm.)
When a Silicon Valley celebrity like Elon Musk brings up the issue of news trustworthiness, it’s a good time to review progress by a sample of Knight-supported projects and research already underway. These projects are informing the work of the Knight Commission on Trust, Media and Democracy, as commissioners develop recommendations for release in early 2019.
Bringing credibility to credibility
An alternative to throwing credibility to the crowd: convene a community to propose standards and then test them rigorously. One effort working systematically toward such standards is the Credibility Coalition, a community of researchers seeking to create shareable, open-source definitions of credibility that could be widely adopted. The coalition is also dedicated to reviewable research on how this process will work in practice. In April 2018, the coalition published a paper providing a proof of concept: a dataset of 40 articles annotated with credibility indicators, along with proposals for next steps in research.
Bot or not
Researchers at the Indiana University Observatory on Social Media released upgrades to two tools, Hoaxy and Botometer, that help detect misinformation online. Hoaxy now has new functions that show users not just which stories are trending on Twitter, but also which Twitter accounts spreading those stories might be bots.
When to watch and wait, when to report
Much as doctors warn that treating some early-detected cancers may do more harm than watchful waiting, Whitney Phillips argues in a deeply researched paper for Data & Society, “The Oxygen of Amplification: Better Practices for Reporting on Extremists, Antagonists, and Manipulators Online,” that reporters should use caution when choosing to report on extremist movements. She provides a concrete list of criteria for journalists to consider when deciding whether and how to report on extremist activity online, noting that not reporting on harmful information also carries risks. She also speaks to the pressures today’s reporters face, lacking institutional support and under duress to churn out product.
First developed by PBS for internal use, NewsTracker is a tool that identifies Facebook pages that traffic in misinformation and tracks how often their content is liked, shared, and commented on. Reporters use this tool to find patterns and trends that may merit reporting. The tool will have a new home soon: the Shorenstein Center at Harvard’s Kennedy School of Government, where it can gain wider testing and use.