Expert’s Corner with NewsGuard’s Veena McCoole & Sarah Brandt

Two experts from NewsGuard explain their organization’s hands-on approach to dealing with misinformation

Checkstep
Checkpoint

--

For this month’s Expert’s Corner, we hear from two senior members of NewsGuard, the venerable internet trust tool. Veena McCoole and Sarah Brandt both manage and facilitate partnerships at NewsGuard, an organization that provides online safety for brands, individual readers, and even democracies, eschewing algorithms in favor of the human expertise of trained journalists. NewsGuard publishes reports on misinformation generally, as well as detailed reviews of individual news sites, evaluating them against criteria that emphasize the transparency of the organization and the credibility of the information it publishes.

Headshot photos of Veena McCoole and Sarah Brandt
Veena McCoole, left, is the Vice President of Strategic Partnerships in the UK and Sarah Brandt is the Executive Vice President, Partnerships at NewsGuard, an internet trust tool known for its fact checking and news site ratings.

Veena is the Vice President of Strategic Partnerships managing UK partnerships with technology platforms, advertisers, researchers, schools and libraries. Sarah is the Executive Vice President, Partnerships for NewsGuard, based in Austin, Texas. She manages NewsGuard’s global partnerships with schools, libraries, researchers, technology platforms, ad tech providers, and security companies. Prior to joining NewsGuard, Sarah worked as an Associate Consultant for Bain & Company.

1. What was the motivation behind NewsGuard? Researchers often rely on fact checks to counter misinformation, but are they truly effective?

NewsGuard was founded in 2018 to combat misinformation and restore trust in the media, with the belief that journalism — rather than algorithms — can effectively ensure online safety for users, brands, and democracies. NewsGuard deploys trained journalists to manually evaluate the credibility and transparency practices of news and information websites. Our licensing partners — including advertising companies, technology platforms, content moderators, reputation management software companies, and news aggregators — can then leverage this human-vetted data to make informed decisions about which news websites to trust.

Fact checks address misinformation by setting the record straight after a falsehood is spread, but this approach has its limitations. By definition, fact checks happen after the fact — after erroneous information has circulated and caused harm. Moreover, fact checks infrequently reach the same people who saw the original misstatement. Recognizing the shortcomings of debunking, NewsGuard designed its approach to be more proactive — pre-bunking a falsehood by providing people with instant information about the lack of reliability of the source that published it. Indeed, scientific research supports the effectiveness of NewsGuard’s intervention as a pre-bunking solution.

For example, in 2017, Indiana University researchers Alan Dennis and Antino Kim ran an investigation comparing different reputation rating formats to assess their ability to influence users’ belief in news articles. They found that presenting source reputation ratings directly influences the extent to which users believe articles on social media. In a 2019 article in The Conversation, Dennis and Kim said: “What we learned indicates that expert ratings provided by companies like NewsGuard are likely more effective at reducing the spread of propaganda and disinformation than having users rate the reliability and accuracy of news sources themselves.”

Separately, in their article “Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence,” published in May 2017 in PLOS ONE, Stephan Lewandowsky, a psychologist at the University of Bristol, John Cook, a researcher at the Center for Climate Change Communication at George Mason University, and Ullrich Ecker, a cognitive psychologist at the University of Western Australia, explain how warnings about misinformation are more effective when they are administered before misinformation is encountered rather than after. This is because pre-existing beliefs affect how people respond to novel information. NewsGuard’s intervention of providing source-level evaluations when someone first encounters content can be described as “pre-bunking,” consistent with the findings of Lewandowsky et al.’s study.

2. From the COVID infodemic to Russian propaganda, do you find any specific patterns in the propagation of misinformation?

Our editorial team of trained journalists and misinformation specialists keeps abreast of the global news and information landscape on a daily basis, and it has noticed that many misinformation publishers are repeat offenders. In other words, those perpetuating hoaxes about COVID-19 vaccines, for example, are also likely to publish misinformation of other kinds, including false content about the Russian war in Ukraine. Similarly, websites that purported to publish “health advice” promoting misleading, scientifically unproven remedies were among the leading publishers of false COVID-19 information at the outbreak of the pandemic.

By evaluating these websites at the source level and tracking their editorial practices over time, NewsGuard offers a valuable tool for users, companies, and democracies alike: our data proactively identifies repeat offenders and purveyors of misinformation ahead of the next wave of false claims. Users and content moderators no longer have to wait for fact-checkers to retroactively debunk false claims; they can leverage “pre-bunking” data that protects them from forthcoming threats before those threats even arise.

3. No one has been spared from the information wars, which came as an added “bonus” of the Russian invasion of Ukraine. How should people navigate such information, and how can they distinguish actual facts from false news?

Navigating the landscape of misinformation, particularly around Russia’s war in Ukraine, is made increasingly complex by the rapid cadence of new false claims appearing each day. NewsGuard’s editorial team has been documenting the daily headlines about the war on Russia’s Channel One and debunking false claims spread about the conflict, drawing on information from authoritative sources.

By using tools like NewsGuard’s browser extension, internet users can see source context information on search results and on social media to aid them as they consider which sources might be trustworthy and which ones fail to adhere to basic journalistic standards. By consuming news from authoritative sources with high NewsGuard scores, users can help ensure that their news diet contains transparent, ethical journalism — rather than harmful misinformation and false claims.

4. Currently, there are no particular regulations addressing misinformation. Do you think attaching fines to the dissemination of misinformation might have an impact on limiting its spread?

At NewsGuard, we are proponents of user empowerment and believe that the best solution to misinformation is putting tools, context, and information in the hands of users, enabling them to make independent, informed decisions. When platforms block or censor content, they stifle freedom of expression; equally, however, they must be held accountable and implement effective trust and safety measures that empower and protect the users who interact with and contribute to their platforms.

If you would like more information or are exploring options for AI-enhanced moderation for your platform, contact us at contact@checkstep.com. Alternatively, you can visit our website, www.checkstep.com.

--


AI to boost good content 🚀 and moderate bad content ⚠️ — Manage a UGC platform? Say contact@checkstep.com