Checking the Vote: How Check Was Used During Electionland
written by An Xiao Mina, Director of Product
When First Draft News (of which Meedan is a founding member) and ProPublica started planning Electionland, they turned to Check, Meedan’s platform for collaborative verification of digital media. Electionland was a collaborative project during the US elections to monitor and report on voting access across the country on Election Day and in the days leading up to it.
We knew it was going to be tough. We were expecting hundreds of professional journalists, journalism professors and students to be using Check on Election Day, while social media buzzed with activity. The challenge of meeting Electionland’s needs would be great, but we also recognized the opportunity to learn.
At Meedan, we follow a user-driven methodology with the software we build. We believe strongly that users offer the best insight into how we should think about the products we’re building. What better way to follow Meedan’s user-driven methodology than with some of the world’s leading experts on journalism, verification and the elections?
How it worked — and what we learned
A few days after Electionland, we pulled numbers on how Check was used on November 8. Stats are only part of the story, but they’re a useful place to start. Let’s look at some numbers, then use them to tell a story about how Check operates. Here’s what we learned:
Over 1200 reports were added to Check.
On Check, we have the concept of a report, which could be a link to social media, WhatsApp posts, Twitter DMs, or notes from Landslide, a tool built by WNYC to aggregate information received by text message, online forms and other formats.
Feeders — people searching social media or taking tips from other channels — add these items to Check after a certain threshold of research has been done. They try to verify the name, location, and other elements of veracity before adding something to Check.
Of these reports, about 40% were authenticated.
On Check, each report has a verification status attached to it. A report can be marked as “False”, “Authenticated” or “Inconclusive”, but only after it meets a certain bar ensuring due diligence has been done. Until then, its status is simply “In Progress” while feeders work together to authenticate items.
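To make the workflow concrete, here is a minimal sketch of how a report and its status lifecycle might be modeled. The class names and the one-way transition rule are illustrative assumptions, not Check’s actual implementation:

```python
from enum import Enum

class Status(Enum):
    IN_PROGRESS = "In Progress"
    AUTHENTICATED = "Authenticated"
    FALSE = "False"
    INCONCLUSIVE = "Inconclusive"

# Final statuses: once a report lands here, it stays there
# (an illustrative rule, not necessarily Check's behavior).
FINAL_STATUSES = {Status.AUTHENTICATED, Status.FALSE, Status.INCONCLUSIVE}

class Report:
    def __init__(self, source_url):
        self.source_url = source_url
        self.status = Status.IN_PROGRESS
        self.notes = []  # the running log of verification work

    def add_note(self, author, text):
        self.notes.append((author, text))

    def set_status(self, new_status):
        # Reports move from "In Progress" to exactly one final status.
        if self.status in FINAL_STATUSES:
            raise ValueError("report status is already finalized")
        self.status = new_status
```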
As you might imagine, this requires time and effort. But how much?
Authentication took, on average, just under an hour.
That means that, between adding an item to Check and changing its status to “Authenticated”, feeders needed about an hour. That hour might have been spent searching social media for corroboration, calling the person in question for more details, analyzing the content of a video, and so on. That’s the average, but it varied substantially with the complexity of the report, from a few minutes to a couple of hours.
This is a key metric for Check: time to completion. In other words, how quickly does Check facilitate the investigative process? This metric matters in an era of breaking news on social media, where minutes count, but so does veracity.
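As a rough illustration, a metric like this can be computed from timestamps. The record shape below ("added_at", "authenticated_at") is an assumption for the sake of the example, not Check’s actual data model:

```python
from datetime import datetime, timedelta

def average_time_to_completion(reports):
    """Mean elapsed time between a report being added and being marked
    "Authenticated", skipping reports that were never authenticated."""
    deltas = [
        datetime.fromisoformat(r["authenticated_at"])
        - datetime.fromisoformat(r["added_at"])
        for r in reports
        if r.get("authenticated_at")
    ]
    if not deltas:
        return None
    return sum(deltas, timedelta()) / len(deltas)
```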
One way we facilitate speedy work on Check is with Slack integration. For nearly every action taken on Check, whether adding a report, adding a note or changing a verification status, our friendly Check Bot automatically pings Slack. Since that’s where the Electionland team held its live conversations, the Check Bot helped them keep up with what was happening on Check in near real time.
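For readers curious about the mechanics, a bot like this can post to Slack through its standard incoming-webhook API, which accepts a small JSON payload. The message format here is an illustrative assumption, not the actual Check Bot code:

```python
import json
import urllib.request

def build_check_bot_message(user, action, report_url):
    """Format a notification payload in the shape Slack's incoming
    webhooks accept. The wording is an assumed example, not the
    real Check Bot's output."""
    return {"text": f"{user} {action}: {report_url}"}

def ping_slack(webhook_url, payload):
    """POST the payload to a Slack incoming webhook URL."""
    request = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(request)
```

In practice, any chat tool that accepts webhooks could receive the same kind of payload.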
We’ll be watching this number closely as Check is used more broadly by news organizations.
Authentication required, on average, 3–4 notes.
Check is a collaborative platform, and it’s also an iterative one. We design our tool so that multiple people can contribute notes, and iteratively gather supporting facts. This makes Check a useful “tip sheet” for journalists: it contains a verification note, the original report, and then corroborating notes and evidence gathered by the feeder.
One motto we have for Check is “show your work.” We think of notes as a log of activities and actions, a useful way to trace the work done by the individuals doing the research. The decision to hand a Check report to a journalist is often made by a catcher — a professional journalist who reviews the notes added by the feeder.
Some 300 stories have appeared in local or national news outlets, or on Electionland’s own site.
That number is still being tallied, but that’s the latest from ProPublica’s conversation with Nieman Journalism Lab. This points to another key metric for Check: stories generated. In many ways, this is the most important metric. Check might be fast, but it needs to be fast at helping reporters tell better and more accurate stories. And if they can do that in a collaborative way, that’s even better.
Right now, this is only a correlation between activity on Check and the number of stories related to Electionland. We’re diving into this more deeply over the next few weeks, after which we can more confidently characterize the relationship between reports added to Check and the stories that later emerged from the conversation.
What’s next for us
We hit a number of bumps in the road, both in usability and technically, and we took a ton of notes. Our goal, as always, is to learn from users and regularly improve our products.
On Election Day, we had five people operating in shifts to help with technical support, and we saw this as yet another opportunity to learn directly from our users. Each support staffer took notes on the issues they were seeing, the solutions they found in the moment, and possible product solutions for the future.
Check is still in beta, but it’s being used right now by a number of partners, including Amnesty International, the University of Hong Kong, Bellingcat and, soon, the First Draft Partner Network. If you’re interested in becoming an early beta partner, please reach out.
To find out more about Check, sign up here or send us an email at email@example.com.