Crowdsourcing open-source investigations with Checkdesk and Silk
Bellingcat’s Eliot Higgins explains how new platforms can help with the verification and presentation of open-source information


Find more reads and resources on newsgathering, verification and eyewitness media at FirstDraftNews.com
While much work with open source and social media investigation is focused on the verification of claims, new platforms have become available to aid both in the verification and presentation of information collected.
By combining these platforms it’s possible to crowdsource the verification of sources, gather verified information and present it in a way that keeps it both open and secure, ensuring transparency throughout the verification process.
Checkdesk is a platform developed by Meedan designed around collaborative verification. Checkdesk has a blog-style interface, which allows users to submit “reports”, generally links to content that needs verification. It is then possible to turn on verification for each report, allowing users to comment, and reports can then be added to stories as updates.
The below example is from the Bellingcat Checkdesk story Identifying MH17 parts: nose & flight deck, where collaborators examined pieces of wreckage from Flight MH17 in an attempt to establish which part of the aircraft each belonged to:


Once each report has been discussed the status of the report can be set to a number of options, including verified, undetermined, and false. If new information comes to light it is still possible to continue the discussion, but it’s up to the moderators to decide when a discussion has reached a conclusion. Reports can also be embedded into most websites, allowing them to be easily shared outside Checkdesk.
In some instances there is value in collecting the verified information from Checkdesk, and presenting it in different forms. One platform we use at Bellingcat is Silk, which allows for the creation of online databases, and the visualisation of data from those databases. What is particularly useful about Silk is how easy it is to create visualisations from the data.
For example, Bellingcat created the Ukraine Conflict Vehicle Tracking Project, which took verified images of military vehicles in Ukraine and Russia from Checkdesk to populate a Silk database.
Each record included an image of the vehicle, the claimed location, the geolocated co-ordinates, the date of the report, details of any markings visible, and others. It was then possible to embed information from the Silk database into websites, for example this simple list of sightings from Luhansk, Ukraine.
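To make the structure of those records concrete, here is a minimal sketch in Python of how a sighting datacard might be modelled and filtered by location. The field names and sample values are illustrative assumptions, not Silk’s actual schema or API:

```python
# Illustrative model of a vehicle-sighting "datacard" and a location filter.
# Field names and sample data are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Sighting:
    equipment: str          # e.g. "Strela-10"
    claimed_location: str   # location claimed in the source video or photo
    coordinates: tuple      # geolocated (latitude, longitude)
    date: str               # date of the report, YYYY-MM-DD
    markings: str           # any visible markings, e.g. unit numbers or plates

sightings = [
    Sighting("Strela-10", "Luhansk, Ukraine", (48.574, 39.307), "2015-02-10", "white stripe"),
    Sighting("Msta-S", "Donetsk, Ukraine", (48.016, 37.803), "2015-02-14", "no markings"),
]

# Reproduce the "sightings from Luhansk" list: filter on the claimed location.
luhansk = [s for s in sightings if "Luhansk" in s.claimed_location]
print([s.equipment for s in luhansk])  # → ['Strela-10']
```

Each embedded list on a site is essentially a saved filter like this one, drawing on the live records rather than a static copy.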
[Embedded Silk table: Sighting datacards with Ukraine, Luhanska Oblast, Luhansk as Reference Location, as Month and as Equipment… — bellingcat-vehicles.silk.co]
Data can also be displayed and shared in other ways. The following chart shows the different types of vehicles sighted in February 2015.
[Embedded Silk chart: Number of Sighting datacards with 2015 as Month grouped by Equipment Category — bellingcat-vehicles.silk.co]
All information in the embeds is pulled from the live data in the database, and it’s possible to change which data is displayed and access the database through the embedded chart.
In the above example, clicking on the Strela-10 area of the chart will take you to the Silk page, Sighting datacards with 2015/02 February as Month and Strela-10 as Equipment Category, which contains all the datacards the chart is drawing information from. Users can also create their own visualisations from the database and embed them in their own websites, without altering any of the underlying data.
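The grouping behind a chart like this is straightforward: count the records for a given month by equipment category. The sketch below uses invented sample data and field names to show the idea; it is not how Silk is implemented internally:

```python
# Count sightings per equipment category for one month, as a chart like
# "vehicles sighted in February 2015" would. Sample data is invented.
from collections import Counter

sightings = [
    {"month": "2015/02", "equipment": "Strela-10"},
    {"month": "2015/02", "equipment": "Strela-10"},
    {"month": "2015/02", "equipment": "Msta-S"},
    {"month": "2015/01", "equipment": "Msta-S"},
]

february = Counter(s["equipment"] for s in sightings if s["month"] == "2015/02")
print(february)  # Strela-10: 2, Msta-S: 1
```

Clicking a segment of the real chart is then just the reverse operation: it returns the underlying datacards that match that month and equipment category.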
In the case of the Ukraine Conflict Vehicle Tracking Project it was possible to make some interesting discoveries over time, even if the individual videos were often not that revealing. Once the database had been populated with a significant number of entries we reviewed videos from Ukraine and Russia that showed the same types of vehicles. In some examples it was possible to find identical vehicles, with the same unique features, in videos from Russia and Ukraine, for example an Msta-S self-propelled 152mm howitzer filmed in both countries.
In one case it revealed something that would have almost certainly been missed otherwise. As part of Bellingcat’s research into the downing of Flight MH17 we had identified the Buk missile launcher filmed and photographed in separatist-controlled territory, which also appeared in footage of a convoy travelling through Russia between June 23rd and June 25th. As part of our research we had put each sighting into the Ukraine Conflict Vehicle Tracking Project database, including the markings on vehicles, and all number plates that were visible.
On June 2nd 2015 Russian Buk missile manufacturer Almaz-Antey made a number of claims about Flight MH17, including the claim that Russia didn’t use 9M38M1 missiles, the type identified as being used to shoot down MH17.
The Bellingcat team quickly found a pair of Reuters photographs taken in the summer of 2014 showing trucks transporting missile crates with 9M38M1 written on them in Russia, close to the border with Ukraine. It was also possible to read the number plates in the Reuters photographs, which were then searched for in the Ukraine Conflict Vehicle Tracking Project database.
Results showed the same transport vehicles were part of the convoy that transported the Buk linked to the downing of Flight MH17 through Russia between June 23rd and June 25th, showing not only that Russia had the missiles, but also a link between those missiles and the MH17 Buk.
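That cross-referencing step amounts to looking up plates read from new photographs against plates already recorded in the database. A minimal sketch, with placeholder plate values and field names (not the actual plates or database schema):

```python
# Cross-reference number plates from new photographs against recorded
# sightings. Plate values and contexts here are placeholders.
database = [
    {"plate": "AA 0000 00", "context": "23-25 June Buk convoy, Russia"},
    {"plate": "BB 1111 11", "context": "unrelated convoy"},
]

photo_plates = ["AA 0000 00"]  # plates read from the new photographs

matches = [rec for rec in database if rec["plate"] in photo_plates]
for rec in matches:
    print(rec["context"])  # → 23-25 June Buk convoy, Russia
```

The value of the database is exactly this: a plate noted months earlier, in an apparently unremarkable video, becomes searchable evidence when it reappears.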
In a more recent example Bellingcat used Checkdesk to examine over 60 YouTube videos published by the Russian Ministry of Defence showing its airstrikes in Syria, verifying the location of each airstrike and whether or not ISIS was known to be present in the area bombed.
It was possible to show that 60 per cent of the videos either showed a different location than claimed, did not target the group claimed, or both, and to collect all this data in a Silk database.
[Embedded Silk chart: Number of Airstrikes datacards grouped by Status — russia-strikes-syria.silk.co]
By combining Checkdesk and Silk it’s possible to have a transparent process that allows data to be verified and shared with anyone who cares to look, and there are many possible applications for this combination of tools in future open source investigations.
Eliot Higgins is an expert in open source investigative techniques and founder of Bellingcat
Follow First Draft here and on Twitter for more case studies and advice on open-source investigation, plus how-to guides on verifying social content