Oberbaum Bridge, Berlin. Photo by Jon Brock

Can blockchain put the trust back into science?

Frankl reports from the first international Blockchain for Science Conference in Berlin


It was around this time last year that I first encountered the concept of trustlessness, the idea that blockchain allows you to interact directly with people you don’t necessarily trust.

I thought immediately of Nullius in verba, the motto of the Royal Society and a guiding principle of science. It means literally “on the word of nobody”. We trust science because we don’t have to trust individual scientists. Instead, we trust their data. We trust the evidence. And we trust the scientific community as a whole to sift the good ideas from the bad.

Nullius in verba. Photo by Duncan Hull on Flickr

That, at least, is the theory. Increasingly, however, the evidence suggests that many published research findings cannot be trusted. When independent researchers attempt to replicate the findings of other scientists, they often get quite different results. There’s a lack of openness and transparency in how science is conducted and reported. Too much is left to trust.

It transpired, of course, that I wasn’t the first to make this connection. Around the world, many other scientists, technologists, and (the under-appreciated heroes of science) librarians have been thinking about and exploring the blockchain-science nexus. But it’s really only in this last year that critical mass has been reached.

And so earlier this month in Berlin, Sönke Bartling — radiologist, scientist, and “serendipitist” — welcomed over 120 delegates to the first international Blockchain for Science Conference.

Trust in the algorithm

Computer scientist Ali Sunyaev made the case for blockchain, explaining how it can remove the need for trust between individuals.

“The core idea is that algorithms rule. That’s why we’re speaking of zero trust. You don’t need any persons or organizations. You trust in the algorithm. You trust in maths and cryptography.”

He acknowledged, however, that blockchain is not a silver bullet for open science.

There are inevitable trade-offs between performance and security and between anonymity and flexibility. Before pursuing blockchain as a solution, it’s important to consider whether its properties actually address the problem you’re attempting to solve.

Original source material

One of the key features of blockchain is that it makes all “transactions” immutable — things written to the blockchain can’t be deleted.

OriginStamp, presented by Bela Gipp, takes advantage of this property, providing an immutable timestamp of any digital object. It does this by writing the hash or “digital fingerprint” of the object to a timestamped block of data. You can then prove that the object existed at that time and that it hasn’t changed since.
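The fingerprinting step can be sketched in a few lines of Python. This is a minimal illustration, assuming SHA-256 (a hash commonly used for this purpose), not the actual OriginStamp implementation — only the hex digest, never the document itself, would be written to the chain:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 'digital fingerprint' of a digital object."""
    return hashlib.sha256(data).hexdigest()

# A manuscript draft at the moment of timestamping.
draft = b"Results: the effect replicated in 7 of 10 samples."
stamp = fingerprint(draft)  # this digest is what gets anchored on-chain

# Later, anyone holding the original bytes can verify nothing changed:
assert fingerprint(draft) == stamp

# Even a one-character edit produces a completely different fingerprint:
assert fingerprint(b"Results: the effect replicated in 8 of 10 samples.") != stamp
```

Because the hash is one-way, the timestamp proves existence and integrity without revealing the document’s contents.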

The real power of blockchain comes from combining hashing with public key cryptography. Each blockchain address has a private key and a corresponding public key. When you write something to the blockchain, you can “sign” it using your private key and anyone can verify it using your public key.

The conference threw up many applications of this technology. Lambert Heller noted it is already being used by universities to verify academic qualifications.

Joris van Rossum and Alex Tran Qui described the Blockchain for Peer Review initiative which uses blockchain records to make scientific peer review more transparent.

Similarly, ARTiFACTS, presented by Dave Kochalko, provides a blockchain platform for researchers to gain credit for the contributions they make to science beyond publications in scientific journals. The idea is that immutable timestamping encourages researchers to share their “artifacts” sooner, safe in the knowledge that others cannot claim priority.

Creating a data supply chain

At Frankl, we’re taking a similar approach to scientific data. Our data-collection applications have inbuilt data management software that makes it easy for researchers to share their data with collaborators and the wider community.

The applications also write the hash of the data on the Ethereum blockchain, allowing researchers to prove the integrity and provenance of their data. We think of it as the beginning of a “supply chain” — from raw data to published paper.

Other groups are thinking along similar lines. Dimitri De Jonghe described Ocean Protocol, which aims to create a supply chain for scientific data.

A crucial feature is that analysis protocols — controlled by smart contracts — can go to the data rather than the data being released. For large datasets this is more efficient than uploading and then downloading the data. And it means in principle that data “owners” maintain control of their data and can be rewarded for sharing it.
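The “compute goes to the data” pattern can be illustrated with a toy sketch. Everything here is hypothetical — the function names and whitelist are illustrative, not Ocean Protocol’s actual API — but it captures the core idea: the data holder runs only approved analyses locally and releases only the aggregate result:

```python
from statistics import mean

# Toy "compute-to-data" sketch: raw records never leave the data holder;
# only whitelisted aggregate analyses may run against them.
APPROVED_ANALYSES = {
    "mean_age": lambda records: mean(r["age"] for r in records),
    "count": lambda records: len(records),
}

def run_on_data(analysis_name: str, records: list):
    """Run an approved analysis locally and release only the aggregate."""
    if analysis_name not in APPROVED_ANALYSES:
        raise PermissionError(f"analysis {analysis_name!r} is not whitelisted")
    return APPROVED_ANALYSES[analysis_name](records)

# The sensitive records stay on the holder's own infrastructure.
private_records = [{"age": 34}, {"age": 29}, {"age": 42}]

shared_result = run_on_data("mean_age", private_records)  # only this value leaves
```

In a real deployment, the whitelist and the record of each execution would be governed by smart contracts, giving the provenance trail De Jonghe describes below.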

In De Jonghe’s words:

We create time-stamping processes in the middle that say that actually this person delivered data; this entity delivered algorithm; this compute did its job, it combined the data and the algorithm, stored the result somewhere else. All these actions get recorded as a provenance tree to a blockchain.

Later in the conference, Denis Grishin introduced Nebula Genomics which works on similar principles but focuses on genetic data.

A decentralized data marketplace?

Bioinformatician Konrad Förstner spoke of his excitement at this concept of blockchain-based decentralized data marketplaces. The noble ideals of open science and open data are, he noted, often in tension with the privacy of research participants. Giving them control of their data and rewarding their participation could be a game-changer for biomedical research.

But he also sounded several important notes of caution. The projects are all at an early stage of development and we need to be sure that the systems really are secure and working as intended before they’re applied to real data from real humans. There’s also an ethical question of incentivising people to share data without first educating them about the possible implications.

A technological bluff?

Förstner wasn’t alone in highlighting the challenges ahead for the blockchainification of science. Alexandra Giannopoulou spoke of the difficulties reconciling blockchain with European GDPR legislation which was designed for a world of centrally controlled data.

Nina Siedler outlined the problematic legal status of blockchain-enabled DAOs (Decentralized Autonomous Organizations)…

…while Jason Barkeloo, speaking via video conference, told of the onerous process his project Knowbella has undertaken to become a credentialed security token in the United States.

Stroke researcher Ulrich Dirnagl expressed more fundamental concerns. He argued that blockchain is nothing more than a “technological bluff” that risks detracting from other efforts to improve the culture of science.

His concerns were echoed by palaeontologist Jon Tennant who also emphasised the need for cultural change.

Catalysing cultural change

A rebuttal to this criticism is that technological changes often do lead to cultural change — the smartphones from which we were all tweeting being just the latest example.

Aleksandra Sokolowska of Validity Labs argued that in fact the properties of blockchain are precisely aligned with the idealised social norms of science.

And many of the projects presented at Blockchain for Science were aimed squarely at shifting scientific culture.

A key theme of the meeting, articulated by Akasha’s Martin Etzrodt, was the need to decentralize science.

DaMaHub’s Denis Parfenov talked about decentralized data storage. He argued that in today’s centralized systems, each data repository represents a single point of failure and control.

Roman Gonitel described Network of Knowledge, which seeks to short-circuit the hierarchical information flows in science and instead allow peer-to-peer interaction between scientists.

Alex Shkor described the DEIP platform which aims to achieve decentralization through tokenising the different parts of science.

Philip Sandner described the potential advantages and challenges of tokenised research funding.

Paul Kohlhaas, founder of Molecule, introduced the concept of token bonding curves and how they could be applied to incentivise and accelerate the development of new pharmaceuticals.

Lawrence Rajendran spoke at length about academic publishing’s culture of only telling complete “stories”. He described the Science Matters platform, which allows researchers simply to report interesting observations, and introduced the Eureka Token, which aims to incentivise replications of those observations.

Massimiliano Picone provided a glimpse of the future with roboticised replication experiments controlled by smart contracts.

Finally, Yalda Mousavinia presented Space Decentral, a decentralized agency for citizen-powered space exploration.

A Cambrian explosion

Blockchain for Science was notable for the creativity and diversity of ideas.

As computational chemist Karmen Condic-Jurkic argued, the challenge for all these projects will be to ensure that the “cool tools” are built in consultation with the scientific community, so that they are actually used.

Mass adoption, added mathematician Ismail Khoffi, means that user experience has to be prioritised.

Not all of the ideas will prove viable. There may in some cases be better, non-blockchain solutions to the problems at hand. But as Lambert Heller argued, the way forward is to learn by doing — applying open science standards to blockchain development, experimenting openly and in public.

Blockchain may be a buzzword right now. But in truth, Jim Nasr argued, it’s just another bit of tech. Long-term success, he added, is when blockchain becomes invisible and ubiquitous. When researchers are using blockchain without even thinking about it.


At Frankl, our mission is to make open science easy and rewarding for scientists. If you’d like to know more, you can read our whitepaper, check out our website, follow us on Medium, Facebook and Twitter, or join our Telegram channel.