Crypto rebuts the conventional wisdom about technological progress

Daniil Gorbatenko · Door to Crypto · Jun 15, 2019

One of the few beliefs that scientists and non-scientists alike probably share is the idea that technological progress takes place in a simple sequence. First, government and sometimes also non-profit institutions set the priority avenues for scientific research and allocate funding to them. Scientists then use that funding to make fundamental breakthroughs, which engineers turn into prototypes. Finally, entrepreneurs bring the funding and acumen needed to commercialize the resulting technology and make its fruits available to consumers.

The rise of the Internet inspired the myth

The rise of the Internet is supposed to be the poster child for this narrative. As the story goes, the U.S. government first funded the creation of ARPANET; then other governments funded the work of Tim Berners-Lee, who created HTTP, HTML and URLs. A lot of tinkering followed before entrepreneurs like Zuckerberg and Bezos could finally unleash the Internet's power in full.

Yet even this story is at best incomplete and quite possibly very misleading. First, the idea that led to ARPANET was formulated by J. C. R. Licklider, who until 1962 had been not a government employee but a researcher, first at MIT and then at Bolt, Beranek and Newman. It was around the time he joined ARPA, in memos written in 1962-1963, that Licklider first proposed his vision of the Intergalactic Computer Network.

That vision influenced his successors at ARPA, who later created ARPANET; but ARPANET was not a top-down project, merely an experiment by ARPA scientists. And the rationale behind it did not remotely resemble the modern Internet: the goal was to enable better time-sharing of computing resources among various research institutions. ARPANET did lead to the development of packet-switching technology, but that was more a byproduct of the trial-and-error process of making ARPANET work. Similarly, Tim Berners-Lee laid the foundations of the World Wide Web as a contractor at CERN while trying to facilitate the sharing of information among researchers. Private businesses like AOL then adopted those protocols for interoperability simply because they were already available.

Hence, it was not government funding of fundamental research that led to the creation of the Internet protocols, but rather a complex set of interactions among computing experts trying to solve other, practical problems, even if most of them were government employees or contractors.

How blockchain tech further upends the conventional wisdom

The emergence and rapid evolution of blockchain technology, however, provide an unambiguous demonstration that the predominant narrative about technological progress is deeply flawed. In a video recently published by Coindesk, computer scientists involved in blockchain-related research marveled at how rapid the interplay between basic science and blockchain practice has been in the crypto space.

Interestingly enough, the very vibrant, cutting-edge domain of research and development they work in would not have existed had it not been for a paper published pseudonymously on an online mailing list. That paper would probably never have been noticed by the computer science and cryptography community had enthusiasts not used it to develop the first blockchain platform, Bitcoin, and its native cryptocurrency.

Before the individual or group writing under the pseudonym Satoshi Nakamoto produced the groundbreaking Bitcoin paper in late 2008, a lot of scientific research on distributed systems had, of course, been done in computer science. One key direction in that research aimed at developing consensus algorithms for such systems, and its inspiration came from the famous 1982 paper on the Byzantine Generals Problem. In 1999, Castro and Liskov proposed the highly influential Practical Byzantine Fault Tolerance (PBFT) consensus algorithm.

However, until Satoshi's breakthrough, no approach to consensus had been developed that was suitable in practice for open, permissionless distributed systems like blockchain networks. This is not to say that Satoshi's solution, proof of work combined with the longest-chain (highest accumulated difficulty) rule, is the last word on the subject. Rather, what it did was inspire scientists and technologists to try harder. Since 2008, they have delivered, and continue to deliver, a flood of newly proposed solutions, from the revamped and turbocharged pre-Nakamoto approaches pursued by projects like Hashgraph, Algorand and DFinity to radically new ones like Casper, Avalanche or Pyrofex's Casanova.
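To make Satoshi's two ingredients concrete, here is a minimal Python sketch, not Bitcoin's actual implementation: a hash-based proof of work and the fork-choice rule that prefers the chain with the highest accumulated difficulty. The `difficulty_bits` parameter and the dictionary-based blocks are illustrative simplifications.

```python
import hashlib

def meets_target(header: bytes, nonce: int, difficulty_bits: int) -> bool:
    # A proof of work is valid if the double-SHA256 of the header plus
    # nonce falls below a target, i.e. has `difficulty_bits` leading zero bits.
    digest = hashlib.sha256(
        hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
    ).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

def mine(header: bytes, difficulty_bits: int) -> int:
    # The "work": brute-force nonces until one satisfies the target.
    nonce = 0
    while not meets_target(header, nonce, difficulty_bits):
        nonce += 1
    return nonce

def best_chain(chains: list) -> list:
    # Nakamoto's fork-choice rule: follow the chain with the highest
    # accumulated difficulty, not merely the one with the most blocks.
    return max(chains, key=lambda chain: sum(b["difficulty"] for b in chain))

# A shorter chain wins if more total work went into producing it.
chain_a = [{"difficulty": 10}] * 3  # 3 easy blocks, total work 30
chain_b = [{"difficulty": 20}] * 2  # 2 hard blocks, total work 40
assert best_chain([chain_a, chain_b]) is chain_b
assert meets_target(b"header", mine(b"header", 16), 16)
```

The crucial property is the asymmetry: checking a proof takes a single hash evaluation, while producing one takes on average 2^difficulty_bits of them, which is what makes rewriting the heaviest chain prohibitively expensive.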

A similarly complex interplay between basic science and practical development has been happening around privacy (especially the so-called zero-knowledge proofs, or ZKPs), account management (for instance, secure multi-party computation to avoid the hassle of managing private keys), Sybil resistance (the development of proof of stake), second-layer technologies (Lightning, Plasma), smart contract verification and other areas. And these research specializations are not isolated silos; they can and do become interconnected in various ways. For instance, ZKPs can also be used for secure multi-party computation, which in turn can be used for consensus. The Boneh-Lynn-Shacham (BLS) signature scheme was, as its name suggests, initially created for digital signatures, but DFinity uses it in its consensus mechanism. Undoubtedly, future historians of science and technology who wish to study the fascinating evolution of the blockchain space, together with the relevant parts of the underlying scientific disciplines, will have their work cut out for them.
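Since ZKPs recur throughout this discussion, a toy example may help. The sketch below, a hypothetical illustration rather than anything used in production, implements the classic Schnorr identification protocol made non-interactive with the Fiat-Shamir transform: the prover demonstrates knowledge of a secret exponent x with y = g^x mod p without revealing x. This is the simplest member of the ZKP family, far removed from the zk-SNARKs behind anonymous cryptocurrencies, and the tiny group parameters are for illustration only.

```python
import hashlib
import secrets

# Toy group: p is a safe prime (p = 2q + 1 with q prime), and g = 4 generates
# the subgroup of prime order q. Real systems use ~256-bit elliptic-curve groups.
p, q, g = 1019, 509, 4

def challenge(*values: int) -> int:
    # Fiat-Shamir transform: derive the challenge by hashing the transcript,
    # removing the need for an interactive verifier.
    data = b"|".join(str(v).encode() for v in values)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def prove(x: int):
    # Prove knowledge of x such that y = g^x mod p, without revealing x.
    y = pow(g, x, p)
    r = secrets.randbelow(q)   # one-time secret nonce
    t = pow(g, r, p)           # commitment
    c = challenge(g, y, t)     # challenge bound to the transcript
    s = (r + c * x) % q        # response; x stays hidden behind the random r
    return y, (t, s)

def verify(y: int, proof) -> bool:
    t, s = proof
    c = challenge(g, y, t)
    # g^s = g^(r + c*x) = t * y^c (mod p) holds iff the prover knew x.
    return pow(g, s, p) == (t * pow(y, c, p)) % p

x = secrets.randbelow(q)       # the prover's secret
y, proof = prove(x)
assert verify(y, proof)        # the verifier learns that x is known, not what it is
```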

However, it is nice to see that even some current participants in the space recognize the nature of the process. In the Coindesk video mentioned above, Aviv Zohar, one of the leading scientists specializing in ZKPs, gave an insightful summary of how this happens in practice:

Zero-knowledge proofs, for example, immediately lead to the creation of anonymous cryptocurrencies, new forms of systems that we haven’t seen before, and they really drive cryptographers to look into more of these things. So, this is really an area where the Bitcoin community which is comprised of developers is really close to the academic community, feeds off of it and feeds it with new interesting problems.

Basic science is needed, too

None of this is to say that basic research in computer science and cryptography is unimportant for the blockchain space. Attempts to ignore it lead to embarrassing episodes such as IOTA's use of its own badly flawed hash function. Adopting new tools too fast may also pose problems, because rigorous research and testing may later reveal them to be flawed. There has also been quite a bit of bickering between academic computer scientists and cryptographers, like Emin Gün Sirer, and practitioners, like some Bitcoin Core developers. One could sometimes even get the impression that each side believes it could do perfectly well without the other.

The reality, rather, is that the boundary between basic research and its application is less clear-cut than it may seem, and that both advance best where there is vigorous interplay between them. Most importantly, no one really has to be in charge of the whole process, or even of its initial stage: like the blockchain space itself, scientific and technological progress is fundamentally decentralized, with all the messy details, blind alleys, bickering and duplication of effort that this undoubtedly creates.

This article was originally published by Proof of News.

Daniil Gorbatenko

PhD in economics (2018) from Aix-Marseille University; independent blockchain adoption consultant based in Aix-en-Provence, France. Email: daniilgor2004@gmail.com