Man jumping without a parachute to check whether he can fly (one of the two assertions is a lie)

Raison d’être — Blockchain protocols are (heavily funded) unscientific research experiments

The first step in solving a problem is recognizing there is one

7 min read · Dec 31, 2018

The rise of protocols

In the past few years, following the realization that Bitcoin posed a credible threat to financial institutions, several projects sprang up to disintermediate traditional behemoths across different domains. Many of these were no different from their centralized counterparts, carrying out an ICO simply to raise funds for a closed ecosystem with payment tokens (I'll be ignoring outright cash-grab scams throughout this publication). The projects that caught most 'enlightened' people's attention, however, were protocols.

Often compared to TCP, these protocol tokens/projects were seen as an opportunity to capture value at the protocol layer instead of the application layer (ref. the fat-protocol thesis). The proposition that open-source protocols could be monetized and use network effects as a moat was, and is, a reasonable theoretical argument. In the past few months, however, protocols have faced an existential crisis as they risk being disintermediated themselves.

Challenges facing protocols

  1. Poor adoption: This is a challenge faced by individual projects rather than the protocol class as a whole. Poor adoption makes betting on any particular protocol, however important the problem it solves today, a risky move, given the possibility of its technology being improved upon by a competing and better team in the near future. Nevertheless, the lack of strong network effects today doesn't invalidate the rationale behind monetizing protocols at a future date.
  2. Disintermediation: Related to the point above, but a threat to the very idea of monetized protocols, is the constant fear of a protocol being encapsulated by its largest user. Reasons include:
    2.1. Decreasing transaction fees: As illustrated by some ICOs on NEO, projects can initially leverage other projects to break into their networks and, once they gain enough momentum, fork the very project that brought them attention, thereby decreasing their gas costs while increasing the perceived and captured value of their own token. Network effects can be gamed.
    2.2. Bloated protocol thesis: As explained here, the difficulty of technically managing several protocol tokens within one's own project's smart contracts is probably a big reason why projects tend to do as much as possible themselves instead of leveraging existing protocols that specialize in particular tasks.
    2.3. Interoperability limitations: This may or may not be as prevalent right now, but it used to be common to see projects with similar goals and technology launching on different platforms to serve the dapps/users of each platform. In such cases, betting on the right platform becomes important.
    2.4. Unproven technology: In these early days, when most protocols have yet to stand the test of time or offer anything usable up front, it's tempting for teams to feel that the work already done by others is sufficiently small, and that they are better positioned to continue working on the problem by forking existing projects or redoing part of the work while benefiting from greater flexibility and customizability.
    2.5. Transitivity of (in)security: There are genuine technical challenges in allowing a protocol (the user) to leverage another protocol (the vendor) for crucial work. The vendor's miners can be bribed by a competitor of the user to screw the user. Thus, especially when the user is larger than the vendor, it's crucial for the user to ensure that relying on the vendor doesn't compromise the security of its own network.
  3. Security: An argument often made in support of blockchains and open-source protocols is the record of hacks of centralized databases. Ironically, centralized systems are probably more secure than several open-source protocols, simply because of the financial incentives and compliance requirements of centralized companies.
    Secure protocols are hard to design and implement. Traditional system designs often go through the academic scrutiny of peer review. Unfortunately, in the blockchain space, as with open-source projects in general, only the top protocols like Bitcoin and Ethereum get enough critical review of their protocol design from volunteers. Other serious projects do pay professional firms for code audits, but their core system design is often at the mercy of a small self-proclaimed expert team who, more often than not, internally recognize their work to be a potentially costly experiment (publicly acknowledged in their whitepaper disclaimers).
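The "transitivity of (in)security" risk in point 2.5 can be made concrete with a toy back-of-the-envelope check. All numbers, names, and the bribe-premium heuristic below are hypothetical, purely for illustration:

```python
# Toy model of "transitivity of (in)security" (hypothetical numbers).
# A large "user" protocol outsources crucial work to a smaller
# "vendor" protocol. A competitor can bribe the vendor's miners,
# so outsourcing is only (naively) safe when corrupting the vendor
# costs more than the value at risk on the user's side.

def corruption_cost(vendor_stake: float, bribe_premium: float = 1.5) -> float:
    """Rough cost to bribe a majority of the vendor's miners:
    their honest revenue (proxied here by stake) times a premium."""
    return vendor_stake * bribe_premium

def is_outsourcing_safe(vendor_stake: float, user_value_at_risk: float) -> bool:
    """Safe only if an attack costs more than it could gain."""
    return corruption_cost(vendor_stake) > user_value_at_risk

# A small vendor securing a much larger user fails the check:
print(is_outsourcing_safe(vendor_stake=5e6, user_value_at_risk=50e6))  # False
# A vendor whose corruption cost exceeds the user's exposure passes:
print(is_outsourcing_safe(vendor_stake=5e6, user_value_at_risk=2e6))   # True
```

The model is deliberately crude; the point is only that the safety condition depends on the relative sizes of the two networks, which is exactly why a user larger than its vendor should be nervous.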

While the first challenge is specific to individual projects, the latter two have a wider impact. We thus focus our attention on them going forward.

Are specialized protocols the future?

Only time can tell whether protocols will be able to capture any value. Similarly, whether the future holds thousands of specialized protocols or one all-encompassing protocol is hard to predict. Bitcoin maximalists have always felt that Ethereum is an unnecessarily risky, insecure and expensive experiment, and that Bitcoin scripts will eventually make feasible, in a secure and well-thought-out way, anything Ethereum can do.

The possibility of protocols encapsulating other protocols cuts both ways. Just as Apple kills the most popular apps on its App Store by incorporating their features, blockchain platforms such as Ethereum can eventually integrate popular dapps (some platforms already come with built-in dexes). On the other hand, unlike mobile or desktop apps, which can't exist without the platform on which they are built, blockchain dapps can fork the platform itself. A related direction is for the most popular dapps to fork the platform and convince miners to accept their own tokens as transaction fees instead of the previous platform token (see EIP 865 and Vlad's post against economic abstraction).
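The economic-abstraction idea above boils down to miners valuing fees in whatever token they are offered. A minimal sketch of that decision rule, with entirely hypothetical token names and exchange rates:

```python
# Toy sketch of "economic abstraction" (hypothetical names and rates):
# a miner who accepts fees in several tokens values each against a
# reference unit, rather than accepting only the platform's native token.

EXCHANGE_RATES = {   # token -> value of 1 unit in a reference currency
    "PLATFORM": 100.0,
    "DAPP": 2.5,
}

def fee_value(token: str, amount: float) -> float:
    """Value of an offered fee, converted to the reference currency."""
    return EXCHANGE_RATES[token] * amount

def accepts(token: str, amount: float, min_fee: float = 1.0) -> bool:
    """A rational miner includes a transaction if its fee clears the
    threshold, regardless of which token the fee is paid in."""
    return fee_value(token, amount) >= min_fee

print(accepts("PLATFORM", 0.02))  # True  (0.02 * 100.0 = 2.0)
print(accepts("DAPP", 0.2))       # False (0.2 * 2.5 = 0.5)
```

Once miners price fees this way, the platform token loses its monopoly as the unit of payment, which is precisely the threat the paragraph above describes.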

While we can’t predict the future, we can make a good case for why there should be thousands of specialized protocols.

  1. Hardware and location: Different use cases may require different hardware. In some cases, nodes may have to be provisioned in specific locations. Instead of one single protocol complicating its system with different node types for different roles, it seems a much cleaner solution to have different protocols that solve specific tasks, each with a mining community whose hardware suits those requirements.
  2. Specialized skills: For better or worse, there are a limited number of people in computer science/economics specialized in any given subfield, fewer still within the blockchain community. It makes more sense to have them devote their attention to perfecting one single purpose-built protocol than to have them spread across projects.
  3. Simple and stupid: Purpose-built protocols are easy to reason about at the design stage, don't have conflicting token incentives, and are simpler to formally verify and maintain. The more bloated a single protocol is, the harder it is to navigate and the harder it is to attract specialists in a particular area (hobbyists/volunteers) to look at specific modules (higher cognitive load, and weaker personal bonds to the project due to a far larger and more diversified community).
  4. Incentives: Continuing the point above, the more diverse/all-encompassing a protocol tries to become, the more spread out its token is in terms of both holders and backers. While it can be argued that this diversity helps derisk the token, the flip side is that not only is its value harder to reason about, but strong supporters (developers/investors) of a token for one particular use case are exposed to the business/development risks of the token's/protocol's other use cases. They are better off investing their time and money in what they truly believe in, and buying other tokens to hedge/derisk themselves.
  5. Reusability: Ultimately, purpose-built protocols offer a better abstraction for reuse. If protocols follow a monolithic architecture, they are no different from the Googles/Microsofts of today. If they break their own protocol into different sub-protocols to allow reusability, they may as well team up with the protocols that compete with those sub-protocols.

The guiding claws of CryptoFlaws

A protocol explosion has been, and most likely will continue to be, led by teams featuring a mix of novices and experts. Ethereum started with a culture of move fast and break fast, until the DAO fiasco and the ensuing hard fork made its maintainers realize the cost of half-baked systems. While Ethereum has been slowed (for good?) by its past experience and the large community it needs to placate, several comparatively smaller projects are trying to rise the way Ethereum once rose against Bitcoin. The reasons that allow them to claim to be progressing faster than Ethereum are strikingly similar: poorly thought out, half-baked solutions and the lack of a strong developer community imposing checks and balances, allowing small teams to make decisions swiftly.

As elucidated earlier, experience shows that due diligence on the system-design front in the crypto space has been pretty poor. At the dawn of the crypto bubble, retail investors relied on brand-name VCs/advisory firms/accelerators for technical due diligence. As the bubble burst and the all-weather, long-term crypto funds disappeared or converted into quant funds (slightly related news: the OTC pre-listing of strongly backed Origo, Aergo etc.), it became apparent that they had little insight into, or even interest in, the technicalities and were simply relying on their brand names to hype projects to the retail crowd (greater fool).

Few VC firms have the technical team to carry out credible technical due diligence (a16zcrypto, for instance, does). There are a number of community-run sites that do a great job of explaining whitepapers to laymen. CryptoBriefing has been pretty consistent in technically critiquing projects, especially with regard to their code bases. Looking at their reviews, however, it seems that even they aren't particularly experienced in finding flaws in system design, and several times they take whitepapers' details on trust. But the devil is in the details, and appealing to authority isn't an option. Journals wouldn't publish the most-cited professor's papers without blind review (and their papers do get rejected!), so why should you invest millions without an equally rigorous review? IMO, the smarter an author is, the smarter the reviewer needs to be to critique intelligently.

The goal of this publication is simple: to publish flaws (not well-balanced reviews) in the design of blockchain projects, flaws that risk crippling them at a much later date (attackers are incentivized to attack only when the stakes are high). The hope is that this will not only help readers make more informed investment decisions but also help projects that mindlessly copy learnings from other projects (for example, the dPoS or masternode craze) become more aware of the cons of such design choices.

Follow us on Twitter to learn about our first target.
