Evolving the Protocol

Matthew Werner
9 min read · Jun 20, 2019


“Why are you so excited about this so-called ‘Internet Money’?”

This is a question I get a lot from people who have a hard time wrapping their heads around crypto and how it will impact the Internet. They see crypto as an unnecessarily complex alternative to Venmo or Square and don't see any reason to leave their walled banking garden.

With Libra recently announced, the conversation has come up again. Fred Wilson reiterated his view of the role cryptocurrency will play:

When USV invested in Coinbase in early 2013, our rationale was that digital currencies and digital assets (like Bitcoin and beyond) were a breakthrough technology, similar to TCP/IP, HTTP and SMTP. But we also knew that it would take significant investment in the surrounding infrastructure to make them useful for businesses and consumers, just like it did with the Web back in the 80s and 90s.

Cryptocurrency marks a turning point for the Internet. It allows us to make high-value functionality natively and ubiquitously available within the protocol itself. To understand why, let's take a brief look at how protocols have changed over the past few decades, what constraints we've hit, and how cryptocurrency changes everything.

Booting the Internet

It's important we begin by looking at the Internet's Transmission Control Protocol (TCP). TCP provides functionality that is fundamental to the Internet: not just enabling computers to communicate with one another, but achieving a set of basic goals while doing so:

  • Network connectivity. Any network could connect to another network through a gateway.
  • Distribution. There would be no central network administration or control.
  • Error recovery. Lost packets would be retransmitted.
  • Black box design. No internal changes would have to be made to a network to connect it to other networks.

If any one of these tenets didn't hold for TCP, it would not be the ubiquitous backbone it is today. Imagine if a Florida business that wanted to connect to the Internet had to run a line to New York. Suppose changes had to be made to the core protocol configuration to accommodate each new network participant. The technology would not have been flexible enough to accommodate the expansion we've seen in the years since; TCP serves as a great example of the power of getting the technology right.

TCP, for all intents and purposes, was the advent of decentralized technology. You were able to connect to any part of the network and reliably communicate with other members of the network with no central organization facilitating your participation.
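
To make this concrete, here is a minimal sketch of the reliable, connection-oriented byte stream TCP gives any two machines, using Python's standard socket API. The loopback address, port choice, and message are arbitrary illustration details:

```python
import socket
import threading

def echo_server(sock):
    """Accept one connection and echo whatever bytes arrive."""
    conn, _addr = sock.accept()
    with conn:
        while data := conn.recv(1024):  # TCP handles retransmission,
            conn.sendall(data)          # ordering, and flow control

# Listen on an OS-assigned port; no central authority is consulted.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen()
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

# Any peer that can reach this address can open a reliable stream to it.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, internet")
    print(client.recv(1024))  # b'hello, internet'
```

Neither endpoint registered with anyone; the network's job is simply to deliver the bytes, in order, or retransmit until it can.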

Building on our transmission-controlled network

TCP has done its job well; it reliably transmits messages over a network. In fact, it has seen only minor changes since its v4 specification in 1981 (RFC 793). Message passing is limited in its functionality though, and almost twenty years after TCP arrived on the scene, a new protocol emerged to satisfy an interest in doing more with this technology.

In 1989, Sir Tim Berners-Lee, while working at CERN, wrote a proposal for the "Mesh": a new protocol that allowed entire documents to be transmitted over TCP. Mesh would later be renamed to the catchier World Wide Web. If TCP was the advent of decentralized technology, HTTP was the advent of the consumer Internet.

The introduction of HTTP didn't compromise the decentralization of TCP. Anyone in their home can start a web server and serve documents directly to other requesting computers. We do see some centralization around web hosts, due to the difficulty of operating reliable servers in the everyday household, but the technology itself isn't centralizing. TCP and HTTP do not depend on a central organization to facilitate their use.
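
As a sketch of how low that barrier is, Python's standard library can serve a directory of documents over HTTP in a few lines (the port is an arbitrary choice):

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Serve whatever documents sit in the current directory. Any machine
# that can reach this one can now fetch them with a browser or curl;
# no central organization mediates the exchange.
HTTPServer(("", 8000), SimpleHTTPRequestHandler).serve_forever()
```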

To move beyond simple document retrieval, we have to discuss the limitations of these protocols and how we overcame these limitations.

Bringing applications to the web

There is an idea of "state" in computer science: the notion that your actions while interacting with a given piece of technology persist between uses. Remembering your past interactions with an application unlocks massive amounts of utility. You can build applications that keep track of what you had previously done, authenticate access to specific data, and use that remembered context to perform more useful functionality.

Gmail providing an early web application. Image: matusiak.eu

This evolution of HTTP makes sense: we want to do more with the World Wide Web than simply fetch unchanging documents. By layering state on top of HTTP, we can realize a dramatic improvement in utility while reaping all the network benefits provided by the Web. The question, then, is why this functionality wasn't provided as an evolution of the protocol itself.

HTTP was introduced to provide new functionality to TCP without compromising decentralization. It was state persistence that required a centralized application layer: there was no decentralized way to provide a shared data layer without moving that functionality under the purview of the organization building the application.

For example, Amazon, as early as 1998, persisted your interactions with its website to improve your experience when shopping there. It gave you the ability to save credit card information, view past orders, and keep shopping lists. Amazon is able to control who has access to which data and can ensure no malicious user is tampering with another user's data, guaranteeing the integrity and security of the state stored on top of HTTP. Without Amazon providing this security, anyone else using the network could access and change your data. This level of utility could not have been provided without the stateful application layer.
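
A minimal sketch of the pattern (a toy session store, not anything Amazon actually runs): the user holds only an opaque cookie, while all state lives on the operator's server, so every read and write is mediated by the operator:

```python
import uuid
from http import cookies
from http.server import BaseHTTPRequestHandler, HTTPServer

# All state lives in this server-side store; users hold only a cookie.
SESSIONS = {}

class StatefulHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        jar = cookies.SimpleCookie(self.headers.get("Cookie", ""))
        session_id = jar["session"].value if "session" in jar else None
        if session_id not in SESSIONS:
            session_id = uuid.uuid4().hex        # issue a new identity
            SESSIONS[session_id] = {"visits": 0}
        SESSIONS[session_id]["visits"] += 1      # operator mediates state
        self.send_response(200)
        self.send_header("Set-Cookie", f"session={session_id}")
        self.end_headers()
        self.wfile.write(f"visit #{SESSIONS[session_id]['visits']}".encode())

HTTPServer(("127.0.0.1", 8080), StatefulHandler).serve_forever()
```

HTTP itself carries no memory between requests; everything the user "owns" here exists only at the operator's discretion.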

The amount of application development on the web speaks to the potential value of building on the TCP and HTTP protocols. To access the latent value in these protocols, we built centralized applications rather than finding a way to safely store state in a decentralized way. It was the need to use data safely that forced web technologies toward centralized solutions.

Our centralized web

Protocol development has slowed as the lion's share of new functionality has been built in the application layer. This is due both to the ubiquity of our applications' dependence on state, given the constraints described above, and to the economics: the return on investment in applications has far exceeded that of protocols.

Around the turn of the century, we started to see much more sophisticated functionality introduced to our web applications. Under the banner of Web 2.0, we saw the rise of a myriad of new technologies that improved the functionality of the application layer. The rise of JavaScript, a plethora of web frameworks, object-relational mapping, and a host of other concepts were brought to the web. All of these advancements focused on moving and storing data faster and more reliably. Our application layer has become extremely fine-tuned to manage state.

Managing the data of Internet users has realized enormous value, giving rise to the FAANG tech behemoths. Many of these companies' products are free to use, made possible by leveraging their users' data. I'm not arguing this is an inherently bad thing, just recognizing the amount of value realized by building reliable stateful applications on top of HTTP. The utility made possible by these applications has made our society more connected, productive, and capable.

It's no wonder web technologies have received so much investment in the twenty years since the application layer started to take shape. Software companies have been eating industry after industry by making use of the transformational technology the original protocols provide. There hasn't been nearly as much money to be made in developing open protocols, but that might all change with the introduction of cryptocurrency. Albert Wenger wrote about this dynamic back in 2016:

I can’t emphasize enough how radical a change this is to the past. Historically the only way to make money from a protocol was to create software that implemented it and then try to sell this software (or more recently to host it). Since the creation of this software (e.g. web server/browser) is a separate act many of the researchers who have created some of the most successful protocols in use today have had little direct financial gain. With tokens, however, the creators of a protocol can “monetize” it directly and will in fact benefit more as others build businesses on top of that protocol.

If we were able to store state natively in the protocol, would that be a desirable advancement?

Evolving the Protocol

Cryptocurrencies offer an evolution of our technologies' capabilities in the same way HTTP brought us the World Wide Web. The ability to build stateful functionality directly into the protocol itself, without any central organization securing or storing the data, is a major breakthrough that will fundamentally change both the capabilities of the technologies we build and the business models available to the builders.

Joel Monégro described this shift back in 2016, calling these "Fat Protocols". By introducing a shared data layer natively in the protocol, we can satisfy the dependencies of many applications. This could mean application-like functionality becoming as readily available as the Internet itself. The transmission of data from one computer to another is so fundamental to the Internet that we don't distinguish between the two. In the same way transmitting data is native to the protocol, we will see higher-level functionality become native to the protocols we use in the future.
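
As a toy illustration of a shared data layer (not any real protocol's design; signatures and consensus are deliberately omitted): if every participant replays the same ordered transaction log, they all arrive at identical state, with no central server holding it:

```python
import hashlib
import json

def apply_tx(balances, tx):
    """Apply one transfer to a copy of the balance map."""
    state = dict(balances)
    state[tx["from"]] = state.get(tx["from"], 0) - tx["amount"]
    state[tx["to"]] = state.get(tx["to"], 0) + tx["amount"]
    return state

def replay(log):
    """Derive state and a commitment hash purely from the shared log."""
    state, digest = {"genesis": 100}, hashlib.sha256()
    for tx in log:
        state = apply_tx(state, tx)
        digest.update(json.dumps(tx, sort_keys=True).encode())
    return state, digest.hexdigest()

log = [{"from": "genesis", "to": "alice", "amount": 40},
       {"from": "alice", "to": "bob", "amount": 10}]

# Two independent "nodes" replay the log and agree on the state.
assert replay(log) == replay(log)
print(replay(log)[0])  # {'genesis': 60, 'alice': 30, 'bob': 10}
```

The state belongs to the protocol's rules, not to an operator's database; anyone holding the log can verify it.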

Native availability will drive investment

As the cryptocurrency industry matures, we'll see developers drive progress akin to the dot-com boom or the mobile gold rush. The economics of introducing new native functionality to the protocol are attractive: a team can tackle any number of established high-value applications, move them to a protocol, and be compensated well for their work through their token offering.

Fred Wilson discussed the economics of protocol development in an article posted in 2016 that called the ICO craze of 2017:

In this emerging model, Twitter could have adopted a protocol-based approach and issued a crypto-token, Twokens, that users could earn from things like amassing followers, reporting abuse, etc. Twokens could also be sold by the Twitter founding team to finance their operations. Crypto-exchanges could make a market in Twokens so that anyone who wanted to speculate on the future value of the Twitter protocol could do so.


This new model isn't without its flaws, though. There can be a problematic incentive structure for development teams that have already launched their asset. Tokens issued to interact with the protocol could be deemed securities by regulatory bodies. And because developers are compensated in the protocol's own asset, the value they've attained can be volatile whenever a third party decides to fork the network.

Open protocol development will only accelerate

Forking an open cryptocurrency protocol is a complex and relatively novel concept in the history we've covered here. Forking has been a critical part of open source development: if a piece of software can be implemented better and someone has an interest in improving it, they can "fork" the software, copying and modifying it. Traditionally, a fork doesn't affect the functionality of the existing software.

Forking will be even more important in open protocol development. There is a financial incentive to executing better on a valuable protocol, and developers will benefit directly from their work improving the applications we use. This will drive far more involvement than traditional open source development and will accelerate the evolution of these protocols.

The web applications we use today are a workaround to the shared data layer problem. Equipped with cryptocurrency, developers now have a way to build high-value functionality directly into the protocol and to be properly compensated by the protocol they create.


Matthew Werner

Former Head of Crypto Engineering at Coinbase. Ex-Zendesk.