Community Aggregation Theory 🐈
Exploring how the valuation models fail to account for protocol commoditization
This post explores how the valuation models used to evaluate decentralized service protocols leveraging “work” tokens fail to account for protocol commoditization. When this effect is considered, protocol tokens appear able to capture only a small portion of the value a protocol creates for consumers, because as token prices rise, these protocols become inherently less efficient. This relationship between token price and protocol utility reveals a fundamental misalignment between the interests of investors and consumers. In contrast, effectively governed community currencies can sustainably align those interests through the aggregation of thin protocols, tools, and interfaces that enhance the utility and value of the community and its native currency.
Commoditization in Decentralized Service Protocols
In my post on governance and network effects I pointed to decentralization as a catalyst for commoditization because decentralized applications are being built on commoditized hardware, open source software, and public data. Some of the most interesting and compelling blockchain applications are decentralized service protocols, which facilitate commoditized markets for resources (database, storage, computation) and services (oracles, curation, sharing economies, exchanges). However, somewhat ironically, these protocols are typically valued without considering the effect of commoditization on token price.
The token associated with a decentralized service protocol may sometimes be called a “work” token, “staking” token, or “skin-in-the-game” token, because the protocol uses tokens as a means to grant a service provider the privilege of providing services on behalf of a network in exchange for fees. The token restricts participation as a service provider to only those that stake tokens, and the protocol may implement slashing conditions that provide an economic incentive for service providers to adhere to a specified quality of service. The result is a network of interchangeable service providers that compete to provide a commoditized resource or service to consumers.
Many investors are using the NPV model to value these types of tokens. They reason that as demand for the service grows, rational suppliers will be willing to pay a higher price for a proportional share of service rights on the network. The NPV models I have seen assume suppliers earn greater-than-normal profits and that tokens increase in value linearly with fee volume until the protocol reaches its total addressable market, assumptions that seem unrealistic in a commoditized market.
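To make the critique concrete, here is a minimal sketch of the kind of NPV model described above. Every number (fee volume, growth rate, margin, discount rate, supply) is hypothetical, and the function is my own illustration rather than any published model; the point is that under these assumptions the implied token price scales directly with fee volume.

```python
# Sketch of an NPV-style "work" token valuation. All inputs are
# hypothetical illustrations, not market data or a real model.

def npv_token_value(annual_fees, growth, margin, discount, years, supply):
    """Discount projected supplier profits and divide by token supply."""
    value = 0.0
    fees = annual_fees
    for t in range(1, years + 1):
        fees *= (1 + growth)      # fee volume assumed to compound toward TAM
        profit = fees * margin    # assumes suppliers keep above-normal margins
        value += profit / (1 + discount) ** t
    return value / supply

# Doubling fee volume doubles the implied token price under this model,
# which is exactly the linear relationship the commoditization argument
# says should not hold:
implied_price = npv_token_value(1e6, 0.5, 0.3, 0.2, 10, 1e7)
```

If marginal revenue converges to marginal cost, the `margin` term collapses toward zero and the model's output collapses with it, regardless of fee growth.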
In a commoditized market, marginal revenue for suppliers will rapidly converge with marginal cost and all suppliers that actively participate can expect to earn only normal profits. In order for a service protocol to remain competitive relative to competing decentralized protocols (including forks), the token needs to flow to the most efficient suppliers more rapidly than within competing protocols. One might assume that the most efficient suppliers will be willing to pay more for tokens than less efficient suppliers, but because acquiring tokens is simply an artificial capital cost, the most efficient suppliers will generally be the ones that pay the least for tokens. As token prices increase, the artificial capital overhead for suppliers will become more pronounced until a subset of suppliers justify selling their tokens and forking the protocol to undercut all other suppliers.
Taken to the extreme this process leads to an equilibrium where tokens are valuable enough to ensure that the protocol’s incentives are able to effectively enforce a consistent quality of service, while never becoming so valuable that the token becomes an efficiency drag on suppliers that results in a competitive protocol fork. This relationship effectively bands the feasible token price for a decentralized service protocol between a floor and a ceiling, regardless of the growth of demand for the service.
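The floor-and-ceiling argument can be sketched with toy arithmetic. Both bounds and all parameter values below are hypothetical assumptions of mine: the floor supposes the slashable stake must exceed the profit from misbehaving, and the ceiling supposes suppliers fork once the annual cost of capital tied up in tokens exceeds the margin they could reclaim.

```python
# Hypothetical sketch of the feasible token price band described above.

def price_band(attack_profit, staked_tokens, supplier_revenue,
               normal_margin, capital_rate, total_supply):
    # Floor: the value of slashable stake must exceed the profit
    # a supplier could earn by cheating.
    floor = attack_profit / staked_tokens
    # Ceiling: the annual opportunity cost of capital locked in tokens
    # must stay below the margin suppliers could reclaim by forking.
    ceiling = (supplier_revenue * normal_margin) / (capital_rate * total_supply)
    return floor, ceiling

floor, ceiling = price_band(
    attack_profit=5e5,      # profit from one successful attack
    staked_tokens=5e6,      # tokens at stake
    supplier_revenue=1e7,   # annual fee volume
    normal_margin=0.05,     # margin reclaimable via a fork
    capital_rate=0.1,       # suppliers' cost of capital
    total_supply=1e7,
)
```

Note that demand growth raises `supplier_revenue` and so lifts the ceiling only modestly, while consumer loyalty (the next paragraph) determines how close to that ceiling a price can sustainably sit.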
The band represents the speculative potential of the token, and its size is proportional to the loyalty of the protocol's consumers. If consumers are disloyal the speculative band will be thin, whereas if consumers are especially loyal then the protocol may be able to sustain a higher degree of supply-side inefficiency related to high token prices. As a consequence, consumers should see high token prices as a signal that a decentralized service protocol may be operating inefficiently, and investors should not assume growth that is linear in adoption relative to TAM, but should instead base valuations on an estimate of consumer loyalty to the specific protocol.
Teemu Paivinen wrote about this effect on protocols last year as an evolution in thinking around Joel Monegro’s early Fat Protocols thesis. He writes in relation to competitive market pressures and forking:
“As these forces push the industry towards more specialisation and forking allows almost unlimited competition, less the anti-competitive information and data advantages of the traditional technology industry, it would seem that protocols can only get thinner.”
The current trend from centralized services to decentralized service protocols is undeniably good for consumers who benefit greatly from the commoditization and disintermediation of value creation that comes at the expense of speculators and individual firms that might otherwise be able to secure more defensible market positions. Unfortunately, most of this protocol development may fundamentally be a public good, and public goods have historically not been particularly sound private investments.
If protocols get progressively thinner they may prove unable to capture venture scale returns, and as a result may fail to receive significant venture investment. In order to ensure the decentralization revolution continues full steam ahead, it’s imperative that we ensure that investors’ and consumers’ interests are aligned across the short, medium, and long term.
A simple way to align consumers and investors is for the investor to speculate on the consumer product itself, so long as the product is fractionally divisible. The obvious example of this is Bitcoin and other crypto assets that are valued as currencies rather than as commodities or work tokens. When speculators are purchasing the same asset as consumers, it creates a healthy community where investors and early adopters have an incentive to build infrastructure and related public goods that support and grow the utility of the asset. Unlike decentralized service protocols, price appreciation doesn’t adversely impact the efficiency or utility for the consumer.
Community currencies compete by differentiating themselves based on technical trade-offs, overall utility, political ideology, memetic value (DOGE! 🐶), and various other economic risk and reward vectors. These factors are reflected within a community’s governance processes and monetary policies. Currently, the most significant community currencies are deeply integrated with their respective base layer protocols. This approach constrains the economic design space because the currency’s monetary policy must prioritize the security of the underlying protocol rather than the utility of the currency. The result is a less efficient protocol and a sub-optimal community currency.
To illustrate this point, we can look at Ether which is valued based on its functionality as a currency as well as its future utility as a staking token. Ether’s dual use cases present an optimization challenge. If the Ethereum community chooses to adopt a design intended to make the supply of ETH illiquid it can improve security by making it more difficult to acquire large quantities quickly — even if the equilibrium token price is low. Since staking tokens represent a capital cost to participate in block production, a protocol that is able to provide the same level of security while minimizing token prices will be more cost effective for consumers. This isn’t a wildly new idea, as this approach to optimization is already being adopted by Cosmos. On the other hand, Ether’s utility as a currency is enhanced if supply is highly liquid. The rub is that Ether can either be optimized on this axis to promote cost effective security of the base layer (becoming a thin protocol) or promote usability as a community currency, but may struggle to do both at the same time.
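The cost-effectiveness claim reduces to simple arithmetic: the security bill consumers ultimately pay is the opportunity cost of the capital locked in stake. The figures below are hypothetical, as is the premise that an illiquid-supply design can deter the same attacks with a smaller dollar value at stake.

```python
# Hypothetical: the annual capital cost consumers bear for staked security.

def annual_security_cost(staked_value_usd, required_return):
    """Opportunity cost of capital locked in stake, per year."""
    return staked_value_usd * required_return

# A liquid-supply design might need $100M at stake to deter attacks,
# while an illiquid design (as in the Cosmos-style approach mentioned
# above) might achieve comparable deterrence with less value at risk,
# because large quantities are hard to acquire quickly.
liquid_cost = annual_security_cost(100e6, 0.1)
illiquid_cost = annual_security_cost(40e6, 0.1)
```

Whichever design minimizes this figure for a given security level is the more cost-effective base layer for consumers.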
Proof of Work is a possible solution to this problem, because its economic security is not directly dependent on monetary policy. However, compared to Proof of Stake, Proof of Work is incredibly expensive. If Proof of Stake proves to be feasible, community currencies that are built on top of a more cost effective base layer will have a huge advantage. Regardless of where you stand on the PoW or PoS debate, we can clearly see that separating the protocol layer from the currency layer allows for the creation of more compelling community currencies. In practice, this could mean that it is worthwhile for Ethereum to consider splitting ETH into a staking token and a fixed supply currency, or simply embracing economic abstraction by allowing gas and other fees to be paid in many different tokens.
Specialized community currencies parallel earlier thinking about application-specific medium of exchange tokens, which I and others have criticized in the past because they don’t reliably capture value for investors due to the velocity problem, and generally create a bad user experience. The key distinction with a community currency is that instead of suppliers of a service protocol enforcing an arbitrary medium of exchange, community currencies use monetary policy and governance to create an asset which caters to the specific preferences, values, and use cases that matter to a community of consumers.
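The velocity problem mentioned here is usually framed with the equation of exchange, MV = PQ: the market cap M a token can support equals annual transaction volume PQ divided by velocity V. The numbers below are hypothetical, but they show why a pure medium-of-exchange token that users immediately sell supports little value, while a currency a community actually holds supports far more.

```python
# Equation-of-exchange sketch of the velocity problem (MV = PQ).
# All inputs are hypothetical illustrations.

def required_market_cap(annual_volume_usd, velocity):
    """Market cap needed to support a given transaction volume."""
    return annual_volume_usd / velocity

# Same $1B of annual volume, very different supportable value:
hot_potato = required_market_cap(1e9, 50)    # token sold immediately after use
held_currency = required_market_cap(1e9, 5)  # currency held as savings
```

Governance and monetary policy that give consumers reasons to hold the asset lower its velocity, which is precisely how a community currency escapes the trap an arbitrary medium-of-exchange token falls into.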
Community Governance and Aggregation Theory
This distinction can be framed in the context of Aggregation Theory, where value is captured by nurturing the loyalty of consumers through the systematic aggregation and commoditization of suppliers. With an established and loyal community, any protocol that is running inefficiently can be forked in order to create net value for the community. Some investors may be expecting decentralized service protocols to operate inefficiently due to inherent network effects or switching costs. However, a large community of consumers may be able to overcome any incumbency advantage the protocol may have. That is the power of a strong and fiercely loyal community, and why I suspect value will tend to accrue primarily in the currency-governance layer rather than in any other protocol layer.
By combining the currency use case with governance, communities will have a huge advantage in effectively aggregating the economic activity of the ecosystems that develop within and around them. If a community desires, governance can be used to socialize the cost of provisioning public goods through inflation or transaction fees, including the systematic forking of “thin” protocols that have grown too “fat”, funding the creation of novel protocols which commoditize new markets, or building applications and tools which further enhance the utility of the community’s currency.
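The inflation-funding mechanism above is straightforward arithmetic; a quick hypothetical sketch shows the scale of the budget a community could socialize this way (all figures are illustrative assumptions).

```python
# Hypothetical: funding public goods through protocol inflation.

def annual_public_goods_budget(total_supply, inflation_rate, token_price):
    """USD value of newly issued tokens earmarked for public goods."""
    return total_supply * inflation_rate * token_price

# e.g. a 100M-token supply inflating 2% per year at a $2 token price:
budget = annual_public_goods_budget(1e8, 0.02, 2.0)
```

The same budget could fund forks of protocols that have grown too “fat”, new protocols that commoditize new markets, or tooling that enhances the currency's utility, as described above.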
Since there are many communities that represent different values and ideologies, the community currency use case is not winner take all. Some currencies may choose to support fund recovery mechanisms, some may trade higher transaction costs for improved anonymity and fungibility, some may optimize for better wealth distribution and adoption through UBI, and some may seek to maximize price stability. As interoperability and decentralized exchanges reduce the cost and friction of more frequent currency conversions it will become easier for community currencies to differentiate by optimizing for their community’s unique values.