The externalities of blockchain protocols

Drew Stone
Commonwealth Labs
Dec 10, 2018

When you dive into the mechanics of blockchain protocols, hoping to understand the inner workings of this complex system, you somehow find yourself researching an even broader set of topics. Depending on the angle of your approach, you may stumble upon anything from common-pool resources to the dynamics of multi-agent systems.

The economic modeling of blockchains is undeniably an interesting topic. At a high level, the blockchain is a multi-agent economic network that is constantly changing. Parameters for the block size, difficulty, and cost of transacting are constantly evolving through obscure voting and aggregation procedures. Each of these parameters triggers a cascade of reactions across the market of participants.

The price war over getting transactions included in blocks and processed throughout the whole network is rooted in a two-sided common-resource problem. On one side, users must compete for transaction space in a block; on the other, miners decide how large blocks should be and selectively include the transactions they want. On most existing platforms, users participate in a first-price auction to get transactions into blocks, attaching fees that are paid in full upon inclusion. Miners, however, can act in a variety of ways; for example, they can take bribes for priority access to available transaction space, or act altruistically, strategically, or even maliciously towards the users and the global network.
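
As a minimal sketch of this first-price dynamic, here is a toy model of an honest, revenue-maximizing miner filling a block; the names, the fee-per-byte ordering, and the byte-size model are illustrative assumptions, not any particular protocol's rules:

```python
# A toy first-price inclusion auction: users attach fee bids, and an
# honest, revenue-maximizing miner greedily fills the block by fee
# density. All names and the size model are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Tx:
    fee: float  # the user's bid; in a first-price auction it is paid in full
    size: int   # bytes of block space the transaction consumes

def select_transactions(mempool: list[Tx], block_limit: int) -> list[Tx]:
    """Greedily include the highest fee-per-byte transactions that fit."""
    included, used = [], 0
    for tx in sorted(mempool, key=lambda t: t.fee / t.size, reverse=True):
        if used + tx.size <= block_limit:
            included.append(tx)
            used += tx.size
    return included
```

A strategic miner can of course deviate from this baseline, for example by accepting side-channel bribes for priority or by censoring transactions outright.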

In addition, blockchain protocols rely on validating transactions over the entire network. This is accomplished by having every miner validate every transaction, a wasteful but important process that ensures safety throughout the consensus process. The size and contents of a block therefore induce even more externalities on the global network: a block-proposing miner's actions induce externalities on all other miners and validators in the system.

Definition: An externality is a cost or benefit that affects a party who did not choose to incur that cost or benefit. [1]

Common pool resources

We will begin by deconstructing some of the market mechanisms that blockchains exhibit, hoping to build a robust foundation for optimizing the network. Let’s start with a concept that underlies this complex economic system, a common pool resource.

Definition: A common pool resource is a type of good consisting of a natural or human-made system, such as a fishing ground, whose size or characteristics make it costly, but not impossible, to exclude potential beneficiaries from obtaining benefits from its use. [2]

A fishing ground is a good example of a common pool resource. Since there is a finite stock of fish in the world, every fish caught leaves one fewer fish for the rest of the world's fishermen, imposing a cost on everyone else by making fish potentially harder to catch. This property makes fish a rivalrous good.

Definition: A good is rivalrous if its consumption by one consumer prevents the consumption by another consumer. [3]

Similarly, we can extend this to the blockchain ecosystem. If we are talking primarily about blocks in a blockchain — the data structure that contains transactions — then block space, or equivalently the ability to add transactions to a block, is a common pool resource. If space in a block is consumed by one transaction, there is less space available for the rest of the users who wish to include theirs. This makes block space rivalrous and, as such, its consumption induces a negative externality on the world of users.

Furthermore, given a block limit L, there exists a well-defined common-resource problem. As space in a block fills up, the cost of transacting increases. Similarly, if L decreases, the resource faces more pressure because less space is available. The marginal cost of transacting should rise as this happens, and fall with each additional unit of available space as L increases.
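
To make this concrete under the stylized assumption that all transactions have unit size (an assumption made here for illustration), the marginal (clearing) fee under first-price inclusion is just the L-th highest bid, which can only rise as L shrinks:

```latex
% Stylized unit-size model: sort the users' bids f_{(1)} \ge f_{(2)} \ge \dots
% The marginal fee needed to claim the last slot at limit L is the L-th
% highest bid, so tightening the limit raises the marginal cost of space.
p^*(L) = f_{(L)}, \qquad L' < L \;\Longrightarrow\; p^*(L') \ge p^*(L)
```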

It might seem sensible, then, to set L really high, since as L grows we process more transactions. This turns out not to be the right solution, because we have yet to introduce the effect that changing the block size or limit has on the network: the negative externality introduced by pollution.

Pollution

Pollution, in the blockchain sense and more generally in all distributed systems, is related to congestion. When too much information is being spread around a network, nodes that cannot process it quickly enough will fall behind. If the processing of information is time-sensitive, the cost of participating in the network will grow and consequently push weaker nodes out of the system.

On a blockchain, this translates to the effect that processing large blocks has on smaller nodes and validators. From UTXO blockchains to smart-contract blockchains, different transactions have different sizes, different execution paths, and decidedly different processing times; there even exist transactions of the same size with different validation times. Every permutation along these lines exists.

Traditionally in the real world, we solve pollution problems with taxes and subsidies; that is, charges and payments issued by some central authority to push individuals and corporations to do the following (the textbook correction is restated after this list):

  1. Produce fewer pollution-creating goods.
  2. Produce technology that reduces pollution, by replacing the good or improving the pollution costs of producing the good.
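
The textbook form of this intervention is a Pigouvian tax, stated here for reference: the planner charges producers the marginal external damage so that private cost aligns with social cost.

```latex
% Pigouvian tax: with marginal private cost MPC, marginal external
% damage MD, and marginal social cost MSC = MPC + MD, setting the tax
% t = MD(q^*) at the socially optimal quantity q^* makes producers
% face the full social cost of production.
MSC(q) = MPC(q) + MD(q), \qquad t = MD(q^*)
```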

This leads us to a few outcomes with respect to the block size and contents:

  1. We subsidize producers of small blocks.
  2. We tax producers of large blocks.
  3. We innovate on the storage, processing, and/or pricing of transactions.
  4. Quantum computers HA!

Ultimately, networks experience both positive (network growth) and negative (pollution) network externalities, and these externalities have second-order effects of their own. Network growth compounds on itself as capital and technological innovation flood into the network. Pollution leads nodes to stop participating when the costs outweigh the benefits, decreasing the utility generated by the entire system. Without any formal targets over network outcomes, it remains difficult to select an optimal market configuration.

A rough economic model of externalities

Consider a blockchain model with a fixed capacity L for the block size. As with all blockchain protocols, we will analyze the execution of the protocol round by round: at each round, a new block is published. Each round has a target execution time T; we assume the nodes of the underlying distributed system have loosely synchronized clocks, so that rounds occur in expectation every T time units. We also assume access to the unbiased, untampered processing or validation times of each round's block for every node in the distributed system. We represent this set by a vector: block_times(r) = (t(1), t(2), …, t(N)), where r represents the round and t(i) represents the time it took node i to process the block of round r.

Note that a node i in the network will cease to participate when the cost of validating new blocks is larger than the utility it derives from doing so. Nodes not currently in the network will not join if the cost of validating the entire history of blocks — the blockchain — is larger than the utility they gain from being a participant.

Model

We start by formalizing a simple model:

  • There are N participants who all process the blockchain with no delay.
  • Each participant i in N has some utility function U(i,B), defined as the difference between the value they derive from processing the blockchain B and the cost associated with its validation.
  • Every round r is composed of two stages. The first stage of round r consists of publicizing block_times(r-1); the second consists of publishing the r-th block to B.
  • All participants are homogeneous, meaning they run the same hardware, though possibly scaled up in computational power. We assume, then, that a participant's capacity is drawn from some (potentially unknown) distribution D. We denote an agent's type by its index in the type vector T.
  • Under the randomness of the distribution D, we define each participant’s capacity as a function that maps block size/limits to agents’ validation times. Given a limit L and an agent i, this can be defined as C(i,L)=c(t) for t~D.
  • We assume access to an oracle O that when given a block limit L returns the (expected) validation time for blocks of size L.
  • We assume there is some central planner P that, when provided with the block times block_times(r-1), returns a block limit L* for round r (a toy simulation of this setup is sketched below).
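
As referenced above, a minimal simulation of this setup might look as follows; the log-normal choice for D, the linear capacity model, and the identification of validation time with cost are all assumptions made purely for illustration:

```python
import random

N = 1000   # number of participants
T = 10.0   # target round time, in arbitrary time units

def draw_capacity() -> float:
    """Sample a node's processing rate from D (assumed log-normal here)."""
    return random.lognormvariate(0.0, 0.5)

def validation_time(capacity: float, limit: float) -> float:
    """C(i, L): time for a node with this capacity to validate a block of
    size L, assumed linear in L; we treat time directly as cost."""
    return limit / capacity

def oracle(limit: float, capacities: list[float]) -> float:
    """O(L): (empirical) expected validation time for blocks of size L."""
    return sum(validation_time(c, limit) for c in capacities) / len(capacities)

capacities = [draw_capacity() for _ in range(N)]
# block_times(r) for a round whose block hit the limit L = 1000:
block_times = [validation_time(c, 1000.0) for c in capacities]
```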

Proposition: For a block limit L, the number of nodes that drop out after the first block is released can be defined using the following.

  • Indicator functions I[C(i,L) > O(L)].
  • Utility functions U(i,B) = V(i,B) - C(i,L).
  • Agent i drops out if and only if U(i,B) < 0.

It follows that the expected number of nodes that drop out after the first round is equal to N · Pr(I[C(i,L) > V(i,B)] = 1), where Pr(E) denotes the probability of the event E.

Using linearity of expectation, we sum the expectations of the indicator random variables I over all agents i. The expected number of nodes that drop out is exactly the expected number of nodes whose validation cost is higher than the value they derive. We abuse the notation slightly, but aim to capture the following idea: if new blocks have capacity L, then nodes whose value exceeds their cost at blocks of size L (and who would tolerate even larger blocks) will continue to validate, while nodes whose value only exceeds their cost for blocks smaller than L will drop out. This model allows us to capture a per-round, sequential decision-making process by strategic agents, where negative utility indicates non-participation in the following round.
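
Reusing the helpers from the sketch above, the proposition's quantity N · Pr(C(i,L) > V(i,B)) can be estimated by simulation; the value function V below is an assumed placeholder, not part of the original model:

```python
def value(capacity: float) -> float:
    """V(i, B): placeholder valuation; here, higher-capacity nodes are
    assumed to derive more value from participating."""
    return 8.0 * capacity

def expected_dropouts(limit: float, trials: int = 10_000) -> float:
    """Monte Carlo estimate of N * Pr(C(i, L) > V(i, B))."""
    drops = 0
    for _ in range(trials):
        c = draw_capacity()
        if validation_time(c, limit) > value(c):  # cost exceeds value: drop out
            drops += 1
    return N * drops / trials
```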

Now, if we assume nodes are willing to tolerate negative utilities for k rounds, we can use the central planning oracle to adaptively select block limits.

Proposition: Let k be the number of rounds any agent is willing to tolerate negative utilities before dropping out. If we want all N nodes to participate in the protocol for total time TIME as TIME → ∞, then we want to choose a planner P that outputs block limits L(r) solving the following problem.

  • P uses block_times(r) to solve the optimization problem below.
  • Maximize the throughput (the sum of block limits L(r) over all rounds r) subject to the constraint that no node is allowed to have negative utility more than k total times (a greedy heuristic is sketched below).
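
One crude heuristic for such a planner, again reusing the helpers from the sketches above, is to raise the limit while every node still has budget left from its k tolerated bad rounds and back off otherwise. This is a sketch of one possible P, not the exact optimum:

```python
def plan_limits(rounds: int, k: int, capacities: list[float],
                start: float = 1000.0, step: float = 100.0) -> list[float]:
    """Greedy heuristic planner P: grow L(r) for throughput, but shrink it
    whenever another round at this limit would exhaust some node's budget
    of k tolerated rounds of negative utility."""
    violations = [0] * len(capacities)
    limits, limit = [], start
    for _ in range(rounds):
        unhappy = [i for i, c in enumerate(capacities)
                   if validation_time(c, limit) > value(c)]
        if any(violations[i] + 1 >= k for i in unhappy):
            limit = max(step, limit - step)  # someone is out of budget: back off
        else:
            for i in unhappy:                # charge this round's violations
                violations[i] += 1
            limit += step                    # safe to push throughput higher
        limits.append(limit)
    return limits
```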

With a planner P as above, we can optimize the throughput of the chain without pushing any nodes out. We require knowledge of the minimum value k that incentivizes nodes to stick around, as well as untampered reports of block_times(r), so that the planner can optimize its selection of block limits knowing how each change affects all participants.

While this approach is very appealing in that a central planner learns to optimize the network’s parameters, there does not exist any central planner in a cryptocurrency protocol. Similarly, we cannot assume we have access to untampered, truthful block times provided by the participants. Worse, we can’t even implement policies that guarantee that the externalities are limited in their effect on all participants. To that end, we must design these solutions in a decentralized system with strategic and byzantine participants.

With an actively governed protocol — one with on-chain governance — we can start decentralizing the planner P. Using voting mechanisms and information-elicitation mechanisms, we can elicit truthful validation times from participants under the assumption that f nodes are byzantine and the rest are strategic. These tools provide the first steps toward an adaptive governance mechanism that optimizes a network with negative externalities.
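
One standard ingredient for such a mechanism is an aggregate of self-reported times that a byzantine coalition cannot drag outside the honest range; the median is the classic choice when f < N/2. This is a sketch of that single ingredient, not a full elicitation mechanism:

```python
import statistics

def robust_block_time(reports: list[float], f: int) -> float:
    """Aggregate self-reported validation times so that up to f arbitrary
    (byzantine) reports cannot move the result outside the range of the
    honest reports; the median has this property when f < N/2, while a
    mean can be dragged arbitrarily far by a single false report."""
    assert 2 * f < len(reports), "median is only robust when f < N/2"
    return statistics.median(reports)
```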

Future changes

There are many more ways to analyze the dynamics of this market that don't rely on valuations based upon the capacity that nodes are willing to tolerate. In most cryptocurrencies, a block carries an expected reward for mining nodes: the sum of its transaction fees plus the block reward. Therefore, we can define a node's valuation to be non-zero with a certain probability — the fraction of power the node controls — and zero otherwise, depending on whether it is a mining node. The per-round, sequential participation decision process then evolves with respect to the expected utilities of the participants. Similarly, if nodes assume different roles, such as mining versus only validating, their costs change. Thus, it becomes a complex and challenging task to define the entirety of the blockchain externality economy.
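
For instance, under the assumptions that rewards are additive and that a miner's winning probability equals its power share p_i (an illustrative form, not derived in the text), a miner's per-round expected utility looks like:

```latex
% A miner with power share p_i wins the round's block reward R plus
% the block's fees F(L) with probability p_i, but pays the validation
% cost C(i, L) every round regardless of winning.
\mathbb{E}[U_i] = p_i \,\bigl(R + F(L)\bigr) - C(i, L)
```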

Even further, we have made no assumptions about how the economy grows. In traditional microeconomics, if there is demand for greater quantity in a market as well as the possibility of profit, new firms will move in to eat up that surplus. In the blockchain ecosystem, this translates into the arrival of new mining and validation nodes. More subtly, even non-mining full nodes join the network, indicating that certain market participants derive non-zero utility while earning absolutely no money. How should we model the arrival of these participants, and how do we analyze the market mixture with respect to the storage and validation of the underlying blockchain B? What levels of L and what fractions of network power limit the growth of non-mining and mining participants, respectively? We leave these questions for your imagination and interpretation, though I would love to talk with anyone also thinking about them.

Conclusion — Social Planning

In an economy with negative externalities, there is a need for a social planner to optimize the economy's output. Governments intervene in product markets marked by pollution and exploitation of natural resources: they provide subsidies in markets with positive externalities and levy taxes in markets with negative ones to prevent market failures. In a decentralized peer-to-peer network, however, no established government exists. Instead, we are given a dynamic government with sometimes improper governance tools.

What the best tools are for achieving optimal outcomes is an active question in the blockchain space. As many protocols upgrade to tackle scalability and security, it is important that they simultaneously upgrade their governance procedures. At a high level, it is useful for both scalability and security to have a governance process that acts as an optimal social planner. The topic has been studied in a variety of settings, from central social planning to distributed, strategic planning coordinated by a central planner, but doing so in a completely decentralized and byzantine manner is not well defined. We leave discussion of this for next time!

References

[1] Buchanan, James; Wm. Craig Stubblebine (November 1962). “Externality”. Economica. 29 (116): 371–84. doi:10.2307/2551386

[2] Common-pool resource. (2018, June 15). Retrieved from https://en.wikipedia.org/wiki/Common-pool_resource

[3] Rivalry (economics). (2018, November 19). Retrieved from https://en.wikipedia.org/wiki/Rivalry_(economics)
