On the immaturity of tokenized value capture mechanisms

Pursuing value in an age of borderlessness, experimental monetary policies, costless forks and unlimited innovation.

Felipe
Paratii
21 min read · Apr 1, 2018


Photo by Carl Raw on Unsplash

Value capture is a topic that always seemed a bit overlooked in business modelling. Traditionally, as Peter Thiel frequently points out, there’s little correlation between value creation and value capture (e.g. one may generate tons of revenue without really profiting from it). Some industries have even established dynamics that clearly separate value creation from capture: think of the film industry, with production companies doing all the creative and operational work on one side, while, on the other hand, distributors and exhibitors take 80–90% of the share of profit at the end of a movie’s life cycle.

In cryptoland, value capture is even murkier. A post by the Dharma team recently put forth (and got me thinking about) the question of whether “any current token designs will reliably accrue value even if their parent products are successful”. There’s a good degree of consensus around the overvaluation of assets in the market, and plenty of decent discussion around valuation methodologies aiming to frame the madness, but not much rational talk about the root issue:

do cryptoassets really capture the value of something, at all?

1. What is value?

Value is created through work. Work can be mechanical, creative, or anything in between. In “The Origin of Wealth”, Eric Beinhocker scientifically defines the creation of economic value through three key criteria: work must be (thermodynamically) irreversible; it must reduce local entropy (within its ecosystem) while increasing global entropy; and it must produce artefacts or actions that are fit for human purposes (note how blockchain-based work fits perfectly here).

by Adioma

1.1. What is value capture?

P/E ratios of top ~100 global internet stocks (in 2014). Wondering about the lack of uniformity? They all capture value differently!

Extending the definition above, the purpose of a business can be seen as that of creating value (through work), selling or trading it to customers, and capturing some of that value as profit.

Value capture is the link between profits and revenue; essentially the “glue” of P/E ratios.

1.2. Approaches towards value capture

data from “Measuring the Moat” (2013), by Credit Suisse.

HBR has a great article by Professor Stefan Michel called Capture More Value. It lays down a framework for innovating on value capture in 15 distinct forms, from changing the price-setting mechanism (e.g. fixed price vs. auctioning) to changing the part of the product/experience on which the price tag is hung (e.g. coffee packets vs. Nespresso capsules).

Generic patterns can also be extracted by comparing the asset turnover vs. cash flow return on investment (CFROI) of a broad enough range of stocks. Visualisation makes it clear that some businesses capture value through massive scale and thin margins (production advantage), while others rely on large margins (high pricing power towards consumers) and generate return with lower asset turnover.

1.3. Profit pools (value-capture heterogeneity)

Visualising profit pools: the evolution of value capture in the healthcare industry, throughout a decade (source).

A projection exercise that’s relevant to cryptoland is that of dissecting industries into “profit pools” — i.e. comparing the profitability and share of industry revenue across given industry sectors, seeking those that are more effective at value capture.

In the film industry, for example, as it hypothetically starts to decentralise itself into tokenised services, this would mean breaking down a movie’s journey from producer, to distributor, to festivals, to theaters, to subscription catalogues, to home-video, to cable TV, to advertising-VoD… and tracking who gets the larger pieces of the cake.

It can be interesting to take the question further, by conceiving profit pools within sectors: like splitting advertising-VOD down into transcoding, storage, content management, distribution, permission, and billing services, for example, thus estimating the value accretion potentials of video-related tokens aimed towards specific parts of this pipeline (let’s leave that to another post).

2. On the value of financial assets

An asset’s value is the present value of its (expected) future cash flows.

The value of a traditional security-like financial asset is straightforward to model, and usually translates into the famed discounted cash flow equation. Its variables are widely discussed in the financial literature, and, although most of them turn out to be hard to predict in practice, empirical data surfaces enough correlations for investors and analysts to rejoice in their models.
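
For reference, a minimal sketch of that discounted cash flow logic, with purely illustrative cash flows and discount rate:

```python
def discounted_cash_flow(cash_flows, discount_rate):
    """Present value of a stream of expected future cash flows."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# Illustrative only: five years of expected cash flows, discounted at 10% a year.
print(discounted_cash_flow([100, 110, 120, 130, 140], 0.10))  # ~447.7
```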

In the case of security tokens, the model may still apply. In the case of everything else in cryptoland, it doesn’t.

3. What do token valuation frameworks say about value accretion?

There are still few - albeit exceptionally smart - people studying valuation models for utility tokens. A handy summary follows (note that the exercise of valuation encompasses the matter of value capture, but is broader in nature):

  1. Chris Burniske: sets the equation of exchange (MV = PQ) as a cornerstone to evaluate cryptoassets (those which are means of exchange, stores of value and units of account for their protocols), basically proposing we find the GDP of a given cryptoeconomy, divide it by the token’s velocity, and then by the circulating supply (a toy calculation follows below).
  2. TwoBitIdiot: disbelieves most utility tokens out there, but praises some of them as likely to endure — especially proof-of-human-work tokens and TCRs, which are more likely to maintain subjective differentiation.
  3. John Pfeffer: suggests equating the Marginal Cost of running a decentralised network to its Marginal Revenue (MC = MR) in order to find a point of equilibrium (potentially ignores a lot of unknowns, as Johnny Antos points out).
  4. Brendan Bernstein: criticises absurd valuations and thinks about value accretion along two dimensions, also upon the quantity theory of money (see chart below).
Brendan Bernstein’s “Making Sense of Crypto Asset Valuation Insanity”: the quantity theory of money as a framework for reasoning about crypto.
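
To make the equation-of-exchange approach from item 1 concrete, here is a toy calculation with entirely made-up numbers (none of them describe a real network): M = PQ / V, and the per-token utility value is M divided by the circulating supply.

```python
# Hypothetical figures, for illustration only.
PQ = 500_000_000            # annual "GDP" of the cryptoeconomy, in USD
V = 10                      # velocity: how many times the average token turns over per year
circulating_supply = 100_000_000

M = PQ / V                                      # monetary base required to support that economy
utility_value_per_token = M / circulating_supply
print(utility_value_per_token)                  # 0.5 USD per token
```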

The overarching but tacit idea here is that these assets function as an exclusive form of payment in exchange for a network’s underlying scarce resource. However, it might not be appropriate to make such a conservative assumption, especially when it comes to utility tokens.

In the face of the “rise of Stablecoins” as more efficient means of payment, cross-chain interoperability efforts, and even proposals for things like paying for gas on Ethereum with ERC20s, it seems unclear that any cryptoasset will be able to fence off alternative currencies as means of payment within its network. Instead, we should probably be looking at which tokens people (or machines) will choose to hold or to spend.

4. What does the crypto market say about value accretion mechanisms?

Very likely, it doesn’t give a 💩 yet. In the stage we’re at, while speculation-to-utility ratios still lurk above 90% for most tokens, prices on secondary markets are dictated by the attribution of value to a loose range of aspects not directly related to value capture (since little value is being captured at all outside of Bitcoin, Ethereum and a handful of others).

Look at #TRX. It’s an aberration. But it seems to have higher-than-average correlations between reddit active users (or new comments) and token price. Besides, no other publicly measured metric correlates to price as strongly as these (see Github commits x price, on the right, below). Some other studies suggest deep-rooted correlations between cryptocurrencies’ prices and social metrics. That says a lot about what some portions of the market currently attribute value to. A layman is not to blame for trying to find his own proxies to foresee “network effects”.

#TRX’s correlations between price and Reddit New Comments, Reddit Active Users and Github Commits, in recent months, via CoinShilling.

5. How are utility tokens claiming to accrue value?

“Tokens which are uniquely required to incentivize or disincentivize behavior in order to provide a service accrue value relative to that service’s utility”.

The definition above was put forth by Luke Duncan. It’s palatable and useful, even if very theoretical. Three key points: “uniquely” (is the token unique for incentivising the underlying services of the network? How much competition is there for the same ‘mining power’ being tapped?); “incentivise or disincentivise” (how accurately/fully does it capture the value of underlying behaviours?); “utility” (how much real world demand can there be for such underlying services?). Let’s have a closer look at the second one.

There are a handful of attempts at classifying tokens according to their origin / genealogy, and others at stacking them according to their legal status, but few aim to dissect and categorise the value accretion mechanism behind each asset. Kyle Samani’s New Models for Utility Tokens is a piece that stands out here. We incorporate the ideas proposed and present an extended visualisation below:

5.1. Stores of Value

The category certainly includes Bitcoin, and might include Decred, Monero, or other general-purpose tokens with independent, free-floating monetary bases. They should be valued using the equation of exchange, through the prism of the quantity theory of money. Some argue that Ethereum, Dfinity and other smart-contract-platform-native tokens have a concrete chance of becoming useful to the point of emerging as stores of value too.

5.2. Security tokens

Assets that yield passive revenue, to which one can apply variants of the DCF valuation model. Think Sia Tech’s SIAFUND token.

5.3. Utility tokens

5.3.1. WORK TOKENS
Pioneered by Augur, this model presumes distributed contributors must stake the native network token to earn the right to perform work for the network. The chance that a given contributor is awarded a job is proportional to their staked tokens, as a fraction of all tokens staked by competing contributors at the time. If the work is deemed “correct”, it is rewarded with fees; if “incorrect”, the supporting stake can be slashed.

  • Examples: Augur, Numerai, Filecoin, TrueBit, Livepeer.
  • Value accretion: strongly driven by competing demand among contributors/providers. As demand for the underlying service grows, revenue flowing to service providers increases. Under a fixed supply of tokens (or a monetary base that grows more slowly than demand for the service), contributors will rationally pay more, per token, for the right to bite off a larger piece of growing cash flow streams. Kyle Samani’s back-of-the-envelope calculation suggests this model, by design, accrues ~100x more value than an equivalent means-of-payment token (a minimal sketch of the mechanic follows below).
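
A minimal sketch of that mechanic, assuming stake-weighted random selection and a simple slashing rule (names, fees and the slash fraction are hypothetical, not any specific protocol’s parameters):

```python
import random

stakes = {"alice": 600, "bob": 300, "carol": 100}   # tokens staked by competing contributors

def pick_worker(stakes):
    """The chance of being awarded a job is proportional to stake / total stake."""
    providers, weights = zip(*stakes.items())
    return random.choices(providers, weights=weights, k=1)[0]

def settle(stakes, worker, fee, correct, slash_fraction=0.1):
    """Reward 'correct' work with fees; slash part of the supporting stake otherwise."""
    if correct:
        stakes[worker] += fee
    else:
        stakes[worker] -= stakes[worker] * slash_fraction

settle(stakes, pick_worker(stakes), fee=10, correct=True)
```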

5.3.2. TCR TOKENS
Under a generic framework for distributed curation, TCR tokens confer on holders the right to adjudicate over the contents of a registry. Such a registry usually requires a minimum stake for new listings to come through, puts propose-challenge mechanisms in place so that any token holder can curate listed items, and redistributes stakes among the token holders who sided with the winning outcome every time challenges happen or stakes are forfeited.

  • Examples: AdChain, Paratii, Medcredits, Relevant.
  • Value accretion: if (1) a list accurately represents its focal point, and (2) this focal point is of interest to certain audience(s), TCR tokens will accrue value proportionally to the value that listees earn by being in the list. This can be measured differently according to the use-case: in AdChain’s example, one can assess/project the difference in revenue between whitelisted and rejected applying publishers. Besides, whenever there’s loss of value due to entropy (a list loses relevance to its audience), token holders can recapture it by participating in value-relocating propose-challenge games (sketched below).
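
A toy sketch of the propose-challenge flow, with simplified deposits and vote counting (real TCRs split forfeited stakes between the challenger and the majority voters; the names and parameters here are made up):

```python
MIN_DEPOSIT = 100       # minimum stake for a new listing (illustrative)
registry = {}           # listing -> staked deposit

def apply_listing(listing, deposit):
    """Propose a new listing by staking at least the minimum deposit."""
    if deposit >= MIN_DEPOSIT:
        registry[listing] = deposit

def challenge(listing, challenger_deposit, votes_keep, votes_remove):
    """Token holders vote; the losing side's stake is redistributed to the winning side."""
    if votes_remove > votes_keep:
        forfeited = registry.pop(listing)             # listee loses its stake
        return {"to_challenger_side": forfeited}
    return {"to_listee_side": challenger_deposit}     # challenger forfeits its deposit
```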

5.3.3. ACCESS-BASED TOKENS
A model with two tokens, one meant for staking and the other for paying fees, where the former (the primary token) works as a generation machine for fee credits (the secondary token). The amount of secondary tokens minted for every primary token staked varies according to platform usage, usually giving access to a service at a fixed cost - the effect being similar to that of holding a license.

  • Examples: Gnosis, SpankChain, VeChain.
  • Value accretion: end users’ demand for the primary token grows as platform usage increases, since the “credits” holders are entitled to become worth more. Note that one can make a one-time purchase of a sufficient amount of primary (e.g. GNO) tokens and increasingly use the underlying services as demand for them grows (see the sketch below).
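
A minimal sketch of the two-token mechanic, under the simplifying assumption that each staked primary token mints fee credits in proportion to platform usage (illustrative only, not Gnosis’s actual issuance rule):

```python
def credits_minted(primary_staked, usage_factor, base_rate=1.0):
    """Secondary 'fee credit' tokens minted per period for a given primary stake.

    usage_factor grows with platform usage, so the same one-time purchase of
    primary tokens yields more credits as the platform gets busier.
    """
    return primary_staked * base_rate * usage_factor

print(credits_minted(100, usage_factor=1.0))   # 100 credits in a quiet period
print(credits_minted(100, usage_factor=2.5))   # 250 credits as usage grows
```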

5.3.4. PROOF-OF-BURN TOKENS
Tokens function as a means of payment, but instead of circulating in the form of fees, they are burned, with each “payment” referencing the service provider it refers to; providers are then entitled to a portion of fixed block rewards, according to the amount of “burns” referencing them in that block. The cleverness of the design lies in the fact that the cost of using the protocol can remain fixed and “miner-independent” for end-users (in Factom, US$0.001 per entry), while value capture happens heterogeneously across service providers who worked less or more.

  • Examples: Factom uses the scheme described above (within its own double-token model); some other tokens also incorporate “proof-of-burn” as one of their issuance mechanisms, like Blockstack’s Stacks or Counterparty’s XCP.
  • Value accretion: if platform usage is burning more tokens than the rate of issuance, supply decreases and pushes prices up, and vice-versa. In the long run, there should be a linear relationship between token value and platform usage (a toy example follows below).
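
A toy sketch of that accounting: users burn tokens tagged with the provider that served them, and a fixed block reward is split pro rata to burns (the reward size and names are hypothetical, not Factom’s parameters):

```python
from collections import Counter

BLOCK_REWARD = 73    # new tokens issued per block (hypothetical)
burns = Counter()    # provider -> tokens burned referencing them in this block

def pay_for_entry(provider, amount):
    """A user 'pays' by burning tokens, referencing the provider that served them."""
    burns[provider] += amount

def distribute_block_reward():
    total_burned = sum(burns.values())
    return {p: BLOCK_REWARD * b / total_burned for p, b in burns.items()}

pay_for_entry("provider_a", 6)
pay_for_entry("provider_b", 2)
print(distribute_block_reward())   # provider_a earns 3/4 of the reward, provider_b 1/4
```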

5.3.5. “BUYBACK” TOKENS
Tokens that, beyond a core functionality, are meant to be bought back by their issuing entity and (usually) then burned, making “buybacks” a more effective alternative to dividend distribution: value is not assigned directly, but rather indirectly, by reducing supply and increasing every existing token holder’s ownership share of the network.

  • Examples: ICN, Refind.
  • Value accretion: happens through direct reduction of the outstanding supply, thus increasing individual ownership of the network (the arithmetic is sketched below). Value creation is somewhat presumed when a token is set to be bought back by its issuing entity (Iconomi had seen staggering growth of reserves when it announced buybacks; Refind plans to start them after reaching certain revenue milestones).
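
The supply effect is easy to see with made-up numbers: after a buyback-and-burn, each remaining token represents a larger share of the network.

```python
supply = 100_000_000                               # outstanding tokens (illustrative)
my_tokens = 1_000_000

ownership_before = my_tokens / supply              # 1.0%
supply_after_burn = supply - 5_000_000             # issuer buys back and burns 5M tokens
ownership_after = my_tokens / supply_after_burn    # ~1.05%

print(ownership_before, ownership_after)
```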

5.3.6. PERPETUAL DISCOUNT TOKENS
Tokens that entitle holders to a discount mathematically equivalent to a revenue share, but that only comes into effect when they use the network’s underlying services. This is a royalty model that grants no right to a fixed stream of cash flows, but rather the right to consume a proportion of the total services offered at any given point in time.

  • Examples: TAY, Sweetbridge.
  • Value accretion: the size of the discount that each token realises for its owner is designed to grow in step with the overall usage of the network. But most of this value accretion is only captured by individual token holders when they exercise their discounts - passive holders are, by definition, under-utilising the tokens’ potential, able to capture only their resale value, not their discount value (a sketch follows below).
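
A minimal sketch of the royalty-like entitlement, assuming each token gives its holder the right to consume a fixed proportion of total network services at a discount (all figures are illustrative):

```python
def discount_value(tokens_held, total_supply, service_volume, discount_rate):
    """Value a holder realises per period by actually exercising the discount.

    The entitlement scales with network usage (service_volume), but passive
    holders who never consume services realise none of it.
    """
    consumable_share = tokens_held / total_supply * service_volume
    return consumable_share * discount_rate

print(discount_value(1_000, 1_000_000, service_volume=10_000_000, discount_rate=0.2))
# 2000.0 per period, growing in step with service_volume
```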

5.3.7. ONE-TIME DISCOUNT TOKENS
Probably the precursor of the perpetual discount token model; there don’t seem to be many examples around lately. It’s the exact translation of the “paid API key” model, where a token entitles its holder to a one-time right to a service. It’s fading away because it accrues far less value than its perpetual counterpart.

5.3.8. TRIPLE-TOKEN STABLE SYSTEMS
The only example I’ve seen functioning to date is Steem. The basic idea of the model is to isolate two forms of capital commitment towards the network: debt and ownership. Each is then assigned its own cryptoasset: one that earns profits if the community grows but loses them when it shrinks, the other guaranteeing some interest but not participating in any profits. One can be converted into the other according to vesting mechanics, and, ambitiously, a third stable-price asset complements the system as a means of payment.

  • Examples: STEEM, SP (Steem Power), SBD (Steem Dollars). The unit of account (STEEM) is the point of entry to the system and, besides being traded, it can be locked in two contracts. The first turns it into a long-term commitment (SP), giving one the right to capture a share of network rewards and to weigh the staked amount in curation activities (contributing to the protocol). The second turns the locked amount into a “stable” currency pegged to a fixed dollar price (SBD), redeemable in the system’s original unit of account (STEEM) at any time, and capable of yielding an interest rate.
  • Value accretion: The scheme has been feverishly compared to Ponzis, but I believe that to be partly the fruit of misunderstanding. Holding the original unit of account has a daily dilution cost, so rational users should either switch to long-term commitments or opt for exposure to a more predictable interest rate. Both measures put velocity sinks in place for the primary token, pushing its price up, as per the equation of exchange. But wait: where does the value being distributed as network rewards (part of the incentive to lock primary tokens) come from? That’s where most critiques focus, many assuming the value redistributed comes from new entrants to the market - if growth halts abruptly, there might be no source to pay debts from. If you look closely, this resembles an access-based token model that incorporates a half-baked stablecoin. Worth noting, the solidification of the latter as a standalone category may make this scheme obsolete.

5.3.9. STABLECOINS
Stablecoins have long been hailed as the holy grail of cryptocurrencies. There are, broadly, three approaches to achieving stability: issuing real-world-asset-pegged tokens (e.g. Tether); issuing crypto-collateralised tokens (e.g. Maker); and algorithmically expanding or contracting the supply of a token according to its usage, an approach usually referred to as seigniorage shares (e.g. Basecoin, or Carbon).

  • Examples: above.
  • Value accretion: in its simplest form, happens as the network grows and, with it, the demand for the stablecoin. Under a fixed supply, prices increase; the system issues new coins to push the price back down to its target, and these newly issued coins go to holders of the network’s share-like token (most stablecoins follow a dual-token model). These tokens hence represent future claims on new stablecoins to be issued if the network’s usage and demand increase. The inverse is where most of the quirks and flavours of each approach come into play: augmenting the outstanding supply is easy (print money), but reducing it is more complex (see more here). A toy sketch of the expansion step follows below.
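
A toy sketch of the expansion step of a seigniorage-shares design (heavily simplified; the contraction side, bonds and oracle design are where real systems differ and struggle):

```python
TARGET_PRICE = 1.00

def rebase(stablecoin_supply, market_price, share_holders):
    """If price is above target, mint new stablecoins and hand them to share holders."""
    if market_price > TARGET_PRICE:
        # supply that would bring the price back to target, all else being equal
        new_supply = stablecoin_supply * market_price / TARGET_PRICE
        seigniorage = new_supply - stablecoin_supply
        per_share = seigniorage / len(share_holders)
        return new_supply, {holder: per_share for holder in share_holders}
    # contraction (price below target) is the hard part and is omitted here
    return stablecoin_supply, {}

print(rebase(1_000_000, 1.05, ["alice", "bob"]))   # mints 50,000 new coins, 25,000 to each
```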

5.3.10. MEANS OF PAYMENT TOKENS
In the earlier days of crypto-frenzy, it was easier to spot tokens designed exclusively to serve as a proprietary means of payment for a given dApp or service. This is diminishing as the advantages of using BTC, ETH or a stablecoin for this purpose consolidate, and as the effects of velocity on cryptocurrencies become more widely understood.

It’s important to note these are by no means exclusive categories, but rather patterns that may overlap. For instance, Ocean is clearly a work token, but with TCR-like games embedded in some of its inner mechanisms. Generically, these patterns’ value accretion strategies can be visualised according to their tokens’ key monetary consequences of increased usage (Y axis, below) and the source of the value they accrue (X axis, below):

7. How to program value accretion in the long-run?

This is obviously still an unanswered, and barely comprehended, question.

“The most important thing to me is figuring out how big a moat there is around the business. What I love, of course, is a big castle and a big moat with piranhas and crocodiles.” — Warren Buffett

We know this is not how crypto investment theses are going to sound, right, Mr. Buffett? What should we be looking for, if not piranhas and crocs?

7.1 The governance hypothesis

Chris Burniske and Luke Duncan’s representations for governance as the “non-commoditisable” good of the moment.

In 2007, USV pointed to governance as a basis for differentiation as a next big thing, citing Craigslist’s lightweight community-driven coordination system as an example. Fred Ehrsam corroborates the vision when stating that “the value isn’t in the chain of data, it’s in the community and social consensus around a chain. Governance is what keeps communities together and, in turn, gives a token value.”

Both he and Luke Duncan have previously evoked Albert Hirschman’s theory of Exit, Voice, and Loyalty to stipulate that, to maintain network effects (and minimise “Exit”), one must maximise the “Voice” of network participants, through effective governance.

Governance as a Service (GaaS), thus, is a “tokenisation model where the primary utility of a token is governance over the protocol, application, or network it [belongs to], and its value […] accrues relative to the importance of the decisions being made”. The bottom line is that if a token can easily be used to weigh in on important (economically or politically) decisions, there are increasingly important decisions on the table, and accumulating weight is costly (think quadratic voting), it is likely the token will accrue the value of the decisions it governs over.

7.2 Consider forks (protect, foresee or foster… just don’t ignore)

by Teemu Paivinen, on Thin Protocols

Marginal benefits of scale are traditionally resilient. In the case of blockchains, they not only tend to lose importance in the face of interoperability, but may also turn into liabilities, as the growth of a protocol creates incentives for forks and exposes the network to attacks.

Forks are believed to accelerate evolution, just like mutation does for biological systems. This is not inherently bad for end-users, although it can be destructive for investors.

From this perspective, the “fatness” of a protocol may not help its native token accrue value, but rather indicate excessive generality (lack of specialisation), inefficiency or other hidden disadvantages.

It’s worth noting there’s always friction in switching from a network to its fork, whether you’re a miner or a user. Remember the old saying that “technology is duplicable, but community is not”. And stories like that of Google, which tried to have its own social network at least three times and always failed. Forking may be much more trivial and threatening in the blockchain context, but tokens backed by strong, effective governance may organically protect themselves from, or even coexist with, clones of themselves.

How? Well, Monero changes its PoW algorithm periodically to prevent any miner from becoming economically oppressive and capable of conducting an attack (it’s still subject to forks, though, like Monero V). Binance, the world’s hottest startup, is developing a decentralised version of its own exchange — namely, a “business fork” (in practice, a new blockchain) that will transparently have lower speeds and higher fees, and will be non-custodial. On the more extreme side of things, some people are even pondering ideas for “financially unforkable” coins.

7.3 Competitiveness in mining may be more important than user stickiness

Blockchains take multi-sided markets and abstract them into open fields for competition. Just like miners compete for some of the value accrued by a given network (by providing resources and trying to capture some of the network’s value through block rewards), multiple protocols are simultaneously competing for mining power, in all its shapes and flavours.

John Pfeffer helps us frame the counter-intuition here:

When thinking about whether a protocol’s token can capture and sustain economic ‘rent’, what is relevant is whether the mining industry maintaining the protocol’s blockchain is competitive, not the stickiness of users. It is the economic competition amongst miners that will ultimately drive the cost of using the protocol and therefore the value of the token.

7.4 Capturing value <-> Distributing value

As stated previously, the purpose of a business can be seen as that of creating value (through work), selling or trading it to customers, and capturing some of that value as profit. By analogy, one can state that the purpose of a decentralised tokenised network is to foster the creation of value (through distributed work), let it be sold or traded, and allocate the value that would otherwise be captured by a company as profit (via issuance, rewards, penalties and so on).

Evoking John Pfeffer once more:

“the network value of a tokenised version of a dematerialised network business (a social network, Uber, AirBnB, a betting exchange, etc.) will by construction be a small fraction of the enterprise value of its centralised, joint-stock-company equivalent. Holding the number of users constant, you basically take the fully-loaded IT budget (including energy and a capital charge) of those companies (representing PQ) and divide by some (likely high) velocity V. The disruption of traditional networked businesses by decentralised protocol challengers will represent an enormous transfer of utility to users and an enormous destruction of market value”.
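
To make the arithmetic in that quote concrete, here is a sketch with made-up figures (these are not Pfeffer’s numbers): take a hypothetical fully-loaded IT budget as PQ and divide by a high velocity V.

```python
# Entirely illustrative figures.
it_budget = 2_000_000_000   # annual fully-loaded IT cost of the centralised incumbent (PQ), in USD
velocity = 20               # tokens turn over many times per year if held only to pay fees

network_value = it_budget / velocity
print(network_value)        # 100,000,000 -- a small fraction of the incumbent's enterprise value
```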

This can either mean that investing in cryptoassets is set to become less profitable in the long run, or that it will undergo some reshaping, with investors becoming more active in seeking to recapture some of all this value being “lost”.

8. The death of value capture (or “the Red Queen effect”)

Bad news for some.

Astute readers may notice that early segments of this text refer to “value capture”, and that “value accretion” begins to be used further on. The reason should be evident by now: traditional businesses create value, capture some of it and redistribute profits; tokenised networks, in turn, go directly from value capture to value distribution. Tokens do accrue value, but those who capture it (become its de facto owners) are token holders themselves, non-uniformly. And the mechanisms for accruing value won’t be iterated on by a marketing team laser-focused on testing pricing strategies, but rather encoded in protocols. The simpler they are, the higher the chance of them functioning, today. The more complex they are, the more important the role of governance becomes in making the multiple parts fit and evolve well together.

Let’s turn back to the concept of the Competitive Advantage Period, formalised by Michael Mauboussin and Paul Johnson in 1997. “CAP is when and why competitive advantage fades away […]. Also known as Fade Rate, [it is used] to understand how long an advantage will be relevant, and what kind of value the company will be able to extract during its decline”.

(source)

In the figure, the Y axis is the return spread and the X axis is time. Competitive forces drive returns down, closer to the cost of capital (return spread = return on invested capital less the cost of capital). The shaded area is what’s interesting to us, since it shows how much value the company can capture before its advantage erodes, and serves as a basis for cash flow multiples and other measures of rate of return.
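
As a rough sketch of what that shaded area represents, one can sum the discounted return spread over the CAP; the fade rate and all other inputs below are illustrative assumptions.

```python
def cap_value(invested_capital, roic, cost_of_capital, cap_years, fade_per_year=0.0):
    """Value captured while the advantage lasts: discounted excess returns over the CAP."""
    value, spread = 0.0, roic - cost_of_capital
    for t in range(1, cap_years + 1):
        value += invested_capital * spread / (1 + cost_of_capital) ** t
        spread = max(spread - fade_per_year, 0.0)   # competitive forces erode the spread
    return value

print(cap_value(1_000, roic=0.15, cost_of_capital=0.08, cap_years=10, fade_per_year=0.01))
```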

The CAP for the US stock market, in the 2000s, was estimated to be between 10 and 15 years, with individual companies ranging anywhere from 2 to over 20 years. The CAP for blockchains will always be pushed towards zero.

In resonance with Fred Ehrsam’s Darwinian line of thinking, the authors of the paper cited above say that

Competitive Advantage is rare and short-lived in the biological world as well. Species are locked in a never-ending coevolutionary arms race with each other.

[…] Biologists refer to such coevolutionary spirals as Red Queen Races, named after the Red Queen in Lewis Carroll’s Through the Looking Glass. It was she who said, “In this place it takes all the running you can do, to keep in the same place.”

There is no such thing as “winning” in a Red Queen race, just as in cryptoland (Bitcoin aside, for some); the best you can ever do is run faster than the competition. To succeed is to survive. No differentiation is permanent, unforkable or unmatchable.

There are no more moats and piranhas to hide behind. Value capture is not going to show up in the X-ray of a product’s life cycle, but rather be modelled at the protocol level. Investors should pursue assets that accrue the most value, and learn how to capture this value themselves. Or, on a deeper level, understand which agents are going to capture this value in new market configurations, and invest directly in them (think Radar Relay, a non-custodial 0x-based exchange with less than a year of activity that has recently surpassed a million dollars in daily trading volume).

Just as CPU mining has evolved into more specific strands and eventually even spawned the practice of spec-mining, token investing may be evolving towards active token investing. The Decred bagholder who doesn’t know how to stake is being diluted by inflation day after day. The Livepeer fan who is not delegating towards reputable transcoding nodes is missing out on higher returns block after block. There are even virtual mining pools for stake-based protocols already being formed — definitely a specialised form of investment!

Analysing value accretion mechanisms is an exercise to be done before injecting capital into a new token, but also one to be revisited consistently. Open-source networks can’t hold CAPs long enough to sustain differentiation or value capture in the form of economic rent. But individuals who learn to operate within these networks early on can still capture plenty of value for themselves. Just don’t expect it to be easy, predictable or passive, like joining a cap table once and waiting for something to happen.

9. Go deeper

Below is a list of resources researched during the writing of this text. As you should expect, >90% of the ideas here are not originally mine. I just tried to capture their value 😊

Paratii is an open source R&D lab that develops peer-to-peer technologies for video distribution.
