Alice in Wonderland — The Valuation of Crypto Tokens

Gary Aitchison
32 min read · Jun 5, 2018


“Why, sometimes I’ve believed as many as six impossible things before breakfast.”
― Lewis Carroll, Alice in Wonderland

The market for bitcoin or altcoin tokens has exploded over the last year, both in terms of the number of products and in the total amount of money invested in them. The price trajectory graphs all have that ‘bubbly’ look to them — an exponential rise, followed by a crash, and then a random walk of mini bubbles and mini crashes.

Bitcoin Price in USD — source: Charts Bitcoin.com

The market capitalization for tokens has also exploded as shown in the picture below:

Source: Coinmarketcap.com

However, let's not get carried away. This is still a drop in the ocean:

Relative Market Size of Monetary Instruments (updated to 2017). Source: MarketWatch.com

This influx of speculative investment has led to a plethora of articles and discussions about a 'real valuation' methodology for crypto tokens, one that is divorced from the "bigger fool" predicate that underpins the crowd behavior of bubbles. {For an interesting economic view see Steiglitz & Shapiro 1998: 'Simulating the Madness of Crowds: Price Bubbles in an Auction-Mediated Robot Market'}.

In this article I will explore the main valuation theses, and comment on their logical consistency and theoretical underpinning, in an attempt to understand two things:

  1. Is there a logically consistent methodology that could suggest a rational 'floor price' that should prevail after the bubble froth has subsided? and
  2. What belief systems and theories {no matter how misguided or lacking in verification} are driving rational investors and hedge funds to invest significant amounts of money in crypto tokens {over and above speculative and/or manipulative market makers}?

Types of Tokens

Not all crypto tokens are the same, and before we jump into valuation methodologies it is important to first create some distinctions. In this I am indebted to some excellent work done by Thomas Euler (and colleagues) in January 2018, who created a complete classification structure: “The Token Classification Framework: A multi-dimensional tool for understanding and classifying crypto tokens”.

Their very useful 5-dimensional breakdown is shown in the graphic below (a high resolution version is available from the link above):

5-Dimensional Classification of Tokens. Source Thomas Euler, Untitled Inc.

By considering the various patterns amongst existing tokens, Euler et al. derived four basic archetypes, which we will use in this article: tokenized assets, tokens-as-shares, tokenized platforms, and crypto currency tokens.

I will deal with these in increasing order of difficulty, in terms of valuation methodologies.

With a tokenized asset, the token represents a claim against an underlying asset, such as gold or US Dollars. Typically some repository exists which has holdings in the underlying asset, and the tokens represent claims against this repository. In this way ownership of the underlying asset can be easily traded by trading tokens. For people familiar with the history of financial instruments, this is familiar territory: repositories were created for all sorts of instruments (including gold and bearer bonds) and trading clubs executed contracts (tokens) against them, creating a high velocity trading market in which the asset itself never moves.

At face value, the valuation of a crypto token that represents an underlying asset should be easy — it is the value of the asset. However the token is not the same as the asset due to various forms of risk, including settlement risk. This can be seen in two images below. The first is for Tether, a token that represents USD:

Tether Exchange Rate: Source: Coinbase

The second is for Digix Gold (DGX), a token representing 1 gram of gold. The graph below shows the recent history of DGX against a troy ounce (XAU) where the exact conversion should be 0.0321507:

Price of DGX against Troy Ounce: Source: Coingecko.com

Some crypto tokens take asset backing to a more general structure; the best example is the Ripple IOU token. {These should not be confused with Ripple XRP}. A Ripple IOU token represents a redemption contract with the issuer of the IOU, who promises to redeem on demand whatever asset (or set of assets) is defined in the IOU. In this case the recipient must assess the redemption risk and trustworthiness of the issuer, and modify their view of the asset value accordingly. This has echoes of the old 'bearer bonds' in financial markets, but it can also be seen as the mechanics behind ordinary bank accounts.

There are two take home messages from this discussion and the graphs above:

  1. Asset backed tokens have increased volatility risk and potentially have increased settlement risks, which can extend both the systemic and the systematic risk of their underlying assets (for a variety of reasons); and
  2. There may be small premiums associated with the token due to additional services (such as ease of liquidity or exchange). For example, the DGX token trades at around a 2.5% premium to gold (see the sketch below).
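
As a simple illustration of how such a premium can be read off market data, here is a minimal sketch; the prices used are illustrative placeholders, not live quotes:

    # Implied premium of an asset-backed token (e.g. DGX, 1 token = 1 gram of gold)
    # over its reference asset. Prices below are placeholders, not market data.
    GRAMS_PER_TROY_OUNCE = 31.1035          # so 1 gram = 0.0321507 troy ounces

    xau_usd_per_ounce = 1300.00             # assumed spot gold price, USD per troy ounce
    dgx_usd_per_token = 42.80               # assumed DGX price, USD per token

    fair_value_per_gram = xau_usd_per_ounce / GRAMS_PER_TROY_OUNCE
    premium = dgx_usd_per_token / fair_value_per_gram - 1.0
    print(f"implied premium over gold: {premium:.1%}")    # roughly 2.4% with these numbers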

What is not so apparent is that additional systemic risk associated with new systems, especially decentralized systems, can be significant — despite the hype of the blockchain as being super secure. These can be the result of fraud, malicious attacks, or some combination thereof, and they can cascade within the systems so that they are difficult to diversify or hedge against.

It is worth noting that distributed crypto token systems are vulnerable to 51% attacks (see this recent article regarding a spate of such attacks; this article briefly explains what is involved). Such attacks create a systemic risk that a significant portion of transactions could be in error for a period of time.

The standard way to value an asset is via the Capital Asset Pricing Model (CAPM), which relates the expected return on an asset to its risk. The CAPM model (in all its variants) captures the idea that the value of an asset is not just its future expected returns, but how those returns, and the risks associated with them, interact with other elements of a portfolio of assets. Implicitly the model reflects the fact that asset prices are determined by supply and demand across a portfolio, and not by any one specific and isolated evaluation of the asset itself.
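
For reference, the textbook CAPM relationship (standard finance, nothing crypto-specific) is:

E[Ri] = Rf + Beta_i * (E[Rm] - Rf), where Beta_i = Cov(Ri, Rm) / Var(Rm)

That is, an asset earns the risk-free rate plus a premium proportional to how its returns co-move with the market portfolio, not to its standalone volatility.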

In the case of our asset backed tokens, we can abstract from the core asset pricing problems as that has been implicitly solved already, and simply focus on the variances introduced through additional risks and additional transactional benefits of the token system.

{For a relatively recent and thorough analysis of risk in asset pricing, see “Systemic Risk; 2013”}

The ‘King-of-Hearts’ Valuation Metric

“If there is no meaning in it,” said the King, “that saves a world of trouble, you know, as we needn’t try to find any. And yet I don’t know.”
― Lewis Carroll, Alice in Wonderland

For simplicity I propose a valuation metric that I call, tongue in cheek, the "King-of-Hearts" metric, as in essence it follows the advice of the King in Alice in Wonderland: we define away the valuation problem by loosely saying that of course the token's value is defined by the value of the reference asset, and voila, the problem of valuing the reference asset itself is left out of scope.

King-of-Hearts Metric

In the King-of-Hearts metric the value of the token closely tracks the price and value of the underlying asset. However it is modified by three factors:

The Alpha Utility Factor represents a marginal benefit over and above the reference asset, based on the operation of the token network. For example, the token network may offer more rapid transfers, more liquid transfers (due to divisibility), or more anonymous transfers, and so on. Whatever the cause, the market may rationally assess that the token deserves a premium as a consequence.

The Beta Volatility Factor represents a negative view of the increased volatility of the token versus the asset group. This can be measured historically as proportional to a standard deviation measure, but can be modified by considering the covariance of the volatility versus a portfolio of other assets.

The Gamma Risk Factor represents a negative view of the increased systemic risk associated with the network, in terms of settlement risk (the possibility that the asset backing may not be real or complete) or fraud risk (the possibility that preceding transactions are not true transactions).

King-of-Hearts provides a framework for analyzing historical data to quantify alpha, beta, and gamma, and it also provides a framework for discussing the price relationship.
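
One minimal way to write the metric down, assuming (my own simplification) that the three factors enter as simple additive adjustments to the reference asset value, is sketched below; the factor values are illustrative placeholders, not estimates:

    # A minimal sketch of the King-of-Hearts metric, assuming the three factors
    # enter as additive adjustments to the reference asset value.
    def king_of_hearts_value(asset_value: float,
                             alpha: float,   # utility premium (faster, more liquid, more private transfer)
                             beta: float,    # discount for extra volatility versus the asset
                             gamma: float    # discount for settlement / fraud / systemic risk
                             ) -> float:
        return asset_value * (1.0 + alpha - beta - gamma)

    # Example: a USD-pegged token with a small utility premium but material backing risk
    print(king_of_hearts_value(1.00, alpha=0.01, beta=0.005, gamma=0.03))   # -> 0.975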

An interesting test case for the framework is the Tether token. As shown in the graph above, Tether is tied to the USD and, despite some volatility, it trades at approximate equivalence.

Yet there is significant concern over its King-of-Hearts Gamma: this cannot be zero, and on any rational view it could be significant. There is concern in the market that the touted one-for-one backing may not be real; see “Why experts are worried about Tether, a dollar-pegged cryptocurrency”.

In addition, the King-of-Hearts Beta is non-zero. As the graph above shows, the standard deviation over the previous 12 months is non-zero, though as reported by DigitalAssetDb it is very low (<1%).

To trade at an exact USD equivalence, as has been occurring recently, the benefits of Tether in terms of its King-of-Hearts Alpha must outweigh these negatives. Tether purports to provide more rapid transfers and increased liquidity, and these positives must be valued by the market enough to outweigh the negative margins of its Beta and Gamma. But as the King says, "and yet I don't know", and perhaps the market is ill-informed and convinced by the 'equivalence' rhetoric of Tether marketing. Whatever the case, the above analysis suggests that something very strange is happening in the Tether market. Unless it is being swamped by new issues that force the 1:1 ratio, there is no logical way the graph should look like that. Caveat emptor, as they say.

Token-as-a-Share is the next archetype to consider; in a legal sense these are broadly 'security tokens'. This has important legal ramifications, because such tokens then fall under the purview of the relevant securities regulator, which in the USA is the SEC.

The key definition in the USA for whether a token is a security is the Howey Test. The crucial aspect of the test is whether or not cryptocurrency investors who purchase tokens are participating in a speculative common enterprise, and if so, whether the profits those investors hope for depend entirely upon the work of a third party.

Any form of equity token, where the tokens represent participation in the future returns of the entity as a consequence of the investment, is usually classified as a security.

This includes tokens that act as shares, or shadow shares, or units in a trust or partnership — anything that provides access to a revenue stream as a right, whether or not such a right carries other privileges (such as voting or control).

An example:

Lykke issued tokens that gave the token holders non-voting equity rights in a portion of the shareholding of a Swiss Corporation that has launched an exchange for trading tokens that represent securities. The holders of the Lykke token receive rights to the future dividend stream from the operation of the platform.

Therefore Token-as-a-Share tokens are valued in the same way that a venture investor would value a startup or mezzanine investment, or the way a crowd-funded equity issue would be evaluated by its investors.

Often these types of tokens are mixed with other types, with complex related rights.

DigixDAO is an example of an interlinking between two tokens. The DGD tokens provide a security right via an equity style token in the Digix DAO organization, whilst the DGX tokens are asset backed (gold) tokens issued by the DAO.

The valuation of a Token-as-a-Share requires the valuation of the future value and profit/dividend stream from the underlying organization or entity. This can be derived in a bewildering array of methods, which all attempt to reach the same end — a discounted view of the likely exit (or mature revenue stream) at the end of the path. In keeping with the theme, I have coined this the:

Cheshire-Cat Valuation Metric

“Would you tell me, please, which way I ought to go from here?’
‘That depends a good deal on where you want to get to,’ said the Cat.
‘I don’t much care where -’ said Alice.
‘Then it doesn’t matter which way you go,’ said the Cat.
‘- so long as I get SOMEWHERE,’ Alice added as an explanation.
‘Oh, you’re sure to do that,’ said the Cat, ‘if you only walk long enough.”
― Lewis Carroll, Alice in Wonderland

Cheshire-Cat Valuation Method for Token-as-a-Share

Although it appears somewhat ambiguous, the methods for valuing equity tokens or token-as-a-share systems tend to fall within certain bounds and conform to simple rules. These are plain common sense, but in the fanciful wonderland of crypto they are worth restating.

  1. The scale of returns is relatively well understood by practitioners, at least within orders of magnitude, and so the more tokens that are issued the less the value of each token for a given issue (that is, there is no magical exponential graph that can appear from unspecified externalities).
  2. The trajectory for returns looks like the trajectories seen in stock market floats or in VC portfolios. There are a lot of starters and many fail; the ones that succeed tend to go on to dominate a category with a few smaller competitors; in a positive market there is an initial overshoot of expectations followed by a rapid return to earth; and values tend to over-react to both good and bad news.

Classic Market Curve for Startup Trajectory

For a good overview (though there are literally thousands of articles on the wizardry of early startup valuation processes) see “Valuation For Startups — 9 Methods Explained”, by Stephane Nasser, June 2016
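
For concreteness, a bare-bones version of this kind of discounted valuation, with the blunt probability-of-failure haircut that venture investors apply, might look like the sketch below; all figures are placeholders, not forecasts:

    # A minimal Cheshire-Cat style sketch: discount an assumed future cash-flow
    # stream to the present and spread it over the token supply.
    def token_as_share_value(yearly_cash_flows,      # expected cash flows to token holders, by year
                             discount_rate: float,   # required rate of return (venture-style, high)
                             p_survival: float,      # blunt probability the venture reaches these flows
                             tokens_outstanding: float) -> float:
        npv = sum(cf / (1.0 + discount_rate) ** (t + 1)
                  for t, cf in enumerate(yearly_cash_flows))
        return p_survival * npv / tokens_outstanding

    # Example: five years of dividends, 40% discount rate, 20% survival, 10M tokens
    print(token_as_share_value([0, 0, 1e6, 3e6, 8e6], 0.40, 0.20, 10_000_000))  # ~0.05 per token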

Tokenized Platforms are the largest new group in the crypto token world, and their tokens are often referred to as “utility tokens”. In a tokenized platform the purchaser of the token is granted certain rights to participate in the operation of the platform or to consume the services that the platform offers, or to provide services (such as work) to the platform.

There are many ways to do this, but the general idea is that users buy tokens and then pay these tokens to other participants in the platform who perform some form of service. Sometimes the platform owners take a portion of the fee, or they burn the tokens (send them to a dead address).

In some cases the network requires service providers to stake a certain amount of tokens in order to provide a service. This is usually called a work token system, but the term is used confusingly.

In the work token model, the service provider stake is essentially a bond which is returned if work conforms to certain standards or rules, and the work in the network can be allocated on the basis of how much has been staked.

Examples include:

  • Filecoin (distributed file storage),
  • Livepeer (distributed video encoding),
  • Truebit (off-chain verifiable computation), and
  • Ethereum (smart contracts) {supposedly moving to proof-of-stake in the near future}

One additional variant is called the “burn and mint” system or BME. In the BME model, unlike the work token model, tokens are a proprietary payment currency. But unlike traditional proprietary payment currencies, users who want to use a service do not directly pay a counterparty to use the service. Rather, users burn tokens in the name of the service provider. Factom is an example of this system.
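
A common back-of-envelope way to reason about a BME system (a simplification, not a claim about Factom's actual economics) is that supply is stable when the value burned per period equals the value minted per period, which pins down an implied token price:

    # Back-of-envelope burn-and-mint equilibrium: tokens burned per period
    # (usd_fees / price) equal tokens minted per period, so price = fees / mint.
    # Inputs are illustrative placeholders.
    def bme_equilibrium_price(usd_fees_per_month: float,
                              tokens_minted_per_month: float) -> float:
        return usd_fees_per_month / tokens_minted_per_month

    print(bme_equilibrium_price(usd_fees_per_month=730_000,
                                tokens_minted_per_month=73_000))   # -> 10.0

Note that in this simplification the implied price is driven entirely by the exogenous fiat demand for the service, a point that will recur below.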

The obvious way to value tokenized platforms is via NPV calculations derived from the expected future revenue streams, either for service providers if the platform passes through all fees, or for some combination of platform and service providers if the fees are split.

However we are in Wonderland with crypto, and so many platforms are analyzed in terms of either some variant of the ‘value of money’ idea or some variant of the ‘network effect’ idea. I will discuss both in some detail as they are illuminating.

The Mad-Hatter-Fallacy

“If I had a world of my own, everything would be nonsense. Nothing would be what it is, because everything would be what it isn’t. And contrary wise, what is, it wouldn’t be. And what it wouldn’t be, it would. You see?”
― Lewis Carroll, Alice’s Adventures in Wonderland & Through the Looking-Glass

The fallacy I am about to enumerate is so commonplace in the writings on crypto economics that it is worth its own name, the Mad-Hatter-Fallacy. To explain it clearly I will posit a simple tokenized platform called the “Mad Hatter’s Tea Party”. The arguments are subtle but far reaching, so bear with me!

The party consists of the following elements (and I am mimicking almost exactly the calculations done by Chris Burniske in his book and paper on the subject):

  1. A set of tables, and around each table there are seats. Participants can initially obtain plastic chips from a dispenser on the table in exchange for dollars (fiat currency). This is our tokenized platform ecosystem, with plastic chips (crypto tokens), tables (tokenized platform) and initial plastic chips (ICO).
  2. A set of 10 chairs is at each table, with participants to the tea party sitting in each chair. Each participant has a name (crypto address) and everyone has a set of napkins with one name on each napkin corresponding to all participants around each table (public keys and message envelope).
  3. Some of the participants are tea creators, and produce a very special tea from their tea machines, which can be obtained by presenting plastic chips for a cup of tea (a tokenized service). We know that in the world “outside” there is a market value for this amazing tea of $1k per cup, but we also know that each tea creator has costs that vary between $100 and $800 per cup. {note carefully the externality in the market}
  4. Each participant can be a tea consumer, and consumes tea by presenting plastic chips to the tea creator by sending a message on a napkin designated to the specific creator, with their chips and the number of cups they want. If they need more chips they can buy them from the tea creators, after the initial allocation.
  5. Every time tea is provided, a record of this appears on the shared tablecloth as if by magic (a block transaction on the blockchain).
  6. The rules of the tea party are simple. Participants can shout out bids to the table for cups of tea, offering chips. Tea creators or other participants who have tea can accept such offers, and if they do, the magic tablecloth records it, and the consumer sends the napkin with the chips and receives a cup of tea in return. Participants must pass the napkin along until it reaches the intended recipient.
  7. Participants can also shout out offers to buy chips from tea creators or other participants, using fiat currency.
  8. Everyone at each table is constantly shouting out bids and offers in an air of high excitement, and napkins and chips and cups of tea are being shuffled around continuously. The higher the excitement, the more frenzied the passing becomes, and even tables with few initial participants gradually become full of excited party goers.
  9. Each party lasts one hour.

Our external crypto economist observes the madness and considers joining an empty table. There is a very attractive spruiker at this table, promising that it will soon be as mad as all the others. She is faced with two dilemmas:

  1. How much to pay for the plastic chips and how many to buy?; and
  2. What is the value of the magic tablecloth (because, as an economist, she is always interested in purchasing systems)?

She turns to the second question first (it is after all, a mad hatter’s tea party) and is armed with the monetary value theory of crypto-currencies. In this theory (crudely derived from a somewhat outdated theory of money), she knows of a formula! The thinking goes like this (from “Cryptoasset Valuations’, by Chris Burniske):

— — — — — — — — —

The equation of exchange is MV = PQ, and when applied to crypto the interpretation is:

M = size of the asset base
V = velocity of the asset
P = price of the digital resource being provisioned
Q = quantity of the digital resource being provisioned

So if I find M, and divide by the number of tokens in circulation, then voila, I have price/value per token!

— — — — — — — — — —

So all she has to do is plug this into the tea party. Table Four with the White Rabbit in attendance is a good one to look at. She observes that:

  1. At table four she knows that there are 1,000 plastic chips, and no more will be created
  2. She observes that at a recent, very similar table the tea creators created 6,000 cups of tea in an hour, as evidenced by their tablecloth blockchain. So Q = 6,000
  3. After watching the table, she can see that a tea transaction happens every second (because participants can on-sell their cups of tea). So V = 3,600
  4. She knows that a cup of tea is 'worth' $1k
  5. So now she can work out M! It is M=PQ/V.

So M = $1,666.67, and dividing by the number of tokens (1,000) means each token is "worth" about $1.67. This mimics the calculations in Burniske and many others.

Voila. So who caught the Mad Hatter's sleight of hand?

Well, tea is worth $1k per cup, isn't it? If we had 3,600 transactions and we delivered 6,000 cups of tea, then of course we exchanged 1.666 cups of tea in each transaction, and each transaction was worth $1,666.67. But how many tokens were used in each transaction? Do the math, as they say. Imagine if one participant had all the tokens, the trading were serialized one step at a time, and they bid from one tea maker at a time. Then each token would be worth about $1.67 (value of transactions divided by number of tokens). Mmmm. What about if two participants had equal shares of tokens, and they asked two tea creators in parallel? Well, because we have to maintain the overall throughput at 3,600, now each tea creator works at half the speed and is paid half the tokens ... and we still get about $1.67 per token. And so on ....
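
To make the sleight of hand concrete, here is a tiny sketch of the tea-party numbers. However the tokens are split between participants, the implied per-token "value" never changes, because it is fully determined by the exogenous fiat price of tea and the assumed throughput:

    # The tea-party 'valuation': M = P*Q / V, then divide by the token supply.
    P = 1_000        # fiat price of one cup of tea (exogenous)
    Q = 6_000        # cups of tea provided in the hour
    V = 3_600        # transactions in the hour (one per second)
    TOKENS = 1_000   # fixed supply of plastic chips

    M = P * Q / V                    # "size of the asset base", about 1,666.67
    print(M, M / TOKENS)             # about 1.67 per token

    # Splitting the same supply between 1, 2 or 10 holders changes nothing:
    for holders in (1, 2, 10):
        print(holders, (P * Q / V) / TOKENS)   # always about 1.67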

Head spinning yet? What is going on?

This is fundamental mathematical philosophy 101. What you are observing is a tautology and you are seeking to gain explanatory power from it. Sorry, doesn't happen.

As has been well understood for a long while, MV = PQ is an accounting tautology that has almost zero predictive power. It doesn't explain anything. Why? Because it should be written as a definition rather than a behavioral equation: velocity is defined as V = PQ/M, the ratio of nominal transaction value to the asset base, so the 'equation' holds by construction.

What this means is that rather than being a set of independent variables that have predictive causality, in fact the variables are simply functions of each other and the equation is simply a rearrangement of symbols.

Astute readers should recognize this as the "fireman's dilemma": we observe that when we plot the number of fire engines found at a fire, there are usually two, and we form the expression #Fire Engines = 2 * #Fires. The naive reader then concludes that fire engines must cause fires, because as we increase the number of fire engines we should find there are more fires. The truth is in the causality, which we write in mathematics as a one-way functional mapping from fires to fire_engines. No algebra allowed!

To restate the Mad-Hatters-Fallacy:

“Using the Quantity Theory of Money relationship of MV=PQ to calculate the value of a crypto network with a utility token has zero predictive power, as the entire calculation is driven by exogenous assumptions about the fiat price of the network service.”

I also recommend here some excellent work by Austere Capital — see “MV=P…Que? Love and Circularity in the Time of Crypto”, by Anshuman Mehta and Brian Koralewski, February 2018

Our crypto economist now turns to the first question (after determining that the price of a plastic chip appears to be determined by the exogenously defined price of a cup of tea, and accepting the Mad-Hatters-Fallacy). What is the value of the tablecloth, the table and the chips, irrespective of the price of tea? That is, what is the actual value of the crypto network that is facilitating this exchange?

Observing the chaos around her, she sees that participants are leaving some tables and joining others, adding to the confusion. She corners one such participant and asks why. The White Rabbit responds: “Well at this table their chips don’t cost quite so much, and I overheard them shouting the price.”

The (somewhat bleedingly) obvious point is that either there is competition in the supply of network platforms or there isn't.

If there is competition, then the incremental price/value of the utility token over and above its connected reference to the price/value of the service will be competed away so the price equals the marginal cost (with allowance for product differentiation, location costs and all the other micro-economic variables that allow for price differentiation).

If there is no other way to get the service except through one unique platform (a rather exceptional and rare case) or if someone buys up all the platforms (a more common case), then the platform (or effectively the service providers) will charge the monopoly price for the service.

Alice’s Valuation Rule for Utility Networks

“Fundamentally a utility network is priced and valued the same as a token-as-an-asset network {where the asset is a stream of services} plus a consideration for the special utility of the network function. If there are low or no barriers to entry in creating blockchain platforms, then the incremental price of a utility token against its external reference for similar utility is the marginal cost of providing the platform.

If there are significant barriers to entry such that monopolistic behavior can be sustained, then the incremental price of a utility token against its external utility reference is higher than the marginal cost of the platform by a ratio of the price elasticity of demand for its utility {e*MC/(1+e)}”
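
For reference, the markup ratio quoted in the rule is the standard monopoly pricing condition from elementary microeconomics, obtained by setting marginal revenue equal to marginal cost:

MR = P * (1 + 1/e) = MC, which rearranges to P = e * MC / (1 + e)

where e is the (negative) price elasticity of demand, with |e| > 1 at the optimum; the closer demand is to unit elasticity, the larger the markup over marginal cost.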

The key point here is that this is standard economics — no magic required — and there is no exponential or magical forward-looking curve to puzzle over. The marginal utility and cost of providing these services should be expected to be relatively low, and so such tokens may command only a small premium over the provision of similar non-decentralized services.

It is just at this calm point that the March Hare rushes in, and excitedly points to a very large table with a huge number of participants, and wildly exclaims “but haven’t you heard about network effects and Metcalfe’s Law of Network Value? By looking at these small tables you have totally missed the point! People are buying chips because they represent future network effects and they are magically capturing the future!!”

There are many articles pushing this line. A small sample includes:

“Valuing Bitcoin and Ethereum with Metcalfe’s Law” by clearblocks, February 2018 (Medium)

“Digital blockchain networks appear to be following Metcalfe’s Law” by Ken Alabi, August 2017

“Using Metcalfe’s Law to value cryptocurrencies”, April 2018

“How network theory predicts the value of Bitcoin”, Emerging Technology, March 2018

So let's look carefully and critically at the so-called Metcalfe's Law and see if it sheds any light on the value of large decentralized collections of actors in the blockchain.

The general idea of Metcalfe’s Law is that the ‘value’ of any network of interconnected things should be proportional to n(n-1) which approaches n-squared as the network gets large. The concept behind it is that any one connection can make (n-1) connections to others.

As Metcalfe himself pointed out, this is not really a law but actually an observation:

“The original point of my law was to establish the existence of a cost-value crossover point — critical mass — before which networks don’t pay. The trick is to get past that point, to establish critical mass.”

So firstly a clear warning — we are dealing with possible correlations and not necessarily causation, and we are dealing with suggestions about correlations regarding differences in value, not absolute numbers.

The second note of caution is that empirically the "law" has been proven wrong on many occasions. See "Metcalfe's Law is Wrong", Bob Briscoe et al, 2006, IEEE, as an example. The real point Metcalfe was making was that the value of a network tends to go up faster than linearly while costs are linear, and so there is a crossover point. Interestingly, other authors have concluded that in many cases an nlog(n) relationship fits the data better than n-squared.
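
For readers who want to run this comparison themselves, a minimal sketch of the usual approach (ordinary least squares of market cap against n-squared and against nlog(n)) is below; the `users` and `market_cap` series are assumptions about data you have loaded yourself, for example daily unique addresses and market capitalization:

    # Sketch: compare how well n^2 and n*log(n) explain market cap via simple
    # least-squares fits. `users` and `market_cap` are assumed numpy arrays.
    import numpy as np

    def r_squared(x, y):
        a, b = np.polyfit(x, y, 1)               # fit y = a*x + b
        resid = y - (a * x + b)
        return 1.0 - resid.var() / y.var()

    def compare_metcalfe_fits(users, market_cap):
        return {
            "n_squared": r_squared(users ** 2, market_cap),
            "n_log_n": r_squared(users * np.log(users), market_cap),
        }

As with any such fit, a high R-squared for either form says nothing about which way causality runs, which is the point of the third note of caution below.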

For interest's sake I have included a plot below of the Bitcoin market cap against the nlog(n) of the unique addresses in the market. {Although Bitcoin is not really a network token, it does have a large data set!} There are two sections: the first is steep with a backward-bending portion, and then there is a low-gradient positive correlation:

Market Cap BitCoin vs nlog(n) Unique Addresses

And then again with the time series data for both an nlog(n) relationship and an n-squared relationship, showing that if only we could ignore the first thousand data points we are cool to go:

The third note of caution is the classic question — what if the price or the changes in price drive more users to the network? And what if when the price is flat or falling people lose interest in the network? That is, what if causality went the other way?

A behavioural economics view could see the network as a speculative asset whose future is very uncertain and only vaguely understood, with participants entering the network ecosystem based on price signals and leaving based on price signals. A simple model would have the rate of entry depend on the rate of change of price, adjusted for volatility: the higher the rate of change of price, the higher the rate of entry; the higher the volatility, the slower the rate of entry. Make volatility increase with price, give price a positive feedback loop to the lagged number of users, add some exogenous random noise into price expectations, and this model will produce a rapidly growing user base that becomes unstable after an exponential rise and then collapses. The whole Metcalfe Law scenario would then merely be a reflection of the correlation and the fact that the two variables were not independent. I am not suggesting that this is the case, or even that this is a realistic model, but I am suggesting that blind consumption of correlations is never a good theory, and even a cursory look at the above graphs, and a tilt towards my behavioural economics model, would cause one to approach with caution. A toy version of this model is sketched below.
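
The sketch below is my own loose rendering of that toy model (entry chases lagged price changes, damped by volatility; price feeds back from lagged user growth plus noise; volatility rises with the price level), with arbitrary parameters. It is meant only to show that such a loop produces user and price series that are correlated by construction, and that, depending on parameters, it can boom and then destabilize, with no "network value" doing any causal work:

    # Toy behavioural feedback model: users chase price, price feeds back from
    # users, volatility grows with price. Parameters are arbitrary placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    T = 500
    users = np.empty(T); price = np.empty(T)
    users[0] = users[1] = 1_000.0
    price[0] = price[1] = 1.0

    for t in range(2, T):
        ret = price[t-1] / price[t-2] - 1.0                       # lagged price change
        vol = 0.02 + 0.05 * np.log1p(price[t-1])                  # volatility rises with price level
        chase = np.clip(ret / (1.0 + 10.0 * vol), -0.03, 0.08)    # entry chases returns, damped by vol
        users[t] = users[t-1] * (1.0 + 0.01 + chase)              # base adoption plus chasing
        user_growth = users[t-1] / users[t-2] - 1.0
        shock = rng.normal(0.0, vol)                              # exogenous noise in price expectations
        price[t] = max(0.01, price[t-1] * (1.0 + 0.8 * user_growth + shock))

    # Users and price typically end up strongly correlated without any
    # network-value causation.
    print(np.corrcoef(np.log(users), np.log(price))[0, 1])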

There is yet one more important note of caution, and this is a result of delicately picking apart what exactly we mean by "value" when we have externalities or public goods. What the proponents of the "network effect" are essentially saying is that each participant in an ecosystem confers a small positive 'value' on all other participants by the simple virtue of their presence. This small contribution is an externality of their choice to participate, and it is a public good because it has the characteristics of non-rivalry (consumption by one does not impact consumption by others) and non-excludability (at least within the network itself).

The proponents of the network value theory of crypto tokens are essentially suggesting that by charging for access to the network, this public good value can be captured through a token mechanism. It is here that we need to be very careful. Although token schemes are very varied, in some form the tokens are typically used to pay for a private service, one that has rivalry and excludability; the claim is that on top of this private good there is some unquantifiable public good associated with bringing together a large number of players. This could be features of the network itself (the verifiable proof, the redundancy and so on), but even if these exist, it is hard to imagine why they would keep increasing beyond a certain point. There is only so much redundancy to be had. And if the idea is that marginal costs are continually decreasing (natural monopoly), then again it is hard to square this with the idea of a decentralized network: if it is decentralized then the marginal costs of expansion will be fixed or increasing, not decreasing. And lastly we have the pricing problem. Even assuming that there is a public good aspect, the attribution to each individual is very small, but is conferred across all, and the large numbers produce a large impact for all; yet if we charge for access and try to capture it, then the user can withdraw.

A small Alice in Wonderland picnic example.

There is a wonderful picnic in Wonderland hosted by the March Hare — each participant is asked to bring a slice of cake to the picnic. The picnic table has a magical character, in that when a new piece of cake is placed upon it by a new participant, all the other participants at the picnic magically see a piece of cake appear on their plates. The picnic is proceeding apace, with newcomers arriving and all picnic goers revelling in the abundance of cake. The March Hare has an idea — surely with all this cake he can charge a gold coin for participants to come to the picnic? He starts his new scheme, waiting for the gold to pour in, and is astonished to find that no-one comes. Each participant evaluates the gold coin against their one piece of cake, and finding it too much, withdraws, because their participation is elastically substitutable for other activities — so the March Hare cannot capture the public good aspect of his picnic unless he has an inelastic commodity to tax. This is a well understood issue in public economics that is too complex to go into here.

{As an aside, there is a small start to understanding the econometrics of willingness to pay for ‘connectedness’. A small survey in the USA tried to determine what people would pay for access to Facebook. Most said they wouldn’t pay, but of those who would, close to 50% said “somewhere between $1 and $5 per month”, and this includes the utility value of photo storage etc.}

The March-Hare-Fallacy of Network Value

“Take some more tea,” the March Hare said to Alice, very earnestly.
“I’ve had nothing yet,” Alice replied in an offended tone, “so I can’t take more.”
“You mean you can’t take less,” said the Hatter: “it’s very easy to take more than nothing.”
“Nobody asked your opinion,” said Alice.
― Lewis Carroll, Alice in Wonderland

“Network Value effects, if they exist at all, are irrelevant to the valuation and pricing of crypto tokens in a network platform model. If they do exist, it is very uncertain which way causality proceeds. If they do exist, the act of pricing for them by issuing tokens with a premium is almost certain to eliminate the very participation that creates the effect. They can be ignored for all intents and purposes.”

Lastly we turn to crypto tokens as a currency, and although it is the most complex, we have already met most of the features and tools that we need in Wonderland.

Crypto currency tokens are proposed as an alternative to fiat currency in one way or another, or more loosely “money”. Historically, money has to have the following general characteristics:

  • Durable: it must not be perishable or easily destroyed.
  • Portable: it must be easy to transport and store, making it possible to secure it against loss or theft and allowing it to facilitate long-distance trade.
  • Fungible: it should be easily interchangeable with another of equal quantity.
  • Verifiable: it must be easy to verify it as authentic.
  • Divisible: it must be easy to subdivide.
  • Scarce: It must be scarce or very costly to make.
  • Acceptability: It must be widely accepted, and over a long period of time
  • Stability: It must be easy to understand the current value, which should be stable

Two further criteria occur in the modern age:

  • Censorship Resistant. Added by crypto enthusiasts who argue that physical cash has this characteristic, and it is defined as whether some external authority can control use of it.
  • Liquidity Control: In a modern economy, where most money is actually digital, probably the most critical role of money is as an instrument of intertemporal liquidity, where changes in the intertemporal exchange rate (interest rates) are used to modify the fundamental disequilibrium economics of modern economies.

I don't intend in this article to get into a debate about the merits or otherwise of these criteria as they relate to different crypto tokens, but in general there is reasonable agreement that Bitcoin currently satisfies (at least partially) durability, portability, fungibility, verifiability, scarcity and censorship resistance. It would be a pretty big stretch to argue that it satisfies acceptability, stability or liquidity control.

However, given that it is being used as a form of money and a form of exchange, and that maybe in the future it can address at least some of the failings, then we can consider how to value such tokens and the network that conducts the exchange.

There are several ways in which crypto money tokens are being valued. The first and most common is that you can determine the value by looking at the aggregate remittance value being transferred across the network. The oft quoted statement is taken as fact: “If a cryptocurrency is a medium of exchange, then it should be priced according to the amount of value its users are transacting.”

Unfortunately this is completely false.

I am indebted to Anshuman Mehta for a clear exposé in the following article, which I highly recommend:

“Debunking Bitcoin’s Remittance Valuation. Featuring a Lead Pipe” by Anshuman Mehta, February 2018 (Medium)

In keeping with the theme I will restate the example in a modification of the Mad-Hatter’s-Fallacy tea party:

  1. Imagine the same tea party as before, but instead of drinking tea (consuming a service) the tea party participants are exchanging very valuable slices of gold or silver cake on paper plates (exchanging units of value).
  2. It is not possible to see the underlying cakes from which participants are drawing their slices.
  3. Participants can only transfer slices using special paper plates (tokens) that they can buy from a dispenser on the table, or buy in advance (ICO).
  4. Participants write addresses on a napkin, together with their exchange rate, and pass the plate and napkin and slices around the table until they reach their designated recipient, who responds with the return exchange (smart contract).
  5. On conclusion of the transaction, the tablecloth magically records the entire transaction (blockchain ledger).
  6. The tea party consists of participants shouting out deals, agreeing with a counterparty and then exchanging slices of silver or gold cake on paper plates, in a tumultuous decentralized environment.

What should the price of the paper plates (tokens) be? And what is the value of the tea party table system at play (the token platform)?

It should be bleedingly obvious that the millions of dollars being exchanged on paper plates do not in any way affect the value or price of the paper plates. I can also inform you, as someone who has built an interbank settlement system that transfers hundreds of billions of dollars a night, that the 'value' and 'cost' of that system were not hundreds of billions (as otherwise I wouldn't be writing articles like this!).

So why do so many crypto economists accept such a simple and obvious fallacy?

The thought experiment is useful. Why would paper plates become very very expensive?

Well, what if this tea party was the only one in town? That would mean that if I had a gold cake and I wanted a slice of silver cake, then the only way I could make an exchange would be to buy a paper plate. Then the provider of the paper plate could extort me up to the limit of my marginal utility of the exchange. Then the plates would be valuable. Not as valuable as the slices being transferred, but still much more valuable than a paper plate.

So what does this say? It suggests the obvious yet again.

Assuming there is differentiated competition between platforms or other alternatives (which seems a reasonable assumption) then the value of a token on the crypto money platform will be its marginal cost of delivery plus the usual product differentiation margins that can exist in markets. That is, it will be close to the marginal cost of creating tokens (provided of course that this is less than or equal to the marginal utility of doing exchanges on the platform!). In proof-of-work systems this cost is somewhat high, whereas in proof-of-stake it is probably very low.

At this point the March Hare rushes by again muttering about network effects and monetary theory of value equations, and we affirm that the exact same fallacy arguments apply here as they did in the network token case.

This discussion would not be complete without raising the issue of crypto tokens that act as a crypto currency by being a store of value, and on cue the Duchess enters the scene:

“Of course it is,” said the Duchess, who seemed ready to agree to everything that Alice said; “there’s a large mustard-mine near here. And the moral of that is– ‘The more there is of mine, the less there is of yours.’”
― Lewis Carroll, Alice in Wonderland

If we ignore the speculative returns, and focus on durability, many have argued that Bitcoin acts as a store of value, and if not right now, then sometime soon.

First what is a store of value?

“A store of value is the function of an asset that can be saved, retrieved and exchanged at a later time, and be predictably useful when retrieved. … The most common store of value in modern times has been money, currency, or a commodity like a precious metal or financial capital.” Wikipedia

There are two key problems for crypto currencies as a store of value:

  1. They are too volatile; and
  2. They are not negatively correlated with other asset classes

The first is required to ensure ‘predictable usefulness’ — if you are uncertain whether the asset is going to be worth 10% more or 10% less, it is by definition a poor store of value unless ….

It is negatively correlated with other asset returns, and then it can form part of a diversified portfolio strategy.

The problem for crypto currencies is that they are not negatively correlated — no-one has been able to show a statistically significant and sustained pattern of correlations that would warrant their inclusion in a portfolio.

Below is an example correlation matrix showing Bitcoin against other crypto currencies and also against the S&P and gold:

Correlation Matrix: From Sifr Data

VIX (the CBOE Volatility Index) is the only candidate, and as has been discussed in research this doesn't hold for all periods (see "So Is There a Correlation Between Bitcoin and Stock Market? Yes, But No" by Darryn Pollack, February 2018).
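
For readers who want to test the diversification claim on their own data, a minimal sketch of the usual check (rolling correlations of daily returns) is below; the DataFrame and its column names ("BTC", "SPX") are assumptions about data you load yourself:

    # Sketch: rolling correlation of daily returns between bitcoin and another
    # asset (e.g. the S&P 500). `prices` is a pandas DataFrame of daily closes
    # with columns "BTC" and "SPX", loaded elsewhere.
    import pandas as pd

    def rolling_correlation(prices: pd.DataFrame, window: int = 90) -> pd.Series:
        returns = prices[["BTC", "SPX"]].pct_change().dropna()
        return returns["BTC"].rolling(window).corr(returns["SPX"])

A persistently negative series would support the store-of-value-in-a-portfolio argument; the published matrices, such as the one above, suggest the correlations instead hover around zero.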

So a rational investor would question why it would be used as a store of value. However, if you live in an extremely volatile country (Zimbabwe or Venezuela come to mind) perhaps this is one of the few censorship resistant hedges available to you. And besides, as the Duchess would advise, it may be slightly irrational but still a good thing to own a bit of it ‘just because it is there’ and be comfortable that your asset portfolio is complete — just in case.

SUMMARY

In this (much longer than expected) article I have sought to discuss and review the various methods of valuation of different types of crypto tokens by considering a small set of archetype crypto platforms, and in doing so to disentangle myths and fallacies from sound analysis.

The simple conclusion is that, contrary to the supposition that “crypto economics” is new and requires new methods of analysis, the truth is more mundane.

In a connected world with competition, it turns out there is no magic in valuing crypto tokens. They behave in the same way as any other product or service. Typically they can be valued by reference to their reference asset or their reference service, with small margins for the 'special utility' offered and with small (or large) discounts for the additional risks and volatility. If the value stream is offered over time, they can be valued by discounting this to the present, using traditional means, again adjusted for risk. And when considered as a portfolio asset, they form part of a risk-returns portfolio that adjusts the holding quantities based on the characteristics of the asset versus the whole portfolio.

The extreme volatility and exponential rise and fall are most likely a response to speculative bubble phenomena that are well understood (but very difficult to predict) and which result from forward expectations that are not based on any rational assessment. In general terms they are driven by trading activity and the usual "greater fool" dynamics of speculative bubbles and crashes.

Despite the above, I consider the concept of decentralized trust systems to be an exciting and interesting development that will have profound implications. But they will be the result of hard work, and gradual improvements in performance and utility, not magical explosions of value.

One last aside: for crypto currencies to realise their potential, the analysis here suggests that liquidity processes are critical, especially intertemporal ones. Unfortunately these can only flourish on the back of stability, so the damping of speculative behavior will be a precursor to unlocking that potential, not a driver of it.

Other Useful Links and References

“On Token Value” by Nick Tomaino, August 2017 (Medium)

“Today’s Crypto Asset Valuation Frameworks” by Ashley Lannquist, March 2018 (Medium)

“Cryptoasset Valuations” by Chris Burniske, September 2017 (Medium)

“Simulating the Madness of Crowds: Price Bubbles in an Auction-Mediated Robot Market” by K. Steiglitz & D. Shapiro, Computational Economics (1998) 12: 35

“Cryptoeconomics Is Hard” by Aleksandr Bulkin, January 2017 (Medium)

“Tokenomics — A Business Guide to Token Usage, Utility and Value” by William Mougayar, June 2017

“Valuation Models for Cryptocurrencies” by Phil Glazer, February 2018 (Medium)

“An Exploration of Cryptocurrency Investing, Modern Portfolio Theory, and Portfolio Construction” by Phil Glazer, April 2018 (Medium)

“Context, Inputs, and Techniques for Valuing Cryptocurrencies” by Phil Glazer, February 2018 (Medium)

“Equity Token Finance” by David Siegel, November 2017 (Medium)

“Improving Network Incentives Through Work Tokens” by Patrick Mayr, March 2018 (Medium)

“New Models For Utility Tokens” by Kyle Samani, February 2018


Gary Aitchison

Serial entrepreneur, Engineer & Economist, citizen of world. BSc, BEng & MTCP Sydney University, MSc Economics London School of Economics & Political Science.