Valuation Crisis and Crypto Economy
Bringing fundamental value to the economy through cryptotokens
Dick Bryan, Chief Economist, Economic Space Agency
Long-term token values will be determined by three factors:
- The current valuation of a token business
- Projections of the future positions of that business
- The level of ‘speculation’ on crypto assets and financial assets generally
We know that the third factor is impossible to explain. It depends on what Keynes called ‘animal spirits’. It is likely that speculation will remain high, for this is an immature market with rapid new entry, in a sector of the capital market with widely-appreciated potential but low levels of technical comprehension. This makes valuation difficult, but it is unavoidable.
The second factor is also impossible to know with certainty. We know that the emerging landscape of cryptographically enabled distributed economic and social systems is changing exponentially, and that there will be some extraordinary success stories and some dismal failures. But we cannot yet know which is which. There is a proposition that, in this dimension, tokens be equated with call options: the right to participate in/own an as yet unknown future. It is an important insight, although our capacity to value the option is also limited. Critical to pricing options is quantifying the volatility of the underlying asset and the time to maturity. We cannot yet model volatility, and we cannot know the time horizons of success.
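To see why the option analogy is hard to operationalise, consider a minimal Black-Scholes sketch (all figures hypothetical, and Black-Scholes is only one possible pricing model): the value of a call option turns critically on exactly the two inputs we cannot yet pin down for tokens, the volatility of the underlying and the time to maturity.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(spot: float, strike: float, rate: float,
            vol: float, maturity: float) -> float:
    """Black-Scholes value of a European call option."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * maturity) / (vol * sqrt(maturity))
    d2 = d1 - vol * sqrt(maturity)
    return spot * norm_cdf(d1) - strike * exp(-rate * maturity) * norm_cdf(d2)

# The same hypothetical 'project', valued under different guesses
# about volatility and time horizon:
low_vol  = bs_call(spot=100, strike=100, rate=0.02, vol=0.3, maturity=1.0)
high_vol = bs_call(spot=100, strike=100, rate=0.02, vol=0.9, maturity=1.0)
long_run = bs_call(spot=100, strike=100, rate=0.02, vol=0.9, maturity=5.0)
print(round(low_vol, 2), round(high_vol, 2), round(long_run, 2))
```

With everything else held fixed, tripling the assumed volatility or lengthening the horizon multiplies the option value: without defensible inputs for those two parameters, the ‘option value’ of a token is effectively unconstrained.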
But the first factor can potentially be known for crypto tokens that exist to produce something: an accountable, measurable value of some kind. This is an area where many token launches are vague, if not entirely silent. But it is the one domain in which current tokens can and should be critically valued. We face the danger that too many coin issuances are pitching an idea without nominating means to measure/validate its success.
A thought experiment: the ECSA economy proposes to define and measure value as it has never before been measured. We want to measure value in terms of social contribution, not contribution only to profit. We want to measure produced social benefits, and most of them do not make conventionally-defined profits. Recognising the existence of these contributions is not new. Measuring them in terms of a new unit of account is new. But is it possible?
The measurement process is important if we would like to report to the public, each year, estimates of the value of output created within ECSA. Our thinking right now is that financial markets place a value on this production. In combination with available data, like the quantum of tokens in circulation and the balance sheets of ECSA, we believe that the ‘market’ will be able to assess the fundamental current value of ECSA. This is critical, for the market needs this information to make a clear evaluation of the current value of ECSA tokens.
The lost capacity to value
The problem is that financial markets generally — not just in the cryptotoken market, but across wider asset markets — have been losing that capacity to value. The process of measurement of corporate values is fraught with conceptual problems. And if it is increasingly difficult to estimate the value of assets listed on the NASDAQ or even the S&P100, it might seem impossible in the crypto market.
But innovation of measurement in the crypto economy may actually hold the key to the measurement problems in the more established capital markets. The problem in the conventional markets is that they are being asked to measure things that don’t readily fit into the conventional accounting categories. But the crypto economy is not shackled by those conventional categories, and so can develop the devices to ‘measure the unmeasurable’. That is what a token can do.
Problem in the conventional accounting
So here’s the problem in conventional accounting.
In financial economics, there is a sentiment (and it should be framed no more strongly than that) that companies have a ‘fundamental value’. Somehow, markets will gravitate to an equilibrium that is not just in balance, but that reflects production costs plus a competitive profit. Of course this sentiment is qualified in all sorts of ways: markets aren’t competitive, information is in constant change, etc. But the ontological premise is that there is gravitation to a fundamental value. This is the efficient markets hypothesis.
It was a prevalent notion up to the 1980s, perhaps reaching its zenith in that decade’s leveraged, private equity buyouts, which were based on the calculation that a company was worth more when broken up and its ‘parts’ sold than as a going concern. That proposition required a technique to value corporate assets and work out that they were undervalued. That’s fundamentals analysis at work.
Since the 1980s, it has become increasingly difficult to determine the value of companies, including the world’s giant corporations. This is not about a rise of ‘speculation’: if anything, that is the consequence, not the cause. It is about a change in capital accumulation and the rise to prominence of ‘intangible capital’. It is about the entrance of immaterial elements into production and the troubling consequences this has for the old industrial mediations (Virtanen, 2006).
Changed nature of fundamental value
Intangible capital comprises the investments that produce innovative, intangible products based on immaterial labour and capital, such as knowledge, firm-specific skills, ‘better ways of doing business’, organizational capabilities, etc. (Barnes and McClure, 2009; Young, 1998).
This sort of capital is not new, but for a century it could be treated as an exception and as an accounting residual, most prominently sitting under the category of ‘goodwill’ (the inexplicable part of why companies may be priced above their technical value). If we go back to the 1929 stock market crash, and the great depression that followed, we find the conditions for a previous new era of valuation. The seminal work here was Graham and Dodd’s Security Analysis, first published in 1934. Benjamin Graham, later called the ‘Dean of Wall Street’, was the teacher/mentor of Warren Buffett, and Buffett is an avowed advocate of this style of analysis.
Graham and Dodd sought to develop techniques that the everyday investor could use to put a value on a company and inform the decision to buy/hold/sell shares. They focused on things like asset type, earnings, dividends, and definite prospects, as distinct from (potentially fanciful) market quotations. They had no truck with day-to-day changes in share prices (what we would call noise or speculative trading); they were concerned with ‘intrinsic value’, often called ‘fundamental value’ or simply ‘the fundamentals’.
It is important that we do not caricature their approach as a mechanical ‘reading off’ of capital value from technical data. They knew that the future, which financially impacts on valuation in the present, is unknowable. Their view of book value was accordingly circumspect (1940: 585): “We do not think, therefore, that any rules may reasonably be laid down on the subject of book value in relation to market price, except the strong recommendation already made that the purchaser know what he is doing on this score and be satisfied in his own mind that he is acting sensibly.”
In the era of Graham and Dodd, intangibles didn’t appear as part of the calculation of intrinsic value. They were minor and could be ignored. Now they can’t be. As the Table shows, the assets of the world’s largest companies are predominantly intangible.
Companies by Total Intangible Value by 2017 (Global Intangible Finance Tracker, p. 42, June 2017)
When intangible capital changes from being a residual to the predominant form of capital, there is a measurement crisis. Accountants now struggle to put a value on companies like Google and Facebook: how can the value of their assets be measured when it is hard to say exactly what their major assets are? They are intellectual property, information, reputation, brand, goodwill, people’s attention, etc. Pricing these is not like pricing a machine in a mid 20th century factory, where you know it will run to expiry and be replaced by something similar. Those can be valued, but this nebulous current stuff can’t.
From ROI to ROE to RCE: what is calculated as a design question
From the 1980s, ROE (return on equity) replaced ROI (return on investment) as the performance calculation of companies (Levy, 2014). The effect was that prospective future income streams entered into the measurement of ‘value’. For intangible assets, this meant that accountants did not have to give them an explanation; they just needed to estimate an expected future revenue stream on intangibles and discount it to a present value.
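A stylized illustration of that move (all numbers invented): discounting a projected revenue stream replaces the problem of explaining the asset with the problem of choosing two inputs, the projection and the discount rate, and a modest change in either swings the resulting ‘value’ widely.

```python
def present_value(cashflows, discount_rate):
    """Discount a stream of projected annual cash flows to a present value."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cashflows, start=1))

# Hypothetical revenue stream attributed to an intangible asset,
# valued under two equally defensible discount rates:
projected = [10, 12, 14, 17, 20]
print(round(present_value(projected, 0.08), 2))
print(round(present_value(projected, 0.15), 2))
```

The asset itself never has to be described: only the projection and the rate are debated, and the seven-point gap between the discount rates moves the valuation by nearly a fifth.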
The traditional (accounting) valuation methods for ‘high-tech’ companies have been debated, especially since the dot-com boom period (1995–2000). With virtual platforms backed by intangible capital, metrics like the price to earnings ratio (P/E), the price to earnings to growth ratio (PEG), and the enterprise value to earnings before interest and taxes ratio (EV/EBIT) became obsolete. Indeed, they were blamed for the so-called ‘irrational exuberance’ that was the dot-com bubble.
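A hypothetical contrast (our figures, not any real company’s) shows the simplest way these ratios break down: for a pre-profit platform the earnings denominator is negative, and the metric is not merely misleading but undefined.

```python
def pe_ratio(price_per_share: float, earnings_per_share: float):
    """Price-to-earnings ratio; undefined when earnings are not positive."""
    return price_per_share / earnings_per_share if earnings_per_share > 0 else None

def ev_to_ebit(enterprise_value: float, ebit: float):
    """Enterprise value to EBIT; undefined when EBIT is not positive."""
    return enterprise_value / ebit if ebit > 0 else None

# A mature industrial firm with steady earnings has a meaningful P/E:
print(pe_ratio(50.0, 4.0))       # → 12.5
# A pre-profit platform priced on intangibles does not:
print(pe_ratio(120.0, -0.50))    # → None
print(ev_to_ebit(800.0, -25.0))  # → None
```

For the dot-com cohort, and for most token projects today, the denominators these metrics rely on either do not exist yet or are negative, so the ratios carry no information.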
Alternative measures have started to emerge, though none can be said to have universal legitimacy. Milano et al (2016: 48) contend that investors are looking for an optimal balance of both growth and profitability captured in a measure that is called ‘residual cash earnings’ (RCE): “For students of finance and accounting, this finding makes perfect sense since RCE is a cash-flow-based variant of ‘residual income’, which is viewed by many finance scholars as the single-period, or ‘flow’, measure of corporate operating performance that ties most directly to ‘stock’ measures of corporate value, such as net present value (NPV) and discounted cash flow (DCF), that are supposed to be reflected in stock prices.”
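The textbook link Milano et al invoke between flow and stock measures can be shown with a toy example (all numbers invented; assuming clean-surplus accounting and liquidation of book value at the horizon): opening book value plus the discounted flow of residual income reproduces exactly the stock valuation obtained by discounting cash payouts.

```python
def dividend_value(dividends, r):
    """Stock measure: discounted value of the cash actually paid out."""
    return sum(d / (1 + r) ** t for t, d in enumerate(dividends, start=1))

def residual_income_value(book0, earnings, dividends, r):
    """Flow measure: opening book value plus discounted residual income,
    rolling book value forward under clean-surplus accounting."""
    value, book = book0, book0
    for t, (e, d) in enumerate(zip(earnings, dividends), start=1):
        value += (e - r * book) / (1 + r) ** t  # residual income = earnings less a capital charge
        book += e - d
    return value

# Hypothetical firm: opening book value 100, cost of capital 10%,
# liquidated at the end of year 3 (final payout includes closing book value).
dv = dividend_value([5, 8, 140], 0.10)
rv = residual_income_value(100.0, [15, 18, 20], [5, 8, 140], 0.10)
print(round(dv, 4), round(rv, 4))  # the two valuations coincide
```

This equivalence is what makes a single-period flow measure like RCE attractive: under the stated accounting assumptions, it ties period performance directly to the NPV/DCF ‘stock’ value that share prices are supposed to reflect.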
Bringing fundamental value back to the crypto economy
In any case, the general implication is that the accounting profession’s stock understanding of value measures has been replaced by a flow meaning, giving rise to debate about how to conceive of valuation. As such, intangible capital is being too readily priced on company balance sheets as a call option: the company is selling the share buyer a projected future value of intangible capital, and the share buyer is agreeing, in effect, that this projection is at least accurate, if not an underestimation.
The derivative form means that there is no need to describe or value the thing itself: what is being priced is the exposure (risk) to the value of a thing that itself cannot be completely explained (Bryan and Rafferty, 2006). In this accounting technique, the problem of explaining intangible capital and its contribution to value is simply avoided. As such, there is no measure of intrinsic value, and it isn’t possible to identify a speculative bubble.
So, perhaps accountants are asking the wrong questions, trying to value intangible capital in a framework designed for mid 20th century manufacturing, where ownership, machines, and workers were much clearer categories, and where social impacts were understood via profits, rent, interest and wages. This is certainly the view of two leading US academic accountants, Baruch Lev and Feng Gu, in their book The End of Accounting and the Path Forward for Investors and Managers (Wiley, 2016), in which they write:
“we present the first reason for the waning usefulness of financial (accounting) information — the surge of intangible (intellectual) assets (patents, brands, information technology) — to become the prime value creators of businesses. We document empirically that the failure of the accounting system to reflect the value of these assets in financial reports, to properly account for their impact on firms’ operations, and to provide investors with information about the exposure of these assets to threats of infringement and disruption, is a major cause of accounting’s relevance lost. How ironic (or sad) that largely irrelevant assets to companies’ growth and competitive edge — like inventory, accounts receivable, or plant & machinery — remain prominently displayed on corporate balance sheets, whereas patents, brands, IT, or unique business processes are accounting MIAs.”
Accountants may struggle, but freed from the need to structure accounts around conventionally-conceived profit and loss, ECSA is exploring ways to measure the value of intangibles: at least those that are integral to our project.
Could we think of a thresholding, a binding, an accounting system — a token valuation system — in which organizations operating within ECSA (the so-called “economic spaces”, our version of 21st century modes of economic association) nominate the performance criteria by which they want their assets and output to be valued: ways that will specifically address the contributions of intangible capital and immaterial labour, so that social contribution can be recorded in ways that best befit those contributions?
By this path we would be looking to return to the spirit of Graham and Dodd’s approach to corporate valuation, so that investors and participants can determine some intrinsic value in the activities of ECSA. Pace Graham and Dodd, we are not saying that valuation will systematically gravitate to this intrinsic value, but that the token, as this measure, provides critical information for investor decision making. “The market is not a weighing machine, on which the value of each issue is recorded by an exact and impersonal mechanism, in accordance with its specific qualities. Rather should we say that the market is a voting machine, whereon countless individuals register choices which are the product partly of reason and partly of emotion.” (Graham and Dodd, 1940: 28)
(We are working on these issues this week at the Cryptoeconomics Working Sessions at NYU/Stern.)
*Dick Bryan is a professor of political economy at the University of Sydney and Chief Economist of the Economic Space Agency. He is one of the key theorists of the derivative value form, and the author of Risking Together and Capitalism with Derivatives (both together with Mike Rafferty).