Many of our thoughts on this subject have been informed, stolen from (with permission!) or sparked by the guys and gals at Notation, CoinFund, Figment, Fabric and Multicoin. We would like to thank them for helping us think through this fascinating topic.
Generalized Mining (or ‘mining 2.0’) has emerged as a catchall term for the practice of actively participating, in one way or another, in cryptonetworks in order to generate returns. Different people define it differently; we’re going to take a very broad view and then zoom in.
We recently held a Generalized Mining meetup (recap here) with our friends from CoinFund and some of the leading stakers, funds and projects in the space. We’ll leave explaining the Generalized Mining opportunity to Jake, who gave a fantastic Primer in Prague last week.
There’s a lot to unpack here, but the key points are these:
- There is a continuum that runs from simple Bitcoin-type Proof of Work (PoW) Nakamoto Consensus models being used to secure a simple ledger through to more sophisticated Proof of Stake (PoS) and alternative consensus models securing the provisioning of public goods over cryptonetworks.
- There will be near-infinite opportunities for participation in these, from simple PoW mining through to staking, validating, provisioning or even arbitraging (e.g. acting as a ‘Keeper’ on the Maker network).
- The opportunities sit at the intersection of technology and finance, some are appropriate for a fund, others — we think — aren’t.
If none of the above makes any sense, please take the time to watch the videos from the meetup (Intro, Supply-side, Role of Investors, Staking Economic Design), and read at least some of these catchily-titled posts:
- A primer on mining 2.0
- Livepeer cryptoeconomics as a case study of active participation in decentralized networks
- Notation + Blockchain Mining 2.0
- Crypto borrowing and staking networks
- An introduction to Proof-of-Stake token yields: Assessing risks and rewards
Also useful is CoinFund’s great graphic representation of the space. You can imagine the continuum of consensus models running left to right from Bitcoin to something like Livepeer.
As a potential Limited Partner (LP) in crypto funds like these, we’ll focus the rest of this post on our perspective on Generalized Mining as an investor.
The LP’s perspective
The emergence of more sophisticated ‘n-sided-marketplace’ type cryptonetworks (e.g. Augur, Filecoin, Livepeer, Orchid, Kleros…) and entire ‘third party economies’ around them is fascinating and presents some attractive investment opportunities. But it’s not clear that they’re suitable as a fund strategy.
In order to make sense of all of this, it’s useful to try and break these opportunities up into different buckets of business models, risk profiles and return expectations.
Miners and Validators are concerned with expending some resource (or putting some capital at risk) in order to participate in a competition or election as part of a network’s consensus mechanism. For this they earn fees / block rewards / protection from inflation or some other form of monetary reward, generally denominated in the native token. The sum of these can be thought of as the cost of securing the network.
Other players, like transcoders on Livepeer and miners in Filecoin, are concerned with provisioning some service that is in demand over the network. For this work they earn fees, though generally not in the native token. Instead, payment is denominated in some other token (probably some stablecoin). ‘Work token’ type networks fall into this category.
Finally, these networks often throw up opportunities for third parties to generate returns. Sometimes this is intentional (e.g. Keepers on the Maker network, payment channels on the Lightning Network), and sometimes it isn’t (e.g. front-running bots on Steem). CoinFund refer to these as the ‘third party economy’. This is a real mixed bag in terms of opportunities. Some of these are straight arbitrage, others are liquidity provision, market making, front running, ‘guarding’ (e.g. Fishermen on Polkadot) or just some UI or programmable interface layer that people will pay for.
Let’s dive into each of these as an investment.
Mining and Validating
Our view on this is roughly that the winners in this space will be infrastructure companies and large fiduciaries.
Regarding the economics of simple Proof of Work (PoW) mining, we believe everyone will buy the same chips, from the same 1–3 manufacturers, apply electricity to them, and compete on the basis of (1) access to said chips and (2) cost per kWh.
Let’s break that down.
“Everyone will buy the same chips”
Chip production is a high research cost and high fixed cost business. Not only is chip design exceedingly hard, getting chips produced and distributed is a high-intensity ‘meatspace’ process requiring enormous capital expenditure.
And design they will: we think it’s going to be very difficult to prevent ASIC mining on valuable networks. In PoW, what you’re doing is racing to perform a specific type of calculation. If you can do that faster than anyone else, you win. Because those calculations are specific, you can specialise at the chip level. Specialisation means optimisation. Take CPUs: they’re incredibly general processors — they can run your OS, process strings, render graphics, etc. GPUs have much more specific constraints: they’re slower than CPUs at most things, but much faster at the kind of vector maths required specifically for graphics processing. ASICs (Application Specific Integrated Circuits) take this to its logical conclusion by having a single constraint, like ‘SHA256 mining for Bitcoin’. Suddenly you don’t even need your chip to be programmable (!) — you can literally write the algorithm into the metal and gain ~1000x performance.
Thus, for as long as PoW models require a specific type of computation, they will come embedded with an economic forcing function for chip optimisation. These chips will be difficult and expensive to produce, therefore production will centralise. Therefore everyone will buy the same chips.
“compete on the basis of … cost per kWh”
The rest of the profitability equation is basically electricity, and the correct answer to cost is ‘below zero’. That’s because certain types of electricity production basically can’t be modulated, like nuclear and fixed solar. In most places and most situations, nuclear provides the ‘base load’, and dispatchable power stations burn more or less diesel or coal to meet the remaining demand. However, if you incorrectly forecast future demand, you might build excess capacity such that demand falls below your base output. In that case you can’t just dump the surplus, you have to dispose of it, and disposing of a gigawatt of electricity is hard: disposal costs money. So there are places in the world where you can get paid to run your ASICs.
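A back-of-the-envelope sketch of this, in Python with entirely hypothetical numbers (hashrates, network size, prices and wattage are all invented for illustration), shows why the electricity price term dominates per-rig margins:

```python
# Sketch of per-rig PoW mining economics. All numbers are hypothetical.
def daily_profit(hashrate_ths: float, network_ths: float,
                 btc_per_day: float, btc_price: float,
                 power_kw: float, price_per_kwh: float) -> float:
    """Expected daily USD profit for one rig, ignoring pool fees and variance."""
    revenue = (hashrate_ths / network_ths) * btc_per_day * btc_price
    electricity = power_kw * 24 * price_per_kwh
    return revenue - electricity

# Same rig on two grids: paying $0.06/kWh vs being *paid* $0.02/kWh
# to absorb excess base-load output.
paying = daily_profit(14, 50_000_000, 1800, 6000, 1.4, 0.06)      # ≈ $1.01/day
subsidised = daily_profit(14, 50_000_000, 1800, 6000, 1.4, -0.02)  # ≈ $3.70/day
```

With identical hardware, the entire difference in margin comes from the electricity term, which is why siting next to stranded or surplus generation wins.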
In our view, PoW mining is a commodity-infrastructure game best played by firms that specialise in… infrastructure. This is a traditional, high-fixed-cost industry that has very little to do with tech and a lot more to do with PE financing, GE, Skanska and Vinci.
Moving on to Proof of Stake (PoS) and its variants, our friends from Vest, Chorus One, Rocketpool and Cryptium Labs were on stage to talk about the opportunities and difficulties.
In Proof of Stake (PoS) and delegated PoS (dPoS), miners are staking valuable tokens behind a vote on the network, such as ‘this block should be mined’. If they achieve consensus, they get a reward, if they don’t they suffer some economic penalty (slashing, inflation etc.). From an economic perspective, this looks somewhat like a fixed income product. You are buying an income-generating asset that can suffer large unexpected losses but usually pays out pretty regularly.
As with PoW, this is a pretty binary game: you’re either staking or you’re not. All tokens are equal in terms of earnings power — you cannot stake ‘better’. In other words, there’s no gross margin leverage. There is some operational leverage in the sense that staking requires proper opsec infrastructure (i.e. high fixed costs), but not much. The real leverage here is in delegation. Given staking is a zero marginal cost business (as opposed to — say — car production, which requires components and some assembly cost to produce a marginal unit) there will be excess returns to those stakers that manage large delegated pools.
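A toy model makes the delegation leverage concrete. The `validator_annual_pnl` helper and every rate in it are invented for illustration (reward rates, slashing parameters and cost levels vary enormously by network):

```python
# Hypothetical staking P&L: bond-like yield with slashing tail risk, plus
# commission revenue on delegated stake earned at ~zero marginal cost.
def validator_annual_pnl(own_stake: float, delegated_stake: float,
                         reward_rate: float, commission: float,
                         slash_prob: float, slash_fraction: float,
                         fixed_costs: float) -> float:
    own_rewards = own_stake * reward_rate                     # yield on own tokens
    fee_income = delegated_stake * reward_rate * commission   # cut of delegators' rewards
    expected_slash = slash_prob * slash_fraction * own_stake  # expected penalty
    return own_rewards + fee_income - expected_slash - fixed_costs

# Same validator, same opsec costs, with and without a delegated pool.
solo = validator_annual_pnl(100_000, 0, 0.08, 0.10, 0.01, 0.05, 20_000)
pooled = validator_annual_pnl(100_000, 10_000_000, 0.08, 0.10, 0.01, 0.05, 20_000)
```

Under these assumptions the solo validator loses money to fixed costs, while the pooled validator clears a healthy profit on the same stake — the excess return accrues to whoever attracts the delegated pool.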
Switching over to the delegator’s perspective, staking tokens looks a lot like putting cash into a bank account. You’re not going to deposit your life savings with the guy round the corner. You’re going to deposit them with a bank you know and trust (or two, but not too many, because cognitive load). Delegators will stake their tokens with whoever turns out to be the JP Morgan of crypto. We’re not sure who that will be, but we do think that brand is what will matter here. Whoever can achieve social consensus around being the ‘most trustworthy’ fiduciary of staked tokens is going to win. As with deposits, this is not going to be a particularly high-margin business (since it’s easy to compete), but it will be winner-take-most (because of Schelling points).
Provisioning
Provisioning is (yet another) term that gets used pretty loosely in this space. For the purposes of this piece, what we mean by it is the provisioning of a service to an end user across a network. Examples include:
- Transcoders on Livepeer → end users want to watch live-streamed content. Transcoders provision (i.e. ‘supply’) that service by making their compute and bandwidth available to the network.
- Miners in Filecoin → end users want to store files. Miners make their storage space available to the network and regularly submit proofs that data is being stored correctly.
- Relayers in Orchid → end users want to browse anonymously. Relayers act as routing nodes for this internet traffic, ensuring that identity can be masked.
Most cryptonetworks are basically n-sided marketplaces, where n ≥ 2. The examples above are all illustrations of this, but even Bitcoin can be thought of as a marketplace: users want a secure ledger in which they can effect ledger movements; miners provide that service in exchange for transaction fees and block rewards.
The common thread is that a service is being demanded by one group and provisioned by others. The protocol mediates the relationship and enforces the rules in much the same way as Uber mediates drivers and riders.
As cryptonetworks proliferate, there’s going to be a near-infinite number of services that can be provisioned — a crypto equivalent of the gig economy. It’s extremely unclear whether these constitute an investment strategy suitable for a fund.
The straw man here is Kleros: a decentralised arbitration platform in which two parties can submit a dispute for arbitration to a decentralised network, which will ensure a random sample of (sometimes human) arbiters review and rule on the dispute. Arbiters that find the Schelling point will collectively pocket the fees paid by the disputing parties for arbitration. There’s a legitimate business opportunity here, but it would be a surprise to see a hedge fund hire a bunch of lawyers and put them to work arbitrating disputes on Kleros.
Provisioning is an economic activity, not an investment strategy. The caveat to this is that it’s still early days and we’re open to being proven wrong by new types of provisioning that would be more suitable. Right now, however, we’re dubious.
The final point to be made is around tax implications. The US has a ‘securities trading safe harbor’ provision around investments in securities, which shields these from certain US taxes. This is what allows a UK-based manager to trade Apple shares out of a UK fund for UK taxpayers without also having to file a US tax return. It’s unlikely that this safe harbor would extend to provisioning, making it difficult for a fund to pursue this as a strategy without an operating company, though some funds we’ve spoken to are working hard on a solution. For those that plough ahead without protecting themselves appropriately, if the hammer comes down it will be a catastrophic event. The reasons why are out of scope here, but it’s the kind of thing that can destroy a fund.
The ‘third party economy’
As the name suggests, there’s a broad range of economic activities that spring up around healthy networks. For our purposes, we’re going to zero in on the trading and risk-taking opportunities, as this is where we see the opportunity for funds.
- Acting as a Keeper on the Maker network → arbitraging Collateralised Debt Position (CDP) contracts.
- Trend following and front-running curation bots on the Steem network → predicting what content will be viral and staking behind it.
- Making markets on Augur (and arbitraging between markets).
In all of these examples, you are:
- interacting directly with a protocol…
- putting capital at risk…
- in order to make a trade…
- with the intention of extracting a profit.
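The Keeper case can be reduced to a toy decision rule. This is an illustrative sketch only — `should_bid` and its parameters are invented for this example and bear no relation to the actual Maker contracts or keeper libraries:

```python
# Toy Keeper logic: bid on liquidated collateral whenever the auction
# discount to market clears gas costs plus a required profit margin.
def should_bid(auction_price: float, market_price: float,
               required_margin: float, gas_cost: float,
               lot_size: float) -> bool:
    """Bid only if expected profit on the lot exceeds the margin threshold."""
    expected_profit = (market_price - auction_price) * lot_size - gas_cost
    return expected_profit > required_margin * auction_price * lot_size

# A 5% discount on a 50-unit lot is worth bidding on; a 0.5% discount is not.
deep_discount = should_bid(95.0, 100.0, required_margin=0.02,
                           gas_cost=10.0, lot_size=50)
thin_discount = should_bid(99.5, 100.0, required_margin=0.02,
                           gas_cost=10.0, lot_size=50)
```

The structure — observe an on-chain price, compare it to a reference, and commit capital when the spread clears costs — is the same loop a traditional arbitrage desk runs, just against a protocol instead of an exchange.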
In this respect, such activities don’t look too dissimilar to what a Renaissance, Two Sigma, Jane Street or other quant fund might do. The difference here is one of terrain (cryptonetworks, not the NYSE), not strategy (market-making, arbitrage, short-term price prediction, HFT…).
From the network’s point of view, such activity is generally welcome — and in most cases vital — to the proper functioning of the network. For example, Keepers are an integral part of the Maker network: an active group of arbitrage-keepers help ensure that DAI is properly collateralised, whilst market-maker-keepers add liquidity. Maker itself has built and open-sourced libraries for people to use; it actively encourages the existence of Keepers.
Indeed, if the Decentralised Finance (DeFi) vision is to become a reality, we’re going to need a whole host of such participants to make these markets efficient. Our very own Edward presented a deep dive on interest rates at #DeFi Prague, the tl;dr was that rates, spreads, and yield curves are all over the place and we’re going to need a lot more participants and liquidity to turn these into correctly functioning markets.
Which, finally, brings us to where funds can fit into all of this.
As we see it, there are two types of funds that should be looking into this: those making early-stage investments (we’ll use VC as shorthand), and quant funds.
Early-stage & VC strategies
As the focus for projects shifts from technical risk to execution risk, teams are going to start to feel the pressure to deliver adoption.
We know from traditional marketplace businesses that bootstrapping the supply-side of a marketplace is easier than the demand. There’s no Uber without drivers, and it’s no surprise that when Andrew Chen joined Uber to head up Growth his sole focus was on supply. Crypto teams have learnt the hard way that bootstrapping the demand side just doesn’t work (airdrops anyone?).
It’s a matter of some debate in our team as to whether there’s an opportunity for VCs to add value by actively provisioning and nurturing their networks. One argument is that the ability to do so will become a competitive edge for VCs trying to get into the most competitive deals. And if there’s one thing VCs know it’s that marketplace outcomes are winner-take-most. The best deals will be competitive. The best VCs are strategising now about how they’ll get into them.
The ability to provision networks during the ‘zero to one’ phase is the first substantive answer we’ve heard to the question of what a value-add early-stage investor looks like. LPs should pay attention, because it’s not just marketplaces that follow a power law distribution of returns — it’s VC funds too.
On the other hand, is it really a VC’s place to take on such an operational role? Who will pay for the hardware costs involved in provisioning networks, and for the engineers to build these systems? VCs should advise and connect, but should they act as an outsourced Growth consultancy? Matters get even more complicated in open-ended funds. How do you apportion costs fairly when investors are free to subscribe and redeem as they wish?
The sweet spot here is probably that early-stage funds will act as ‘day 1’ provisioners and generalized miners on the networks they’ve already backed financially, where providing that service requires little to no hardware outlay. The primary aim will be to participate in and encourage the provisioning of the network during the first few months after mainnet launch. Particularly on networks like Livepeer and Orchid, the network simply isn’t functional below a certain threshold of correctly distributed nodes / transcoders / relayers, due to latency issues. Bootstrapping up to that point is clearly valuable.
Furthermore, provisioning these networks from Day 1 can also be a lucrative activity. New networks need to compete with established ones for the attention of not just users, but suppliers. These miners need to be incentivised with juicy returns to move from a known network they’ve already built software, infosec, opsec and processes for to a new network with no libraries, ecosystem or liquidity. A VC that has already backed the project will inevitably have informational advantages around understanding (and influencing) the token dynamics and how best to provision the network, giving them an unfair edge in earning those returns during the early days, when competition is scarce or non-existent.
Provisioning the network from Day 1 isn’t just an opportunity for VCs to increase the network’s chance of success, but to earn additional tokens with minimal capital outlay, thereby reducing their average cost of ownership on the position. Notation actually outlined some of these economics in this post. The team invested in Livepeer tokens pre-launch at $[redacted], and subsequently used those tokens as stake to provision the network and earn additional LPT at 30c/LPT — substantially below the seed round’s valuation. This is happening. Indeed, in networks with an election mechanism for validator / miner selection (e.g. EOS, Livepeer), owning tokens that you can stake behind yourself may be the only way to earn the right to perform work.
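The cost-basis arithmetic works roughly like this. Since the actual seed price above is redacted, every number here is hypothetical:

```python
# Blended cost basis after mining additional tokens (hypothetical figures).
seed_tokens, seed_price = 1_000_000, 0.50   # assumed pre-launch purchase
mined_tokens, mined_cost = 200_000, 0.30    # earned by staking, at ~$0.30/token

blended = ((seed_tokens * seed_price + mined_tokens * mined_cost)
           / (seed_tokens + mined_tokens))   # ≈ $0.467 per token
```

As long as the effective cost of mined tokens is below the seed price, every token earned by provisioning pulls the average cost of the position down.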
Our favourite early-stage funds are thinking deeply about how to position themselves to best help founders with their deepest pain points, while being careful not to stray beyond what makes sense to do as a fund rather than an operating company. Indeed some, like Figment, are deliberately splitting their activities between a fund (Figment Capital) and an opco (Figment Networks), where FN is a core holding of FC. Though probably tax-inefficient, this helps get around the potential ‘safe harbor’ tax implications discussed above. Figment believe that actively participating in networks helps them understand those networks better and informs decisions on the fund side, while the early investment gets them the access needed to understand how best to actively participate. It’s a strong argument.
Quant strategies
Generalized Mining is an obvious new avenue of exploration for quant-type (aka algorithmically defined and executed) strategies.
There’s the opportunity here to participate in networks as a ‘third party’ as described above. Such funds should manage to earn steady, incremental and repeatable returns. These opportunities live at the intersection of technology and finance, so the teams will need to be deeply technical — again, not dissimilar to a traditional quant fund.
The sweet spot here would be to build a portfolio composed of multiple strategies running in parallel, with capital being reallocated between them based on the opportunity set at a given time. Such funds may also choose to optimise for a high Sharpe ratio (i.e. high volatility-adjusted returns), resulting in a stable and uncorrelated bond-like return earned through the judicious provision of liquidity and arbitrage of various forms of rates, prices, spreads, and yields across multiple networks.
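As a reminder of what’s being optimised: the Sharpe ratio is just mean excess return divided by its standard deviation. A minimal sketch with hypothetical monthly return streams:

```python
import statistics

# Sharpe ratio of a return stream (per-period, not annualised).
def sharpe(returns: list[float], risk_free: float = 0.0) -> float:
    excess = [r - risk_free for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

# Hypothetical monthly returns: a steady arbitrage/liquidity book vs a
# volatile directional book with a similar average return profile.
arb_like = [0.010, 0.008, 0.012, 0.009, 0.011, 0.010]
directional = [0.15, -0.12, 0.20, -0.08, 0.10, -0.05]
```

The steady stream scores a far higher Sharpe despite smaller absolute gains — which is exactly the bond-like, uncorrelated profile described above.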
If cryptonetworks become the pieces of core public infrastructure we think they will, there will be a place for this type of strategy in most institutional portfolios, alongside REITs and regulated utilities. Once upon a time, nobody but the craziest capitalists invested in electricity.
When we started working on our fund of funds strategy a year ago, we were convinced that the emergence of thousands of these networks was going to open up a plurality of crypto-native financial opportunities: things that are only possible because of public decentralised blockchains. We’ve learnt a lot over the last twelve months, and it has only served to reinforce this belief.
This is just the beginning. Traditional strategies will continue to port over to crypto and these will be obvious and well understood. Meanwhile, a subset of managers are going to dedicate the time to examining the functioning of these networks to the point of deep intimacy, and they will build strategies around arbitraging the opportunities they find. In doing so they will help the networks achieve the efficiency and depth required to graduate them from experiments into global public infrastructure. We look forward to finding, exchanging with and investing in these managers.