The Fabric Ventures Investment Thesis
A Generational Platform Shift
History
We are at the start of a paradigm shift in software architecture: the wave of decentralised data networks. The movement that has built up over the past few years goes beyond Bitcoin and other crypto-assets, or even open source software and blockchains. Taking a broader view, it is the victory of peer-to-peer data networks based on open standards, it reflects the power of properly aligned economic incentives, and it begins to harness the individual data centres in everyone’s pocket, desktop, car, living room and wrist. The movement is made possible by the proliferation of access to high-speed wireless broadband, rapidly maturing cloud-native software and a surge of recent machine learning advancements.
Over the past few decades, technology architecture advancements have commoditised operating systems and software packages by making them globally accessible via data centres and cloud infrastructure. In this new wave, the data centre is being spread to the very edge of the network and the data itself is being ‘open-sourced’, commoditised into reusable trusted building blocks. Distributed users and machines interact with this data via a substrate of peer-to-peer networks. These peer-to-peer data networks become a ‘fabric’ that validates and curates information inputs without the need for third parties, while empowering individual users with their own data in a usable, secure and scalable manner.
The Sovereign Individual
By empowering users with their own data, the decentralised data networks wave is breaking down the colossal data silos that have been the lifeblood, but increasingly the untenable liability, of today’s tech giants. Driven by centralised failures such as Equifax’s data breach, which cost 145m Americans their privacy, Facebook’s revelation that Cambridge Analytica harvested personal data from 87m accounts, and the recent compromise of a further 50m logins, users are becoming wary of entrusting their personal data to these centralised silos. While Google quietly discarded its “Don’t be evil”¹ motto, here at Fabric Ventures, we are much more interested in a new software architecture, where the motto becomes “Can’t be evil”²: an architecture in which all users control their data locally, enabling the rise of the sovereign individual. As Yuval Harari posited at a recent TED conference³, totalitarianism is not the ugly force that is often depicted in retrospect; instead, it presents itself as a seductively simple solution to the problems of the day, one that can seem foolish to resist. Harari argued that the concentration of data-sets within certain commercial or governmental organisations may be tempting them to master and personalise this technique to a degree hitherto unimagined. We need organisational structures, data architectures, incentives and technologies that neutralise this danger. Beyond eliminating the possibility that third parties behave in an evil manner, the applications built upon these decentralised data networks will make possible a much more intimate bond of trust between each human individual and the computing services delivered via an ever more complete envelope of devices.
A shift to Human-Centric Computing
While this layer of trust has long been a commonality between individuals and communities, it has proven difficult to create in human dealings with hardware devices and software applications. As the potential for software to serve us in increasingly intimate and personalised ways expands, the imperative to deliver a trusted relationship increases. It becomes mandatory when one is sharing personal genetic and physiological information, relying on algorithms to make split-second trade-offs in moments of human peril, or even when filtering the stream of news that constitutes the fact base for our day-to-day decision-making.
In “Sapiens”, Yuval Harari also explores the idea that ultimately, on a historical scale, it is our ability to express abstract concepts through language and to share common beliefs that allowed strangers to cooperate and communities to rise. Technology now has the ability to abstract this trust, with the right incentives encoded at the protocol level, allowing for cooperation and trade at a global scale.
The forthcoming wave of decentralised data networks shifts us from zero-sum capitalism to the compounded benefits of collaborating communities. Perhaps a good way to imagine such evolving coordination with minimal central control is the technological feat that is the modern city. Decentralised Autonomous Organisations built upon blockchain technology can achieve a balance of resilience & efficiency, coordination & incentive, on a whole range of new frontiers. Replacing the owner’s profit with a shared interest in the level of adoption of the network aligns incentives across builders, service providers and users, and substitutes emergent structures for top-down command-and-control power structures, with their high propensity for corruption.
Step Back — This is the Latest Platform Paradigm Shift
Taking a step back and navigating through the business models of technology companies over the past century
To understand where the value might emerge, and hence where as investors and company builders we might fruitfully focus our efforts, it is useful to take the long view over multiple eras of technology-fuelled disruption.
Let’s pick the story up in the postwar American boom of the 1950s and 1960s. As US firms became multinationals, the business model of technology companies was primarily to take a margin on the expensive production of proprietary computer hardware. The result was that computers remained in the hands of a limited group of users, consisting of governments, businesses and wealthy individuals. As the production costs of microprocessors plummeted, a new architecture for computing with radically improved economics flipped the power in the industry from the proprietary hardware systems to the chip manufacturers and software companies. Where Tom Watson at IBM had trouble imagining the need for more than a handful of computers, Bill Gates of Microsoft understood the future would see a personal computer in every home.
With the democratisation of access to personal computers, the 1970s and 1980s led to a new wave in which technology companies shifted their business models to selling cheap hardware with a licensed operating system. With the meteoric rise of Microsoft and its relentless focus on winning over developers, consumers chose the platforms with the greatest array of compatible applications, and the company’s Windows operating system propagated across hardware providers and unified almost all software packages under one roof. By the year 2000, Microsoft’s market share was above 90% of all personal computers sold, and it accrued most of its value from the operating system and application software layers.
However, while Microsoft dominated the desktop, it had never secured the servers in the wiring closets and data centres — these remained the domain of the most successful Unix workstation companies of the 1980s (Sun, Silicon Graphics and IBM). In the early 1990s, Linus Torvalds sought to undermine this expensive hegemony with a cheaper & more open alternative: Linux — an open source, Unix-like operating system for servers. A new wave of technology businesses was unleashed by combining commodity hardware and the Linux operating system with Apache web servers, MySQL and PHP. By 2012, Microsoft’s share of the compute market had fallen to 20%⁴, while by 2017, Linux-based Android had captured 85% of the mobile compute market.⁵
This democratisation of access to cheaper software, coupled with widely available networks, pushed technology businesses to shift their business models towards offering free software & networks with the intent of monetising the data they gather. The tech titans of today have co-opted open source software and combined it with massive monopolised silos of user data to create competitive moats protecting trillions of dollars of market capitalisation. However, as incumbents face more and more data usage issues, users are starting to scrutinise the ownership of their data and governments are pushing through extensive data protection regulation (e.g. GDPR).
Following the democratisation of the hardware, the operating system, the software and the networks, the new paradigm shift we are observing will open up access to the data within networks. As the existing data silos break down, we will observe the commoditisation, in the sense of packaging and access, of the data itself. The question remains — what will the business model of technology companies become, once the data monopolisation angle has eroded or died?
To answer this question, we re-trace the history of open source software development in relation to the motivations & monetisation methods that accompanied it. At its very beginning, the free software movement emerged amongst privacy & security hobbyists, hackers and government entities that realised they could not commercialise their software. The movement hinged heavily on the ethical belief that software should be in the open domain, available and accessible to all.
Developers realised that, beyond making software freely available, the open source model fundamentally improved the software development process. Communities established themselves around projects, reputation layers settled within these communities, and the numbers of contributors, maintainers and users grew exponentially. With the ability to distribute software widely across the world, developers started forming companies that added very thin monetisation layers onto wide distribution networks. In 1993, Bob Young incorporated the ACC Corporation, which sold Linux and Unix accessories and later became Red Hat. Around the same time, in 1994, Monty Widenius (Advisor to Fabric Ventures & Founder at OpenOcean) began working on MySQL, which consolidated itself into the Linux, Apache, MySQL & PHP, or ‘LAMP’, stack⁶ and became the world’s most popular open source database before MySQL AB was acquired by Sun Microsystems for $1 billion in 2008.
Over the past two decades, as large corporations realised the viability and benefits of open source development, the entire world has become reliant on open source software. The React and React Native JavaScript development tools are primarily maintained by Facebook, while Google has made countless contributions to Android, Kubernetes and Go, amongst others. Microsoft, which would have been considered the primary opponent of open source less than 20 years ago, has reinvented itself into the company supporting the largest number of open source developers in 2017 and recently acquired GitHub for $7.5bn. The tech giants have co-opted open source software, have mostly stopped charging for software, and have built their entire business around monetising their users’ data instead: creating trillions of dollars of market cap by using software that they don’t own, and data that isn’t theirs.
Tech giants have created trillions of dollars of market cap by using software that they don’t own, and data that isn’t theirs.
Unfortunately, in this third wave of open source software development, the developers have lost the ethical aspirations and romantic motivations that drove the first wave, often without benefiting from the financial upside or the reputational rewards that drove the second wave.
With Satoshi Nakamoto’s 2008 release of the Bitcoin whitepaper, we entered the fourth age of open source software development: by solving the ‘double spend’ problem and creating ‘digital scarcity’ in decentralised systems, Satoshi laid the foundation for the integration of a digital value transfer layer within peer-to-peer networks. This fundamental architectural breakthrough enables open source networks to reward and incentivise contributors without a central authority or sponsor. Permissionless innovation on open platforms and ‘trustless’ peer-to-peer networks, combined with token-driven incentive and governance systems, has started a Cambrian explosion of developers and ecosystems uniting around open source projects. We believe that the open source development movement has finally found its elusive ‘business model’ — a model which doesn’t necessarily reward a single central entity, but instead fairly incentivises all contributors and participants, creating a distributed digital economy within each network.
A Paradigm Shift Driven by Three Mega Trends
The success of the Web 2.0 era over the past two decades has largely been dominated by three fundamental technologies: Cloud, Social & Mobile. We’re convinced that the coming decades of technical innovation will be driven by the interplay between Edge Computing, Machine Learning and Decentralised Data: swarms of edge devices capturing millions of data points, advances in machine learning algorithms ingesting this wealth of data, and a substrate of decentralised data networks enabling secure & scalable communication, coordination and fair incentivisation.
Compounding these three technological waves will open up a wealth of data that is currently locked away for privacy, trust or competitive reasons. In 2010, the world produced around 1 zettabyte of data.⁷ According to McKinsey, in 2016, the world produced 16 zettabytes of data, yet analysed only 1% of it.⁸ By 2025, the world’s data generation is expected to surpass 160 zettabytes.⁹ Previously untapped data sources, privacy-preserving provenance and granular value distribution will lead to breakthroughs that are inconceivable today, such as personalised medical predictions via genomic data, coordination of distributed autonomous agents and the unlocking of new, unexplored monetisation methods for data generators.
However, if the siloed data structures of today are not rapidly upgraded, the proliferation of available data and the efficiency of machine learning algorithms could quickly lead us to a dystopian future of surveillance capitalism and politics capable not only of predicting our decisions, but of tapping into our emotions to ‘benevolently’ prescribe decisions we are yet to make. In effect, they would be acting on a future foreseen with immense precision, with objectives that are neither transparent nor necessarily aligned with our own. Decentralised data architectures will not only prevent tech giants and other data monopolies from gaining such pervasive power, but will instead enable individual actors to enhance their lives with this new wave of applications whilst both maintaining control over their privacy and gaining financial returns from their participation.
Novelty of Tokens & Cryptoeconomics
A fundamental problem that has historically plagued network architects of all varieties can be simplified down to the mismatch between value creation by a network and value capture by an equity structure. Equity structures derive their value from the future cash flows generated by the central company’s ability to extract revenues from its customers at a net profit. This system worked well for companies selling goods and services: Apple selling premium hardware, or Netflix and Spotify selling monthly memberships. The equity model does, however, result in a dangerous divergence of interests when applied to networks in which the core value lies in cheap distribution and user-driven content creation: Twitter has had difficulties monetising the content created by its user base, Facebook had to turn to an almost dystopian & panoptical model to monetise its user base, and open source networks never managed to properly monetise their full value creation. While the community generates the valuable content inside a network, the user is transformed from customer into product. The fundamental mismatch lies in the fact that a central entity attempts to capture the entire value created by the community of users, which receives no financial upside in return.
By moving away from a central equity company governing the network, and instead modelling the network as a digital economy with a native token, we can not only increase the value captured, but also distribute it to the actual value creators. This digital economy uses tokens as manifestations of digital scarcity within a network, which are used to incentivise distributed people, machines and other actors to contribute and manage valuable resources, work and usage. Representing the digital scarcity of the network (e.g. compute power, human labour, content creation or governance) as a digital token renders it upgradable and infinitely flexible. These tokens become a programmable digital link between humans and the assets they own — both virtual (e.g. personal data) and physical (e.g. real estate). What these tokens enable is a cleverly architected balance of network-intrinsic stake & utility for users, developers, resource providers (e.g. miners) and capital providers (e.g. investors), achieved through Token Economics — the new frontier of incentive mechanism design. As tokenisation allows a wholesale re-imagining of ownership, even beyond purely digital assets, existing assets will enjoy the potential for improved liquidity, transparency, access, compliance and taxation, which will drive their tokenisation and the ultimate supremacy of the new crypto capital markets.
A couple of decades back, we witnessed data and content shifting from analogue to digital distribution. This allowed everything from creation to distribution to monetisation to be reimagined. The impact of digitisation on newspaper, television and movie content is well documented: new titans like Netflix and Spotify have arisen, while firms like Blockbuster and Kodak have been marginalised or have disappeared. We have now become convinced that tokenisation will be to ownership as digitisation was to content.
“Tokenisation is to ownership as digitisation was to content.”
Types of Tokens
Diving into the universe of tokens, we classify their characteristics into three core categories: Currencies & Commodities, Utility Tokens and Security Tokens. Bearing in mind that any individual token might exhibit several of these characteristics at the same time, or even evolve its characteristics over the life cycle of the underlying network, we have summarised our view of these characteristics below:
- Store of Value (SoV) Tokens rely on their censorship-resistant and peer-to-peer transaction features to provide a store of value that is largely uncorrelated with any other market, commodity or currency. Examples include the likes of Bitcoin, Monero and Zcash, which vary slightly in transaction speed, network security & privacy. These come closest to being the equivalent of currencies, and when considering the quantity theory of money, their dynamics might be understood with the equation of exchange (MV = PQ); a worked example follows this list.
- Stablecoins aim to decouple volatility from cryptoassets by providing a digital asset pegged to a fiat counterpart (e.g. USD), used mainly as a unit of account & medium of exchange. The three main categories are:
a. Centralised IOU Issuance — kept stable by an equal reserve of fiat that is centrally held.
b. Collateral Backed — over-collateralised by cryptoassets such as Ether escrowed trustlessly.
c. Seigniorage Shares — recreating an algorithmic central bank that maintains stability with levers on supply and demand.
- Payment Tokens have been the simplest and most iterated version — they’ve often been forcibly implemented into networks as the sole method of payment for the digital asset provided by the network. As a result, they come close to being currencies within a digital economy, without ever becoming investable, liquid or stable enough to act as stores of value. Instead, at a future equilibrium, they’ll look closer to a form of working capital, which users will try to minimise due to the opportunity cost of capital. As a result, they are likely to end up with an extremely high velocity but low value accrual. Given the nature of open source code (copyable & forkable), these token models run a high risk of being forked and replaced by equivalent protocols that enable payments in a proper SoV token.
- Security Tokens are tokenised representations of assets ranging from traditional commodities & equities, to pieces of art, all the way to pieces of virtual land in the form of crypto collectibles. The former rely on a strong guarantee of ownership of the underlying asset and can be valued by the worth of that asset, with a premium for liquidity, divisibility and accessibility. The latter often represent scarce digital assets that are valued like art or real estate — i.e. by the fame of the creator, the location in a digital landscape & the overall demand for the asset.
- Governance Tokens give holders a vote in how a network is run, where developers focus their efforts & when software upgrades should be implemented. As the value of a network goes up — via the number of companies running on top of it or the number of transactions it handles — the ability to influence its development becomes a scarce resource. The price of voting power in such a network is in fact likely to scale more than linearly with the value it secures. This token feature is generally combined with one of the aforementioned token designs.
- Discount Tokens give owners the right to discounts on purchases of an asset provided by a digital network. Buying into a discount token can be equated to buying into a co-op and gaining the right to a set discount percentage of all economic activity within the network. As the value and activity of the network grow, the holder of the token can claim a larger value of discounts — effectively modelling a royalty fee that can only be claimed in the form of network services (no monetary payouts).
- Work Tokens operate on the idea that service providers need ‘skin in the game’ to be incentivised to provide high-quality work for the network. Whether the work is objective, such as providing computational resources, or subjective, such as qualitative ratings, service providers are obliged to stake a certain amount of tokens into the network in exchange for the right to provide profitable work. If the work is done ‘correctly’, the service provider is awarded the fees paid by the user (not necessarily in the native token). In contrast, if the service provider acts maliciously, their stake is slashed & distributed to other service providers (a minimal sketch of these mechanics follows this list). As the network grows in usage, there will be more immediate and future profitable work to be delivered, leading to a rise in the number of service providers wishing to deliver it. As a result, demand for these work tokens will increase, and given their fixed supply, the price of the tokens should rise with usage of the network. We’ve further explored the value derived by work tokens in a dedicated post.
- Burn & Mint Equilibrium Tokens are architected on two simple characteristics: users of the network pay for services by burning tokens to the value of a USD-denominated fee, while in parallel a constant inflation process mints new tokens (with issuance denominated in the native token). As service providers are referenced by users for each burned token, they receive an allocation of the newly minted tokens as payment. As a result, when platform usage increases and users burn more tokens than are issued through inflation, the supply will decrease and push up the price per token; the toy simulation after this list illustrates these supply dynamics.
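As a worked illustration of the equation of exchange mentioned above, the sketch below rearranges MV = PQ into M = PQ / V to back out an implied token price. Every figure is hypothetical, chosen purely for arithmetic clarity:

```python
# Equation of exchange: M * V = P * Q, rearranged to M = (P * Q) / V.
# All figures below are hypothetical -- not a valuation of any real network.

annual_onchain_volume_usd = 10e9  # P*Q: USD value transacted through the network per year
velocity = 5.0                    # V: times the average token changes hands per year
token_supply = 100e6              # tokens outstanding

monetary_base_usd = annual_onchain_volume_usd / velocity  # M = PQ / V
implied_token_price = monetary_base_usd / token_supply

print(f"required monetary base M: ${monetary_base_usd:,.0f}")   # $2,000,000,000
print(f"implied price per token:  ${implied_token_price:,.2f}")  # $20.00
```

Note how doubling velocity halves the implied price: the same dynamic behind the high-velocity, low-value-accrual concern raised for Payment Tokens above.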
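The stake-and-slash mechanics described for Work Tokens can be sketched in a few lines. This is a minimal, hypothetical model; real networks differ in stake sizing, slashing proportions and payout rules:

```python
class WorkTokenNetwork:
    """Minimal, hypothetical model of work-token staking and slashing."""

    def __init__(self, min_stake):
        self.min_stake = min_stake
        self.stakes = {}  # provider -> staked tokens

    def stake(self, provider, amount):
        # providers must stake to earn the right to provide profitable work
        if amount < self.min_stake:
            raise ValueError("stake below the minimum required to provide work")
        self.stakes[provider] = self.stakes.get(provider, 0.0) + amount

    def settle_job(self, provider, fee, performed_correctly):
        """Pay out the user's fee for correct work; slash the stake otherwise."""
        if performed_correctly:
            return fee  # fee earned by the provider (not necessarily in the native token)
        slashed = self.stakes.pop(provider)  # malicious: forfeit the entire stake
        remaining = sum(self.stakes.values())
        if remaining > 0:
            for p in self.stakes:  # redistribute pro-rata to the remaining providers
                self.stakes[p] += slashed * self.stakes[p] / remaining
        return 0.0


net = WorkTokenNetwork(min_stake=100.0)
net.stake("alice", 500.0)
net.stake("bob", 500.0)
net.settle_job("alice", fee=20.0, performed_correctly=False)
print(net.stakes)  # {'bob': 1000.0} -- alice's slashed stake flows to bob
```

With a fixed token supply, a growing pipeline of fee-paying jobs raises the amount providers want to stake, which is the demand-side mechanism described in the bullet above.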
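And for Burn & Mint Equilibrium, a toy simulation of the supply dynamics. The numbers, and the fixed price, are hypothetical simplifications; in practice the price would respond to the shrinking supply:

```python
# Toy burn-and-mint simulation -- all parameters hypothetical.
supply = 1_000_000.0        # circulating tokens
price_usd = 1.0             # held fixed here purely to isolate the supply dynamics
mint_per_period = 10_000.0  # fixed issuance allocated to service providers
usage_usd = 25_000.0        # USD-denominated fees burned by users each period

for period in range(1, 6):
    burned = usage_usd / price_usd      # users burn tokens worth the USD fee
    supply += mint_per_period - burned  # net supply change: mint minus burn
    print(f"period {period}: burned {burned:,.0f}, "
          f"minted {mint_per_period:,.0f}, supply {supply:,.0f}")

# With $25,000 of usage and 10,000 tokens minted at $1, supply contracts by
# 15,000 tokens per period -- the deflationary pressure described above.
```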
What’s At Stake for Investors
The role of an investor in this plethora of token models must evolve from simple capital allocation to active participation within the networks. From community building, to token engineering, to actively running nodes, to actively managing liquid positions, venture capital funds active in this space will soon be required to become operationally involved within networks. This fulfils their fiduciary duty of maximising the financial return on capital for their LPs, and also helps bootstrap the networks in which they are invested. Ambitious founders, who understand that the path to enduring success will be one of twists and turns, will initially turn towards patient institutional partners that inject both capital and work into their networks; only once the networks are live with a minimum viable number of nodes will they start attracting specific user groups or other relevant stakeholders through targeted sales or air-drops of tokens.
The most active investors will continuously engage with networks on multiple fronts over the duration of their investment:
- Staking: As Proof of Stake (PoS) or Delegated Proof of Stake (DPoS) networks go live on mainnet, token holders will have the ability to stake their tokens in order to provide profitable work to the network (validating transactions, computing, arbitration, transcoding or providing security) and be rewarded by the network (e.g. block rewards) or by its users (e.g. transaction fees). In DPoS networks, operators within the network can do the work as a service for a share of the payouts, and token holders can delegate/bond their tokens to those operators; a simple payout sketch follows this list.
- Voting: Numerous networks use their tokens as a governance tool — whether through simple token voting, quadratic voting or liquid democracy, the tokens give their holders a voice. Long-term investors will participate in the governance process of a network, and steer it in the direction of their best interests.
- Curating: With Token Curated Registries (TCRs), early investors in such networks will need to actively participate in the curation process, both keeping the registry at high quality and signalling the quality of the curators involved.
- Running Nodes & Simple Usage of the Network: As investors may also be users of networks, they might actively build early iterations of use-cases for them. From running nodes within a network for their own data-driven tracking purposes to actively participating within the network’s economy (e.g. buying services/assets), these investors will initially seed the ecosystems built on top of the networks.
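As a concrete, and entirely hypothetical, illustration of the delegation economics in the staking bullet above: a delegator’s periodic payout in a DPoS network can be approximated as their pro-rata share of rewards, net of the operator’s fee.

```python
# Hypothetical DPoS delegation payout -- parameters are illustrative only.
total_bonded = 1_000_000.0  # all tokens bonded to a given operator
delegator_stake = 50_000.0  # our bonded position with that operator
block_rewards = 1_200.0     # rewards earned by the operator this period
operator_fee = 0.10         # share of rewards the operator keeps for running the node

delegator_share = delegator_stake / total_bonded
payout = block_rewards * (1.0 - operator_fee) * delegator_share
print(f"delegator payout this period: {payout:.2f} tokens")  # 54.00
```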
Beyond simply committing to actively participate in portfolio networks, Fabric has already begun setting up the infrastructure within certain testnets. Some of our early experiments of running nodes within the Ocean Protocol testnet are documented in a dedicated blog post.
Europe: a Prime Mover
Developer explosion & 100s of years of academic strength
Europe’s army of developers and technical talent stems from hundreds of years of technical academic excellence. Europe is home to 5 of the top 10 technical universities in the world and, year after year, graduates twice as many STEM PhDs as the US.¹⁰ According to Stack Overflow, Europe houses 5.5m developers, compared to 4.4m in the US.¹¹ The technical talent has always been present in Europe — but historically the banking sector was the dominant employer of developers, and only after 2008 did the technical talent begin to flee its clutches. Now, for the first time, with this wave democratising access to capital, the technical talent doesn’t need to migrate to the US to raise venture capital and build global companies. As a result, the value created by founders across Europe has significantly outpaced any other geographic region: $4.1bn was raised by European projects in 2018 alone — versus $2.3bn in Asia and $2.6bn in the US.
Distributed Teams from the Start
As a continent made up of 50 vastly different countries, Europe is accustomed to working in distributed teams and building with a global outlook. Any nascent company in Europe has always recognised the need for an international roadmap from day one, to avoid the traps of small domestic economies. Diverse and multinational teams have become the standard, and London, Berlin, Paris and Amsterdam have all consolidated themselves as hubs for technical innovation. We call this the “Technaissance of the European City State”.
Entrenched Counter Culture
Europe’s largest advantage probably stems from its history over the past centuries. While the centralised business model fits the Silicon Valley ideals of dense capital pools and narrowly aligned ideologies, the decentralised network model fits Europe’s entrenched history of political fractures and the counterculture movements that ensued. Europeans have lived through two unthinkably disastrous world wars fought on home soil. Europeans have been brought up in families that lived through communist and/or fascist dictatorships. Europeans were split by the Cold War for over 40 years. Europeans have lived through totalitarian regimes that annihilated the idea of privacy and freedom of speech and regularly seized personal goods and assets from their own citizens. Decentralised networks will certainly deliver tactical benefits to individuals, but they also hold the promise of keeping in check the society-wide nightmares that Europeans have experienced directly and acutely over the past century.
Regulation
From a regulatory perspective, the EU has been at the forefront, pushing directives such as GDPR to protect the digital data rights of the consumer, and PSD2 to open up the financial system to the benefit of the user. Beyond that, a race has begun between countries to provide the clearest and most welcoming regulatory framework for the new business models and governance structures of decentralised networks. France’s Minister of the Economy has openly stated that France will not “miss out on the blockchain revolution” and will become a global hub for ICOs¹². Switzerland has already issued a very clear regulatory framework for token-based networks.¹³ The UK Chancellor has announced a crypto asset task force to create a constructive regulatory framework for tokens.¹⁴ Smaller countries like Malta and Liechtenstein have sprinted ahead with open regulatory approaches, attempting to become global hubs for decentralised networks and related businesses. On the more general data usage side, the US has been noted for its untempered, business-driven approach to the use of AI, and China for the totalitarian over-reach that AI might enable. There is talk of a ‘third way’, towards Ethical AI, that Europe might yet pioneer: a path that builds upon the framework provided by GDPR and embraces innovations in cryptography, in conjunction with the power of AI, to deliver ever more personalised services whilst still respecting personal privacy.¹⁵ ¹⁶
The Fabric Playbook
Our number one objective as a fund is to be singled out as the partner of choice for the finest and most discerning entrepreneurs and technologists in our domain. The next chapters outline in practical terms our approach to achieving this, and to delivering further value to these networks over time.
Grass-Roots Sourcing
While previous technology waves enabled new business models to flourish on top of new tech platforms, the move to decentralised data networks allows monetisation within the base protocols, as well as at the application and business layers. As a result, contrary to previous technology waves, much of the early value creation can be heavily concentrated in the technical infrastructure layers built by developers, for developers.
There is inevitably a minimum time it takes for consumer-facing applications to be iterated to sufficient usability, with the significant raw technological advances between Web 2.0 and Web 3.0 blending into the background of successful applications. Consumers will in turn overcome certain psychological barriers in use-cases that did not previously feel natural — and hence adopt new habits. New products and new markets will be supported by new go-to-market strategies and forms of distribution — such as financial rewards, token airdrops and decentralised app stores.
We believe that, particularly in the coming 2–3 years, the majority of interesting projects will be focused on the infrastructure layer: the developer tools that will serve teams in areas ranging from smart contract libraries to data management frameworks; the ‘picks and shovels’ that will facilitate the transition to Web 3.0 for both retail and institutional investors; and the underlying protocols that will provide computation, storage and data privacy.
Consequently, our craft as investors will focus on sourcing innovative projects from developers, reviewing their GitHub commits as a leading indicator of development and adoption, and assessing their fit within the decentralised application stack. In a continuation of an ongoing shift, the greatest investments of tomorrow will be discovered by attending (and contributing to) hackathons instead of startup pitch competitions. In a time when software is eating the world¹⁷, and open source is eating closed source software, developers are the next kingmakers. The developers are those with whom we will be spending our time.
Sectors Ripe for Disruption
While it is undeniable that Bitcoin, by coupling a limited supply with an immutable ledger, introduced a viable digital store of value as an alternative to government-issued currencies and gold, we are convinced this wave of technology will bring a wealth of new opportunities beyond just a form of money. In the realm of finance, we believe compliance will be automatically built into the transfer function of any and every asset; credit scores & insurance premiums will dynamically adapt to a myriad of data sources on a global scale; and the whole notion of capital markets is on the brink of being modularised into programmatic financial primitives: a set of open and permissionless building blocks readily assembled for any financial application, in any market, for any asset one can imagine.
There is an equally large opportunity to reimagine supply chain management from the ground up, from inventory tracking and provenance verification across multiple providers to automated credit financing and auditing. In the automotive space specifically, enabling cross-manufacturer car-to-car data sharing and vehicle tracking opens up a whole new set of possible operations & interactions. Online, tokenisation is enabling global ownership of digital assets for the first time — starting with simple crypto-collectibles but inevitably evolving towards sovereign identity and authentication. Peer-to-peer marketplaces for data, software licences and work providers can flourish in their true form — without an Airbnb or Uber taking a cut of every transaction.
By abstracting away the element of trust, the need for which has led to the prevalence of the middlemen we observe today, decentralised networks will not only fundamentally change our perspective on how businesses operate across all existing sectors, but also introduce entirely new business models — native to the peer-to-peer environment.
Active Network Participation
As previously touched upon regarding the role of an investor in the space, we have seen the opportunities for capital providers evolve radically. By simply locking away our assets in custody, we would not merely miss out on potential returns from network inflation; in many cases we would even be hurting the projects with our neglect. These digital assets are intended to incentivise all actors to participate in the value creation: from capital to work and usage.
Active network participation can take the form of validating blocks in Proof of Stake networks in exchange for block rewards, or of capturing tokens in a Merkle Mine. Additionally, it can focus on providing desirable resources to a network in exchange for fees paid by users, such as storage, data or registry curation. With the right incentives designed into these networks, individuals and specialised service providers will rapidly offer their services to earn rewards and fees. In the case of competing decentralised networks, providers will follow the network participants, and these participants will probably converge towards networks with numerous providers (which can lead to a chicken-and-egg situation). Additionally, there is a risk that network participation is not economically viable in the early days of a network when observed in a vacuum: relying on transaction fees in a network with no transactions, or providing data streams to the Ocean Protocol Network before there are any buyers, will generate losses. It is precisely this niche of scenarios that no rational actor will cater to, other than an early investor. The returns of a VC fund are predominantly driven by maximising each investment’s chances of outlier success, and as a result, this potential for exponential results in a network outweighs the sunk costs of provisioning and seeding it. We therefore intend to help our portfolio networks kickstart their supply-side providers, whether internally or through delegation, and we openly commit to portfolio projects to actively participate within their networks. We have already outlined our plans to run nodes, provide data and stake towards data streams within the Ocean Protocol Network, and are working with a number of our portfolio companies to be ready to provide work to their networks as they launch.
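A back-of-the-envelope sketch of this reasoning, with entirely made-up figures: the operating loss from seeding a young network can be small relative to the expected uplift on an early token position, if participation even marginally improves the network’s odds of success.

```python
# Hypothetical figures only -- a sketch of the early-participation trade-off.
monthly_cost = 2_000.0  # infrastructure + ops for running nodes / seeding supply
months = 18             # bootstrap period before organic demand arrives
fees_earned = 5_000.0   # fees actually earned in a near-empty network

operating_loss = months * monthly_cost - fees_earned  # cash sunk into seeding

position_value = 2_000_000.0      # current value of the early token position
success_multiple = 10.0           # return multiple on the position if the network succeeds
added_success_probability = 0.01  # extra chance of success attributed to seeding

expected_uplift = position_value * (success_multiple - 1.0) * added_success_probability
print(f"operating loss:  ${operating_loss:,.0f}")   # $31,000
print(f"expected uplift: ${expected_uplift:,.0f}")  # $180,000
```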
Accelerating Serendipity
We are strong believers in the value of an engaged community and in connecting smart people to make extraordinary things happen. By immersing ourselves in this community, we will not only be in the position to source the best opportunities at the earliest stages, but we will also curate a network that will contribute to the successful growth of our portfolio. To this end, we host a range of events varying in format and purpose — each designed to bring together a specific subgroup of individuals and solve particular problems:
- Regular technical meetups — focused on bringing together the developer community to collaboratively problem-solve specific research & development issues around infrastructure, token economics, and adoption;
- Larger-scale conferences like CogX — focused on introducing Web 3.0 to a wider audience, presenting the latest achievements in research & development, and helping corporates, governments and investors become more comfortable with newly emerging business models;
- Annual Founders Summit — an intimate gathering designed to showcase the development in portfolio projects and proactively match them with useful partners in the ecosystem;
- Talent dinners — where we proactively match talented engineers, product designers, cryptographers, community builders, or operations super-stars with projects in our portfolio.
Fabric House
We are building Fabric House with a firm commitment to surround ourselves with the smartest developers in London working on decentralised data networks. Given the nature of open source software development, most projects will have distributed teams without a firm ‘home base’. We believe a flexible coworking space for developers building decentralised networks not only provides the opportunity for cross-pollination between projects, but will also serve as one of our greatest assets for sourcing investments. Equally, whenever we spend time in Berlin, we are residents at FullNode — a coworking space for blockchain projects built by Gnosis and Cosmos.
State of the Token Market Research Reports
In January 2018, we launched our regular State of the Token Market reports, focusing on quantitative trends in the token ecosystem, as well as qualitative analyses of significant developments in the space. Deep-diving into aspects such as the evolution of funding mechanisms, insights into crypto hubs globally, trends in development activity as a function of market corrections, and the emerging regulatory landscape has helped us not only stay at the forefront of the evolving European and global scene, but also put Fabric on the radar of established and emerging projects, talent exploring the ecosystem, academics, and industry associations.
Translated into 4 languages and quoted in over 100 media publications and several notable talks — including university lectures — the report in its two editions has become a reference point for developments in the space. Going forward, we intend to supplement the report with more regular newsletters including select data fragments.
Human Fabric
Just as it was in the Web 2.0 era, the fuel of innovation in the decentralised Web — and the scarcest resource, for which the best projects will compete — is talent. As we aim to invest in projects right after their inception, we expect team expansion to be one of the top priorities we can get involved with. As a result, we are building a curated community of talented engineers, product managers and community builders, whom we programmatically match with relevant projects in our portfolio. In the first instance, we use the social mapping platform we have built in-house to proactively identify promising individuals across GitHub, Twitter, Reddit and other networks, and to make tailored recommendations based on the projects they have committed to previously, their location, interests and reputation. Additionally, we host a regular series of events and dinners focused on further cultivating the physical community and allowing portfolio projects to share their vision.
Instrument Agnostic
Having kept pace with the rapid rate of change in investment instruments over the past two years, we will not shy away from further iterations. From equity to tokens to SAFTs to SAFTEs to Security Tokens, the ideal value-capturing investment instrument, one that fosters both governance and milestone-driven funding, is still being refined. To avoid any future technical restrictions, Fabric Ventures is instrument agnostic. Instead of ignoring new investment instruments and being precluded from new opportunities, we will embrace further iterations of instruments, enabling us to work on network participation, custody and liquidity management.
Liquidity & Returns to LPs
Despite the liquid nature of token markets, Fabric Ventures is fully committed to a long term patient capital approach to investing in digital networks. We invest in teams that are building meaningful technology and growing user adoption — and that takes years if not decades.
The liquid market does, however, provide the ability to mark our portfolio to market on a daily basis and to double down on undervalued positions we already hold. Towards the end of the lifetime of the fund, unlike traditional equity shares, tokens will be much easier to sell without negatively affecting the project. Instead of having to push for an IPO or acquisition, we will have the ability to sell to accredited investors on the open market via exchanges that are fully regulated and may even include on-chain compliance frameworks, while also keeping the option to sell tokens to a larger buyer in an OTC sale. There is also an opportunity to make way for capital with a different risk/return lens, or for strategic actors within the network, which might ultimately increase the value of the remaining tokens. We may, however, also choose to sell out of a successful project that nonetheless no longer promises venture-level returns, or indeed be forced to exit rapidly should we believe a team or project has strayed from the course it was on at the moment of our investment. These ‘liquidity modes’ remain a domain of further development and iteration — just as does this broader ‘fork’ of the venture capital model itself.
Conclusion
This is not simply a generational shift in computing architecture; it is a dislocation in organisational principles. A new wave of human-centric services will be interwoven with our everyday lives with unprecedented intimacy, and for humans to trust their machine counterparts not to abuse the growing torrents of accessible data, we will need this layer of crypto-powered privacy and incentives. We believe there is a long journey ahead to build a scalable, secure and privacy-preserving Web 3.0 — starting with the technical infrastructure, developer tools and data management frameworks. To support this vision, Fabric Ventures is adapting the patient venture capital model to investing in decentralised data networks: backing the boldest technologists & communities at the earliest stages, supporting them throughout their journey and becoming active participants within the networks they are building.
References
1. https://www.fastcompany.com/3056389/why-google-was-smart-to-drop-its-dont-be-evil-motto
2. https://medium.com/@muneeb/cant-be-evil-bc5ec16c6306
3. https://www.ted.com/talks/yuval_noah_harari_why_fascism_is_so_tempting_and_how_your_data_could_power_it?language=en
4. http://old.seattletimes.com/html/microsoftpri0/2019853243_goldman_sachs_microsoft_os_has_gone_from_more_than.html
5. https://www.idc.com/promo/smartphone-market-share/os
6. https://en.wikipedia.org/wiki/LAMP_(software_bundle)
7. https://www.apixel.com.sg/blog/the-zettabyte-to-bring-more-business-and-success-to-singapores-economy/
8. http://www.mckinsey.com/business-functions/digital-mckinsey/our-insights/the-internet-of-things-the-value-of-digitizing-the-physical-world
9. https://www.storagenewsletter.com/2017/04/05/total-ww-data-to-reach-163-zettabytes-by-2025-idc/
10. Atomico, State of European Tech 2017
11. Stack Overflow
12. https://www.ft.com/content/2e7b2778-2d22-11e8-9b4b-bc4b9f08f381
13. https://www.ft.com/content/52820f90-1307-11e8-940e-08320fc2a277
14. https://www.coindesk.com/the-uk-government-is-launching-a-cryptocurrency-task-force/
15. https://www.gov.uk/government/news/world-leading-expert-demis-hassabis-to-advise-new-government-office-for-artificial-intelligence
16. https://www.bloomberg.com/news/articles/2018-12-18/eu-presses-for-ethical-ai-with-new-guidelines-for-companies
17. https://a16z.com/2016/08/20/why-software-is-eating-the-world/