How Can Decentralised Technology Help Restore Trust in Our Economy?
Sometimes a movie comes out that makes an impression. One of these for me was The Big Short, released in 2015 and directed by Adam McKay. This movie brilliantly (and accurately) exposed the context behind the global financial crisis that hit us in 2008 and continues to impact our societies economically, but also politically and socially. Here is my takeaway from this masterpiece:
- This financial crisis wasn’t exactly accidental; bankers and financial analysts turned a blind eye to the foreseeable disastrous consequences of creating complex and risky mortgage-based instruments: the infamous CDOs. More than that, credit rating agencies, such as Standard & Poor’s, behaved dishonestly, enabling the banks in their fraudulent activities.
- The division of authority was a pretty facade hiding those who shamelessly abused our trust and counted on low- and middle-class citizens to absorb the cost of their gamble. Austerity policies arose; meanwhile, wealth disparity continued increasing.
- Another critical yet perhaps more subtle message conveyed by this movie concerned the societal consequences of the financial crisis: banks and politicians did not take the blame and instead pinned it on immigrants. This message proved prescient: our political systems became more polarised, and societies rejected the political elites who chose to spare the protagonists of the crisis, which led to the current rise of populist movements.
Back in 2008, while our central authorities spectacularly demonstrated how they failed to protect trust in our economy, a paper was published that introduced a new protocol allowing the transfer of digital cash without trust in a third party. A noticeable feature of that paper was that it was published by someone (or a group of individuals) who decided to remain anonymous. He, she or they, used the pseudonym of Satoshi Nakamoto.
This wasn’t exactly the first work published around digital cash. However, it enabled, as history proved, the first successful decentralised system for digital cash transfers: the so-called Bitcoin blockchain.
In 2010, when a pizza was bought with Bitcoin, a new currency arose. A new currency that was, and remains, limited in supply, and was thought to preserve its users’ anonymity. A new currency that was, at the time, relatively easy to “earn”. The code was open source and easy to install on a computer.
You could quickly spin up a node, or more precisely a “mining node”, and by doing so join the band of anonymous miners who competed to validate transactions and bundle them into blocks.
Roughly every 10 minutes, the miner who successfully produced a valid block would be rewarded with new bitcoins in exchange for the compute power they dedicated to building the next block of transactions appended to the blockchain.
Importantly, Bitcoin was living proof that a trusted electronic cash/digital asset transfer system could exist in a trustless environment. This opened the door to a new economic era, one where intermediaries and middlemen would no longer be needed to establish trust in our economy.
Blockchain was a provably secure system, offering transparency to its users and thus accountability for their actions, immutability and thus data traceability and non-repudiation, as well as accessibility and privacy for all. In that respect, it offered a direct solution to growing problems faced by centralised systems, which present opaque audit layers, undermine our privacy, are increasingly hacked and notoriously exclude communities from our economy.
One perhaps less desirable reason behind Bitcoin’s growing notoriety in the mid-2010s was its token price volatility. The last noticeable crypto-bubble was in 2017, when numerous token prices increased steadily throughout the year and then dramatically fell in early 2018. Although there is still a great deal of speculation around cryptocurrencies, the hype seems to be fading away, leaving space for blockchain, and more broadly distributed ledger technology (DLT), to find real use cases.
As Bitcoin’s code is open source, many blockchains have been created over the last ten years, each trying to address some of the limitations of this pioneering distributed network.
Blockchain has undoubtedly revealed the capability for distributed ledger technologies to disrupt our business models. Bitcoin was the first successful public blockchain to demonstrate the potential for this new technology to be used as a decentralised yet trusted store of value.
Building on this early success, next-generation blockchains, such as Ethereum and Neo, demonstrated the potential for blockchain platforms to provide decentralised computing services, enabling more complex applications and reaching more markets than the straightforward storage of value.
However, the much-anticipated adoption of decentralised computing using blockchain and DLT is still to happen. The enthusiasm for blockchains has proven that a market exists for it, but these early blockchains also demonstrated the difficulty of scaling large public networks to support industrial use cases, as well as highlighting many other challenges, including keeping running costs low and stable and supporting data types other than tabular data.
Before I introduce the work that we have been doing at Atlas City, let’s review the problems that new blockchain technologies have failed to overcome.
The Scalability Challenge
Traditional blockchains maintain an ever-growing chain of blocks of data that allows any node to go back and check the full history of every token maintained on the blockchain as well as any data stored on the blockchain, including in some cases smart contracts.
While this feature is handy for creating trust in the network and making the data stored on the blockchain auditable, it introduces a significant problem known as blockchain bloat: the blockchain grows so large that it becomes impossible to maintain its complete history on a personal computer in order to participate in the ledger management.
In reality, it is rarely needed to validate the complete history of a ledger to authorise a single asset transfer.
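The structure behind this problem can be sketched in a few lines of Python. This is a toy illustration only (not real Bitcoin code): each block commits to the hash of its predecessor, so trusting any block means being able to walk, and therefore store, the entire history behind it.

```python
# Toy hash-linked chain: illustrates why full-history validation
# forces every validating node to keep the whole ledger.
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministic hash of a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash: str, transactions: list) -> dict:
    return {"prev_hash": prev_hash, "transactions": transactions}

def validate_chain(chain: list) -> bool:
    # Every block must reference the hash of its predecessor; a single
    # transfer can only be trusted after checking the whole history.
    for prev, block in zip(chain, chain[1:]):
        if block["prev_hash"] != block_hash(prev):
            return False
    return True

genesis = make_block("0" * 64, ["coinbase -> alice"])
b1 = make_block(block_hash(genesis), ["alice -> bob"])
b2 = make_block(block_hash(b1), ["bob -> carol"])
print(validate_chain([genesis, b1, b2]))  # True
```

Tampering with any historical block breaks every hash link after it, which is what makes the ledger immutable, but it is also why the validation cost grows with the chain itself.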
Concerns around democracy and sustainability
The production of a valid block is often based on a competitive process, and on a public blockchain, anyone in principle is welcome to join the competition. In Bitcoin, the algorithm used during the competition is called Proof-of-Work (PoW) and is designed to allow participants in the network to reach consensus without trusting one another.
With the PoW algorithm mining nodes collect and validate all transactions broadcast to the network and form a block with these new transactions. The miners compete to solve a computationally hard problem, the solution of which is used to prove that a block is valid and can, therefore, be appended to the blockchain.
The solution thus acts as the “proof of work” performed by a miner. Although the solution to the cryptographic puzzle is hard to find, it is very easy to verify, which allows the blockchain to be updated quickly and securely.
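This asymmetry (a solution that is hard to find but trivial to verify) can be sketched in Python. The snippet below is an illustration of the principle, not Bitcoin’s actual block format or difficulty encoding:

```python
# Minimal Proof-of-Work sketch: mining is a brute-force search for a
# nonce, while verifying a claimed solution costs a single hash.
import hashlib

def pow_hash(block_data: str, nonce: int) -> str:
    return hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()

def mine(block_data: str, difficulty: int) -> int:
    # Expected work grows 16x with each extra leading hex zero required.
    nonce = 0
    while not pow_hash(block_data, nonce).startswith("0" * difficulty):
        nonce += 1
    return nonce

def verify(block_data: str, nonce: int, difficulty: int) -> bool:
    # One hash suffices to check the proof.
    return pow_hash(block_data, nonce).startswith("0" * difficulty)

nonce = mine("block #1: alice -> bob", difficulty=4)
print(verify("block #1: alice -> bob", nonce, 4))  # True
```

Raising `difficulty` by one multiplies the expected mining work by sixteen while leaving the verification cost unchanged, which is exactly the lever the network uses to keep block production regular.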
The level of difficulty attached to the cryptographic problem solved by the miners is set by the network to ensure that blocks are produced at a regular time interval. It is not driven by the transaction throughput. Its sole purpose is to maintain a high level of security on the network across time.
Indeed, the difficulty target increases as miners invest in more powerful and expensive computers in an attempt to win the competition and earn more rewards. This, in turn, leads to a high risk of mining centralisation. Whilst the first Bitcoin blocks were mined by individuals with modest computer resources, few miners work independently nowadays. Most join mining pools where they share their computer resources and the rewards they collect.
Around 80% of the mining pools are currently located in China where the electricity is considerably cheaper than in other parts of the world. One can imagine a scenario where a pool or set of pools could decide to tamper with the ledger for economic and/or geopolitical reasons.
Such a competition-based consensus algorithm also poses a global threat that simply cannot be ignored nowadays: the excessive amount of energy required to operate such a system. The combined yearly energy consumption of Ethereum and Bitcoin is currently comparable to that of Switzerland.
For a distributed ledger project to truly become a digital economy enabler, it must be environmentally-friendly and indeed managed in a distributed manner.
A popular alternative to PoW currently considered by several blockchain projects is the Proof-of-Stake (PoS) algorithm. This approach partially addresses the energy concerns of PoW by assigning the task of producing the next valid block to a subset of miners.
The miner nodes can be selected randomly or based on criteria such as the miner’s wealth (stake). The main concern with a PoS-based consensus mechanism remains the risk of centralisation of wealth, and consequently of network management, with the mining work inevitably distributed to a few wealthy nodes.
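A generic stake-weighted selection can be sketched as follows. This is an illustration of the principle, not any particular chain’s algorithm, and the node names and stakes are made up; it also makes the centralisation concern visible, since the wealthiest node ends up producing most blocks:

```python
# Toy stake-weighted validator selection: each node is picked with
# probability proportional to its stake.
import random

stakes = {"node_a": 700, "node_b": 200, "node_c": 100}  # hypothetical stakes

def select_validator(stakes: dict, rng: random.Random) -> str:
    # Draw a point in [0, total stake) and find whose interval it lands in.
    total = sum(stakes.values())
    point = rng.uniform(0, total)
    cumulative = 0.0
    for node, stake in stakes.items():
        cumulative += stake
        if point < cumulative:
            return node
    return node  # numerical edge case: fall back to the last node

rng = random.Random(42)
picks = [select_validator(stakes, rng) for _ in range(10_000)]
print(picks.count("node_a") / len(picks))  # roughly 0.70: wealth dominates
```

With 70% of the stake, `node_a` wins roughly 70% of the selections, so block production, and the rewards that come with it, concentrates on the already wealthy.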
Concerns around privacy
Another challenge for DLT projects is individual privacy. Public blockchains often lack privacy layers around individual actions. Flow analysis of assets publicly available on a blockchain, combined with online data, has proven successful at revealing users’ identities.
Following the scandals of misuse of our data by social media platforms and governmental entities, a series of laws and regulations have been put in place across the world to reaffirm the privacy of users and their virtual selves.
DLT projects must offer services that respect our privacy and comply with the rules and regulations of our countries.
Our project at Atlas City: Catalyst Network
Atlas City started from scratch by first surveying operational requirements and existing limitations, in order to create a new distributed computer system capable of overcoming those limitations while meeting those requirements.
The codebase developed by Atlas City engineers, named Catalyst, is original (it does not fork from any other codebase) and will be made available as open-source software. Catalyst is a full-stack distributed network written using the .NET Core framework.
Learning from existing blockchains and distributed ledgers, as well as the broader IT industry, we have outlined the vital objectives that we believe form the set of requirements that must be met by Catalyst.
Become increasingly decentralised at scale.
The Catalyst network must be able to run nodes on limited-resource devices, such as IoT devices, as well as on those with more considerable computing power. Collectively, across a network of nodes, this provides significant distributed computer resources to maintain a ledger securely. Network performance should, as a result, improve as the network scales up.
This led us to rethink the architecture of the Catalyst ledger, as well as the consensus mechanism employed to manage it, to eliminate bloating effects and to scale to meet future data and distributed service demands. In my next article, I will outline the data architecture of the Catalyst ledger that makes it possible for anyone (or anything) to maintain a copy of it.
Promote sustainability and accessibility to all.
One of the innovations behind Catalyst, which I will also describe in this series of articles, is its consensus mechanism. We designed it with the idea in mind that anyone should be welcome to contribute to and earn from the network, not just people who can afford expensive mining equipment or large stakes.
As such, it is not based on a competitive process. Instead, the nodes in the network collaborate to build the correct update of the ledger state collectively. The algorithm used by nodes to produce a valid ledger state update does not require the execution of computationally expensive tasks, thus allowing nodes with limited resources to contribute while promoting sustainability.
This new consensus mechanism follows the principle of a decentralised voting system where each node can vote on the correct update of the ledger state and is rewarded for its contribution.
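As a toy illustration of this voting principle (this is not Catalyst’s actual protocol, and the node names and hashes below are hypothetical), each node could submit the hash of the state update it computed, with the majority value adopted and the agreeing voters rewarded:

```python
# Toy vote-based consensus: nodes vote on a state-update hash; the
# majority value is adopted and matching voters are rewarded.
from collections import Counter

def tally(votes: dict) -> tuple:
    # votes maps node id -> hash of the state update that node computed.
    counts = Counter(votes.values())
    winner = counts.most_common(1)[0][0]
    rewarded = sorted(node for node, h in votes.items() if h == winner)
    return winner, rewarded

votes = {
    "node_1": "hash_A",
    "node_2": "hash_A",
    "node_3": "hash_A",
    "node_4": "hash_B",  # a faulty or malicious node disagrees
}
print(tally(votes))  # ('hash_A', ['node_1', 'node_2', 'node_3'])
```

Note that tallying votes costs no meaningful compute, which is what allows resource-limited nodes to take part, in contrast to the brute-force search that PoW requires.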
Welcome industry knowledge and standards.
The Catalyst network must be able to benefit from the investment in skills and technology already made by businesses, by allowing decentralised applications (dApps) to be developed in popular industry programming languages and frameworks. It should have recognisable and straightforward pricing models for dApps, more in line with cloud computing, and allow rich file types such as documents and videos to be stored and shared efficiently.
We designed Catalyst to enable Web3 and a new generation of online services that respect the privacy and confidentiality of users: decentralised messaging, email and web applications which give users control of their data while creating new markets for online services.
We are at the dawn of a new technological era, one which has the potential to help re-establish the trust in our economy that we lost over a decade ago. Like any new technology, the first blockchains came with weaknesses that need to be addressed before their promise of a fairer world can be fulfilled. We, at Atlas City, embraced these challenges when developing Catalyst. As a new distributed network, Catalyst improves upon those that came before.
Catalyst was designed around the notion that a democratic and ethical network can exist which is secure, decentralised, scalable and private. Its codebase includes original and innovative work, including a new collaborative and environmentally friendly consensus protocol, the ability to process both confidential and non-confidential transactions as well as smart contracts, an efficient peer-to-peer communication layer, and a multi-levelled data architecture for a lean ledger database storing a variety of data.
You can learn more about the relevant technical characteristics of the network by reading the Introduction to the Catalyst Network and the consensus paper that we recently released. In the following weeks, Joseph Kearney and I will be posting Medium articles that provide our readers with technical insights into the Catalyst ledger architecture and its new consensus mechanism.
Whilst Joseph and I are delighted to present you with some of the research that has been put into the design of Catalyst, we leave it to the fantastic team of engineers at Atlas City (Darren Priestnall, NshCore, franssl, Stephen Horsfall) to give you their detailed insight into the technology they have been building.