The important thing about most blockchains that seems to be missed by everyone

GEO Protocol
Sep 30, 2017

The blockchain, as the core technology of modern decentralized systems, is elegant by design: the way it combines the simplicity of its core idea with strong predictability of outcomes in distributed networks is truly exciting.

At its core, the idea of a cryptographically linked chain of records, distributed across the network and validated by its participants through some kind of consensus algorithm, is simple to implement yet powerful against several complex problems of distributed systems.

Blockchain technology gives the community a whole new paradigm for modern distributed systems. Its impact is obvious today. It seems so inspiring that a lot of projects are taking this new (?) technology as a basis and trying to implement various ideas in a decentralized manner. There are decentralized currencies now (Bitcoin and a lot of altcoins), decentralized smart contracts (Ethereum, EOS, etc.), and even a concept of decentralized AI (the Pandora project) exists at this time.

As a technology, blockchain is very strong at preventing manipulation of data that was generated in the past. This is one of the most valuable aspects of the technique, because it gives the community a solution to one of the most complex problems of decentralized systems: reaching consensus.

This is achieved by storing all the history in a cryptographically protected manner, so that even a minor change to a single record invalidates the whole history that follows it. In combination with a simple distribution technique, this makes it possible to build decentralized, community-driven systems for various purposes with strong protection against fraud.
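As a minimal sketch of this hash-chaining property (the record format and chain layout below are illustrative assumptions, not any real protocol), consider a chain where each block's hash covers the previous block's hash:

```python
import hashlib

def digest(prev_hash: str, record: str) -> str:
    """Hash of a record chained to the previous block's hash."""
    return hashlib.sha256((prev_hash + record).encode()).hexdigest()

# Build a tiny chain of three records.
records = ["alice->bob:10", "bob->carol:4", "carol->dave:7"]
hashes = []
prev = "0" * 64  # genesis
for r in records:
    prev = digest(prev, r)
    hashes.append(prev)

# Tamper with the first record and re-derive the chain:
records[0] = "alice->bob:1000"
prev = "0" * 64
for i, r in enumerate(records):
    prev = digest(prev, r)
    print(i, "valid" if prev == hashes[i] else "INVALID")
# Every block from the tampered one onward no longer matches its stored hash.
```

One changed record breaks every subsequent digest, which is exactly why rewriting history requires redoing (and winning consensus on) all the work that came after it.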

It is hard to overestimate this opportunity. And the community does not miss it.

The main protective factor of most modern decentralized systems is participant count: the cost of an effective attack grows with approximately each new participant that joins the network. There are billions of dollars distributed among the participants of several decentralized systems today, and blockchain really does seem to protect them well, because those funds are still there.

But let’s look closer at this kind of protection. Is there any trade-off in this design? Is blockchain really so suitable for huge decentralized projects, or is it limited by some kind of unresolved problem?

Let’s take financial processing as an example. Visa, one of the world’s leading financial processing companies, commits approximately 2k transactions per second on a regular day. There are also peak periods, when the transaction count grows significantly (up to 24k TPS in 2011). And there is also a reserve: computational capacity that is not used on a regular basis, but is kept available for large TPS spikes. These 2k–10k (or even more nowadays) TPS are very likely processed by hundreds of servers working in tandem. This is likely so because of the limitations of today’s bare-metal hardware: one server (or even several) is not able to process this huge volume of financial operations, and there is no solution to this problem except hardware parallelism. Some percentage of servers is also needed for data replication and loss protection, but we will not take them into account, because blockchain systems are replicated by design and can’t be compared to traditional systems by this metric.

Let’s assume that some decentralized system based on a classic blockchain processes 10k TPS; it should then be able to store 10k records per second in its blockchain. This means that each participant must be able to store 10k records per second.

If one record is only 20 B long, then 10k × 20 B = 200 kB would be written to disk every second. That is ~1 MB every 5 seconds, ~1 GB every ~83 minutes, or ~1 TB every ~58 days: one terabyte roughly every two months. Only two types of computer systems are able to handle this amount of data nowadays: servers and desktops/laptops. No mobile device on the public market has internal storage of even 1 TB. So, after some period of its lifetime, our blockchain-based system would become unreachable for mobile devices, and this discrepancy would only increase over time.
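The arithmetic is easy to reproduce. Here is the same back-of-the-envelope calculation in Python, using the record size and TPS assumed above:

```python
RECORD_SIZE = 20          # bytes per record (the article's assumption)
TPS = 10_000              # assumed network throughput

bytes_per_second = TPS * RECORD_SIZE                       # 200 kB/s
seconds_per_gb = 1e9 / bytes_per_second                    # ~5,000 s (~83 min)
days_per_tb = 1e12 / bytes_per_second / 86_400             # ~58 days
tb_per_2_years = bytes_per_second * 86_400 * 730 / 1e12    # ~12.6 TB

print(f"{bytes_per_second / 1e3:.0f} kB/s, "
      f"1 GB every {seconds_per_gb / 60:.0f} min, "
      f"1 TB every {days_per_tb:.0f} days, "
      f"~{tb_per_2_years:.1f} TB in 2 years")
```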

There are also the hash digests of each block generated by the network. Each digest is 256 bits = 32 B, and we should add these 32 B per block to the previous calculation, but the conclusion would not change; it would only be strengthened.

But let’s look further. After some period of time, say 2 years, the amount of data collected would be ~12 TB (assuming the network’s TPS does not increase over time, which is very unlikely). To store this amount of data properly, special techniques are needed (RAID, special file systems, etc.). This is the first serious challenge for network participants: probably only some percentage of them would be able to maintain the infrastructure required to handle this amount of data properly.

But let’s assume that each participant of the network is able to store this amount of data properly. The same question also arises for network traffic, but let’s assume this one is covered by the participants too, and each of them is able to receive and send 12 TB × participant count × 1% (or less) of network traffic per 2 years, which seems unbelievable without proper infrastructure. In reality, a dedicated data center with direct connections to the main traffic backbones would be needed for this purpose.
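Plugging a hypothetical network size into the article’s rough formula gives a feel for the scale; the participant count and the 1% relay share below are assumptions, not measurements:

```python
STORAGE_2Y_TB = 12.6       # per-node storage over two years, from above
PARTICIPANTS = 100_000     # hypothetical network size
RELAY_SHARE = 0.01         # the article's "1% (or less)" relay factor

traffic_tb = STORAGE_2Y_TB * PARTICIPANTS * RELAY_SHARE        # TB per node / 2 years
avg_gbps = traffic_tb * 1e12 * 8 / (730 * 86_400) / 1e9        # sustained line rate

print(f"~{traffic_tb:.0f} TB per node over 2 years, "
      f"~{avg_gbps:.1f} Gbit/s sustained average")
```

Under these assumptions a node would move ~12,600 TB in two years, i.e. a sustained ~1.6 Gbit/s average, before accounting for bursts.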

But the most serious question here is computational power: to process 10k TPS of financial data, each participant must be able to validate and pack each transaction into a block within 1/10,000 of a second, which is unrealistic for most modern processors on the market today. And it is very unlikely that any code optimizations, or even ASICs, would help here. The only solution is the same as for traditional systems, hardware parallelism, and its main consequence is infrastructure that is non-trivial and very expensive to maintain.
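To make that budget concrete, here is a sketch of the per-transaction time budget and the degree of parallelism it implies; the per-transaction validation cost is a hypothetical assumption, not a benchmark of any real implementation:

```python
TPS = 10_000
budget_us = 1e6 / TPS                 # time budget per transaction: 100 µs

# Hypothetical cost of fully validating one transaction (signature check,
# balance lookup, packing into a block); the real figure depends on the protocol.
VALIDATION_COST_US = 2_000            # assume ~2 ms per transaction

cores_needed = VALIDATION_COST_US / budget_us
print(f"budget: {budget_us:.0f} µs/tx, "
      f"~{cores_needed:.0f} cores (or machines) of parallelism needed")
```

Whatever the exact per-transaction cost, any figure above the 100 µs budget forces the node operator into parallel hardware, which is precisely the infrastructure burden described above.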

Only a few companies today are able to process this volume of financial transactions: Visa, MasterCard, EuroCard, etc. This limitation is economically imposed: maintaining the infrastructure is too complex and expensive for the other companies that would like to work in this market as well. And there is no revolution on the horizon to change things fundamentally.

I can imagine a world where 3–5 companies, say BitVisa, BitMasterCard, etc., process 100k–1000k TPS and use some kind of blockchain technology for intercommunication between them, but I can’t imagine a world where a million MicroVisas and MicroMasterCards each process 100k–1000k TPS in parallel. It would be dramatically inefficient economically.

It seems the world needs a solution that is able to distribute all operations across all participants, rather than forcing every participant to process every operation, as the classic blockchain does.

Follow GEO Protocol at:

[ Medium | Twitter | GitHub | Gitter | Telegram | LinkedIn | YouTube ]

Originally published at medium.com on September 30, 2017.

GEO Protocol: Creating a universal ecosystem for value transfer networks