Blockchain tech is not decentralized — it is distributed. The difference matters.

Diego Doval
Jan 14, 2019


Blockchain technology is many things. One thing it isn’t is decentralized.

Why is that?

It’s because blockchain networks share a single data source that is copied around, and to join the network a node needs to have a full copy of that data source.

In purely technical terms, a blockchain as generally understood today is a single globally shared database with eventual consistency, with a copy distributed across all participants in the network.

That is, they are distributed, not decentralized.

Now, let me make the caveat that there are new types of blockchains coming out, so “as generally understood today,” as of early January 2019, is an important qualifier to keep in mind. At the very least, Bitcoin and Ethereum both match this description, and Lightning does not change the single-database nature of the system.

Distributed vs Decentralized

So, blockchain is distributed but not decentralized.

In a truly decentralized system, nodes can operate independently with a “view of the world” that is centered on themselves and their local peers, whatever “local” means in their case. It could be that the nodes are geographically all over the world but logically next to each other, or vice versa.

In a truly decentralized system, nodes can eventually reach a state of consistency with most of the network, or not. In fact, a very large-scale decentralized system would never reach full a priori consistency between two random nodes, and it wouldn’t need to.

A blockchain node, on the other hand, can’t operate without having synchronized its state with the global blockchain state, either directly, by downloading the whole chain of blocks (or at minimum the hash data needed for validation), or indirectly, through a trusted proxy/service that does so on its behalf (e.g. Coinbase).
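To make this concrete, here is a minimal sketch in Python (with entirely hypothetical block and function names, not any real client’s API) of the kind of check a joining node has to replay over the whole shared history before it can participate: every block must hash correctly and link to its predecessor, which is why a full copy of the chain, or at least its full chain of hashes, is unavoidable.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Block:
    prev_hash: str   # hash of the previous block
    payload: str     # stand-in for the block's transactions
    block_hash: str  # hash recorded for this block

def compute_hash(prev_hash: str, payload: str) -> str:
    """Hash a block's contents together with its predecessor's hash."""
    return hashlib.sha256(f"{prev_hash}:{payload}".encode()).hexdigest()

def make_block(prev_hash: str, payload: str) -> Block:
    return Block(prev_hash, payload, compute_hash(prev_hash, payload))

def verify_chain(chain: list) -> bool:
    """A joining node must replay this check over the entire shared
    history before it can trust (or extend) the global ledger."""
    prev = "0" * 64  # the genesis block has no predecessor
    for block in chain:
        if block.prev_hash != prev:
            return False
        if compute_hash(block.prev_hash, block.payload) != block.block_hash:
            return False
        prev = block.block_hash
    return True

genesis = make_block("0" * 64, "genesis")
chain = [genesis, make_block(genesis.block_hash, "tx batch 1")]
assert verify_chain(chain)
```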

Every blockchain “full node” has the same data as any other full node. In a truly decentralized system, this would never be the case.

What about Lightning?

Lightning is a good idea that delays and reduces the impact of the eventual-consistency problem, but doesn’t eliminate it. While Lightning does theoretically enable the capacity for “millions or billions of transactions per second,” it does so for specific use cases, not for all use cases. Lightning is still far from performing as advertised, but that’s not the real issue. The issue is which use cases it relies on to improve performance.

This may sound like small-print disclosure, but it’s a fundamental limitation of the technology. The operation of a Lightning channel itself requires writes to the master ledger/blockchain. A Lightning payment channel also involves a somewhat higher level of trust between its participants, while blockchain is used primarily for trustless operations. The only way Lightning “solves” the eventual long-term scalability problem of blockchain is if the vast majority of use cases can be accommodated within the ones Lightning supports. Given the scales we’re talking about, “the vast majority” has to be something like 99.999999%.
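As a rough illustration, here is a toy model in Python (hypothetical names, not the actual Lightning protocol) of why the channel still touches the ledger: opening and closing the channel are on-chain writes, while every payment in between is only an off-chain balance update between the two parties.

```python
class PaymentChannel:
    """Toy model of a Lightning-style payment channel: the shared
    ledger is written only when the channel opens and when it closes."""

    def __init__(self, ledger: list, a_deposit: int, b_deposit: int):
        self.ledger = ledger
        self.balances = {"a": a_deposit, "b": b_deposit}
        # Opening the channel is an on-chain write (funding transaction).
        self.ledger.append(("open", dict(self.balances)))

    def pay(self, payer: str, payee: str, amount: int) -> None:
        # Off-chain update: the parties agree on a new balance sheet,
        # but nothing is written to the shared ledger.
        assert self.balances[payer] >= amount
        self.balances[payer] -= amount
        self.balances[payee] += amount

    def close(self) -> None:
        # Settlement is the second (and last) on-chain write.
        self.ledger.append(("close", dict(self.balances)))


ledger = []  # stand-in for the global blockchain
channel = PaymentChannel(ledger, a_deposit=100, b_deposit=100)
for _ in range(10_000):      # thousands of off-chain payments...
    channel.pay("a", "b", 1)
    channel.pay("b", "a", 1)
channel.close()
print(len(ledger))           # ...still only 2 on-chain writes
```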

In summary, by supporting specific use cases that operate outside the main blockchain, Lightning has a good shot (when fully operational) at increasing the capacity of blockchains as a whole — but it cannot solve the fundamental problem that everything is still, eventually, writing to what amounts to a giant unicode file being copied around the world at blazing-fast speed.

Ok, not decentralized: is that good or bad?

Whether blockchains are decentralized or not matters mostly for specific long-term scalability problems.

Because blockchain today depends on a single shared database (or ledger), effectively only one write operation can be valid at any given time. That write operation can include a lot of data, affecting many transactions, and indeed it does, but it is still one write operation worldwide.

We can get better at allocating time slices to different operations, but that still leaves us with one valid write operation worldwide. And while time itself is infinitely divisible, we don’t have infinite computing capacity, so we can’t hide in the space between microseconds that performance constraints keep us from reaching today.
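A quick back-of-envelope calculation shows the ceiling this implies. The figures below are rough, commonly cited approximations (a Bitcoin block roughly every ten minutes, on the order of a few thousand transactions per block), assumptions rather than measurements:

```python
# Back-of-envelope throughput ceiling for a single, serialized write stream.
block_interval_s = 10 * 60   # Bitcoin targets one block roughly every 10 minutes
txs_per_block = 2_500        # on the order of a few thousand transactions per block

throughput_tps = txs_per_block / block_interval_s
print(f"~{throughput_tps:.1f} transactions per second")  # ~4.2 tx/s

# Even making blocks 100x larger and 100x more frequent only yields
# tens of thousands of tx/s, far short of the billions per second the
# article argues a global financial system would need.
print(f"~{(txs_per_block * 100) / (block_interval_s / 100):.0f} tx/s")  # ~41667 tx/s
```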

What this means is that blockchain as it exists today cannot physically scale to support the entirety of the world’s financial system, which requires billions of transactions per second, involving both algorithmic and human actors.

Is that good or bad? I think it’s generally not relevant to the present, somewhat relevant to the immediate future, and critical in the long term.

As we have seen, blockchain technology evolves really fast. The fact that everything in these ledgers is centralized but widely distributed is a good starting point, because we can then figure out how to break processing and storage up into reasonable units, instead of having to do it in advance.

Just don’t measure blockchain as succeeding or failing as a “decentralized system”, simply because it isn’t one.

