Contentious Blocksize Wars

Marek "Slush" Palatinus
Braiins | Slush Pool
7 min read · Mar 22, 2016

--

Disclaimer: This is my personal opinion and it’s not the official stance of Slush Pool.

I have not commented on the blocksize discussion until now, and I have several reasons for that. The first one is pretty clear: opinions have been stated, sides have been taken. Also, I feel the discussion is far too irrational as it is, and anything you say can, and at some point will, be used against you. The last reason? It just doesn't feel like my opinion carries that much weight. Yes, I do run a mining pool, but our philosophy is to stay neutral, and our main interest is to offer miners the best mining infrastructure possible. The only hashrate I control personally is this one:

My only miner hashing at 300MHash/s, used for SlushPool testing purposes :-).

Yet people keep asking me which side I have taken. At this moment I must highlight that everything written in this article is my personal opinion, valid at the moment of publishing, not a statement of SatoshiLabs or Slush Pool (which stays strictly neutral and lets users mine even for proposals I don't prefer). Okay, let's get on with it:

I'm for raising the blocksize limit (maybe even removing it completely). Still, at this moment, I'm with the Core Devs.

Let me explain why.

Do we need a blocksize limit?

Regardless of the necessity of a blocksize raise, the limit was introduced as a precaution against potential DoS attacks on the network. When Satoshi introduced the limit back in 2010, it was a very smart decision. But now we have years of experience running the Bitcoin network, and there are ways to improve transaction validation in order to mitigate such attacks. Once these attack vectors are removed, I don't think the blocksize limit will be needed at all.

Removing the blocksize limit

Sure, Bitcoin nodes would also need some other upgrades to lower the stale block ratio, but there are many straightforward proposals, like Thin Blocks, which improve block propagation and lower latency for miners at no extra cost. The main reason such optimizations are not widely deployed is that nobody needs them at the 1MB limit.

By fixing these obvious mistakes in the P2P protocol (which requires just a software update, not even a soft fork) and removing the blocksize limit completely, the free market would find a balance between transaction fees (pushing miners to produce larger blocks) and the stale block ratio (pushing miners to produce smaller blocks). As with any other free-market approach, this may introduce uncertainty in the short term, but it would work well in the long term. Gavin Andresen has also done extensive research on this topic, and I like all his contributions to the discussion in general.
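To make this balance concrete, here is a toy model of my own (every constant in it is an assumption for illustration, not a measured value): a miner choosing a block size trades extra fee revenue against a higher chance the block goes stale while propagating.

```python
import math

BLOCK_REWARD = 25.0    # BTC subsidy per block in 2016
FEE_PER_MB   = 0.2     # assumed: average fees collected per MB of transactions
PROP_DELAY   = 4.0     # assumed: extra propagation seconds per MB of block
BLOCK_TIME   = 600.0   # average seconds between blocks

def stale_probability(size_mb):
    """Chance a competing block is found while ours is still propagating."""
    return 1.0 - math.exp(-PROP_DELAY * size_mb / BLOCK_TIME)

def expected_reward(size_mb):
    """Expected revenue: (subsidy + fees) discounted by the orphan risk."""
    return (BLOCK_REWARD + FEE_PER_MB * size_mb) * (1.0 - stale_probability(size_mb))

# Without a protocol limit, a rational miner picks the size maximizing this.
# With the assumed numbers the optimum is finite, not "as big as possible":
best = max(range(0, 101), key=expected_reward)   # search 0..100 MB
print(best, round(expected_reward(best), 4))
```

The point of the sketch is only that both forces appear in the same revenue formula, so an equilibrium size emerges without any hard-coded cap.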

Even though removing the limit is technically possible and I don't share the disastrous visions of doing so, I fully understand that removing it completely (or raising it exponentially as proposed by BIP101) is basically impossible for political reasons today. So if we want to stay down to earth and find real consensus for a hardfork, pushing this solution is pointless (waving to Mike).

Raising the blocksize limit

A safer compromise between removing the limit and the status quo. Bitcoin Classic, Bitcoin Unlimited, BIP100 and SegWit are four of the most popular proposals.

In the first place, SegWit is a non-hardfork proposal repairing some quirks in the protocol which need to be fixed for long-term blockchain scaling. However, I like it mostly for other reasons, and I don't see its immediate impact on blocksize. It would require major changes in wallets to have even a small effect.

BIP100 suggests a kind of floating limit, which introduces a brand new kind of uncertainty into the economic model of the Bitcoin network. It has a known 21% attack that keeps the blocksize artificially small (please correct me if this has been fixed in some recent revision). Also, there's no implementation yet. Although I value many of Jeff's insights into the discussion, so far this looks more like his homework than a serious solution.
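To show why a 21% minority is enough, here is a simplified sketch of my reading of the BIP100 voting rule (the exact percentile and window in the real draft may differ): each block carries a size vote, the lowest 20% of votes are discarded, and the new limit is the smallest vote that remains.

```python
def bip100_new_limit(votes):
    """Simplified BIP100-style recalculation over one voting window.

    votes: one blocksize vote (in bytes) per block in the window.
    """
    votes = sorted(votes)
    discard = len(votes) // 5      # throw away the bottom 20% of votes
    return votes[discard]          # smallest surviving vote sets the limit

# 79% of blocks honestly vote for 2 MB; an attacker with 21% of hashrate
# votes for 1 MB. After discarding the bottom 20%, a 1 MB vote survives:
votes = [1_000_000] * 21 + [2_000_000] * 79
print(bip100_new_limit(votes))     # the 21% minority pins the limit at 1 MB
```

With only 20% voting low, all low votes fall into the discarded tail and the limit rises to 2 MB, which is exactly why the attack threshold sits at 21%.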

Bitcoin Unlimited introduces 'user-defined hard limits', basically trying to free-market the blocksize. Although I quite like the high-level motivation and their recent work on Xtreme Thinblocks, I see some technical problems there. If I understand the proposal well, small nodes may set a lower blocksize hard limit, which might lead to a situation where blocks produced by large miners become invisible to small nodes, and that may lead to a network split. I would rather remove the limit explicitly and simplify the code. Anyway, this is a politically dead solution for the same reason as my favorite solution described above.

Bitcoin Classic seems to be the most popular option these days. Out of all the alternatives, its proposal for a 2MB hardfork is the most conservative one. Also, they're not trying to introduce any kind of new mechanism, which I like. However, even raising the limit to 2MB requires a hardfork. Is a hardfork justifiable for such a temporary fix? Which leads me to the following question:

Do we even need to raise the blocksize limit now?

I mean, although raising or even removing the blocksize limit is technically doable today, is it really necessary in the current situation? Is it the solution for Bitcoin's current issues?

This is where the topic gets more philosophical than technical: What is the purpose of Bitcoin? Is it a payment network? Is it digital gold? Answers to such questions are highly subjective, and they lead to an even more theoretical question: What is the real merit of Bitcoin? Low fees, or the fact that its rules cannot be changed easily? And who is responsible for such a decision? Pieter "sipa" Wuille, author of the SegWit concept and my favorite Core developer, has some good points on why he's against easygoing changes to the protocol. Although I don't fully share all his concerns, it has to be said that:

Bitcoin with 1MB limit is not broken. It just cannot compete with VISA.

But it can still compete with gold. Anyway, I like the comparison of Bitcoin to toilet paper in public toilets (slightly rephrased from Samson Mow's interview):

If someone loots all the toilet paper from the public toilet, is the solution to double the amount of toilet paper stocked each day?

Although the transaction rate on the Bitcoin network is on the rise, which is generally a good thing, I'm convinced a lot of those transactions are outright spam, and many others are just freeloading on a shared resource.

By freeloading I mean reckless blockchain usage by services like the famous SatoshiDice roulette. They profited from the fact that transactions were free or very cheap, considering the real costs of running the Bitcoin network at that scale. Doubling or quadrupling the limit alone wouldn't have helped, but rising transaction fees driven by market demand for more valuable transactions did the job. Just to remind readers: back in the day, transactions were completely free. Today it is normal to pay 4 cents or more. Did it harm Bitcoin in some significant way? Most likely not, except that some services had to optimize their business models.

In the end, I don't share the concern that full blocks are such a big deal. From an economic perspective, it is natural to fill a shared resource as long as it's free or very cheap. As the resource fills up, free-market pressure makes it more expensive, maintaining the balance between freeloaders and actually valuable transactions on the blockchain. With a hard limit on blocksize this just comes earlier than with unlimited blocks, but the free-market effect and the rise of transaction fees are inevitable anyway. So my answer to the question of whether to hardfork or not: it would be nice, but it is not necessary yet.

The main problem these days is that there's no good infrastructure for determining sufficient fees or for changing fees on already broadcast transactions. However, the Core Devs are aware of this and are making good progress: CPFP (Child Pays For Parent) mining is in development, and RBF (Replace By Fee) has already been rolled out in Core 0.12.
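The CPFP idea is simple arithmetic. This is my own minimal sketch of the concept (not Core's actual mempool code, and the fee numbers are made up): a miner evaluates a stuck low-fee parent together with a high-fee child that spends it, using the combined "package" feerate to decide inclusion.

```python
def package_feerate(txs):
    """Combined feerate of a transaction package.

    txs: list of (fee_in_satoshi, size_in_bytes) pairs.
    Returns satoshi per byte for the package as a whole.
    """
    total_fee  = sum(fee for fee, _ in txs)
    total_size = sum(size for _, size in txs)
    return total_fee / total_size

parent = (1_000, 500)    # stuck transaction: only 2 sat/B, miners skip it
child  = (25_000, 500)   # recipient attaches a generous 50 sat/B child

print(package_feerate([parent]))          # unattractive on its own
print(package_feerate([parent, child]))   # worth mining as one package
```

Since the child is invalid without the parent, mining them together at the package feerate is the rational choice, which is what lets the recipient "rescue" a stuck payment. RBF attacks the same problem from the sender's side by rebroadcasting the original transaction with a higher fee.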

So you’re a Blockstream puppet?

Um, those conspiracy theories and the overheated discussion full of personal accusations are harming Bitcoin much more than the actual blocksize issue itself. I actually don't have any motivation to take a side for any reason; I only want Bitcoin to be successful (whatever that actually means). Although there are various proposals and it would be nice to do something, I really don't think the Core Devs are trying to destroy Bitcoin.

The biggest problem with the Core Devs is that they're extremely smart, but many of them lack basic soft skills, combined with self-righteousness, or they simply don't feel an urge to communicate with the community. I have been frustrated by communicating with Core Devs many times, and I have been vocal about that. Their terrible soft skills lead to PR disasters and wild speculation. However, I feel this has been changing in recent weeks, as the Core Devs have to defend themselves against rising competition from other forks. I've noticed they're slowly dropping their arrogant attitude and trying to explain their intentions more clearly to the wider community. This, combined with their excellent knowledge of cryptography and programming, is why I want to give them one more chance.

Disclaimer

The blocksize debate is really complicated and has a lot of economic, political and technical consequences. I did not try to explain everything in detail, but rather to put down my subjective thoughts on the topic. I highly recommend reading more from Gavin, Sipa and Jeff. They have spent an insane amount of time on the topic, and they provide calm, factual and high-quality insights. Among non-developers, I recommend Aaron van Wirdum's articles in Bitcoin Magazine; he tries to provide an unbiased view of the debate.


Innovator and #Bitcoin evangelist. Founder of world’s first bitcoin mining pool @slush_pool and #Trezor hardware wallet. CEO @SatoshiLabs.