Ethereum Constantinople Fork Timer (ECFT)

Ali Benjelloun
Stake Capital
Feb 26, 2019

Research

At block 7,280,000, Ethereum will upgrade to Constantinople. If you are wondering when this will take place, we have built a countdown clock for you, available on Stake Capital’s homepage. Stake Capital provides highly available and secure blockchain validation services for the leading Proof-of-Stake protocols. In this short article, we describe how this clock was designed. To learn more, you can view our Difficulty Adjustment Calculations (spreadsheet) and GitHub.

The impact of EIP-1234

Miners in Proof-of-Work race to solve a mathematical puzzle to earn the right to mine the next block, and the difficulty of this puzzle is adjusted periodically to maintain a roughly constant block time of ~15 seconds as the computing power competing on Ethereum fluctuates. Since puzzle solutions are found essentially at random, the difficulty must be increased when more computing power competes; otherwise blocks would on average be mined too fast, increasing the issuance rate in the process. On top of this adjustment, Ethereum includes a gradual Ice Age (the “difficulty bomb”), which raises the difficulty regardless of hash power, slowly lengthening block times and thereby decreasing the block issuance rate. This provides a smooth transition as Ethereum progresses on its journey towards Proof-of-Stake.
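To make this feedback loop concrete, here is a minimal sketch of the direction of the per-block adjustment, using the leverage term from the difficulty formula quoted later in this article:

    # Sign of the per-block difficulty adjustment, taken from the
    # leverage term of the formula quoted further below.
    def adjustment_sign(block_time: int) -> int:
        # +1 when the parent block arrived in under 10 s,
        # 0 around the ~15 s target, and increasingly negative
        # (floored at -99) when blocks come in slowly.
        return max(1 - block_time // 10, -99)

    print(adjustment_sign(7))   #  1 -> difficulty goes up
    print(adjustment_sign(14))  #  0 -> difficulty holds
    print(adjustment_sign(31))  # -2 -> difficulty goes down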

The Ice Age should already have kicked in, but was postponed by 5 million blocks (~1.5 years) with EIP-1234, which is also the EIP that reduced the block reward from 3 to 2 ETH. Since then, clients calculate the difficulty based on a fake block number, “suggesting the difficulty bomb is adjusting around 5 million blocks later than previously”, which represents a delay of roughly 1,100 days. In practice, the difficulty is derived from three parameters:

  1. The bomb’s ticks, which increase the difficulty at an exponential rate.
  2. The parent block’s difficulty, which already reflects the bomb’s past ticks.
  3. The leverage, which lets the network re-estimate the difficulty as congestion and mining popularity evolve.

    current_block_difficulty = int(2**((fake_block_number // 100000) - 2)) + parent_block_difficulty + (parent_block_difficulty // 2048) * max(1 - (block_time // 10), -99)
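Putting these pieces together, here is a minimal runnable sketch of that calculation in Python. The fake block number rule follows the EIP-1234 specification; the concrete input values in the example are purely illustrative:

    # Difficulty estimate per the formula above, as a Python function.
    def estimate_difficulty(block_number: int,
                            parent_block_difficulty: int,
                            block_time: int) -> int:
        # EIP-1234: the bomb reads a block number 5,000,000 lower
        # than the real one (never below zero).
        fake_block_number = max(block_number - 5_000_000, 0)
        bomb = 2 ** ((fake_block_number // 100_000) - 2)
        leverage = (parent_block_difficulty // 2048) * max(1 - block_time // 10, -99)
        return bomb + parent_block_difficulty + leverage

    # Illustrative values only: a ~3,000 TH network difficulty and a
    # 14-second parent block time around the fork height.
    print(estimate_difficulty(7_280_000, 3_000 * 10**12, 14))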

The timestamps and the difficulty are the most sensitive inputs, and taking only the previous block’s values is not precise enough. To reach acceptable accuracy, we concluded that a 2,000-block average was a safe snapshot of the network: statistics over the last 100 blocks alone are too noisy to treat as the absolute truth, while the larger range changes the exponential growth of the block difficulty by less than 1%.
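One way to see why the larger window stays safe: the bomb term in the formula above only ticks every 100,000 blocks, so a 2,000-block window straddles a tick boundary only 2% of the time. A quick check, reusing the bomb term exactly as quoted:

    # The bomb term only changes every 100,000 blocks.
    def bomb_term(fake_block_number: int) -> int:
        return 2 ** ((fake_block_number // 100_000) - 2)

    # A 2,000-block window usually sits inside a single tick...
    assert bomb_term(4_200_000) == bomb_term(4_202_000)
    # ...and at most crosses one boundary, doubling the (small) term.
    assert bomb_term(4_301_000) == 2 * bomb_term(4_299_000)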

Technical Design

An average of both block difficulty and block time over the past x blocks is used to calculate the latest estimate. The more consistent and accurate this average is, the stronger the overall estimate. These block difficulty and block time averages can either be created on the fly (e.g. over the past 100 blocks) or generated ahead of time and stored as static data in the component. We discussed the merits of both options and decided to generate the averages over the past 2,000 blocks and store the result as static data in the component. This is because generating the average for much more than the last 100 blocks begins to create browser lag (the average is computed at page load on the client side). One solution, of course, would be to generate the averages on the backend. For now, however, we decided that using the last 2,000 blocks (< ~1% change in the exponential growth of block difficulty) provides a sufficient estimate, and yields far more accuracy than building the averages from the 100 most recent blocks, even though that smaller estimate would be computed on the fly (100 most recent blocks vs older static data for 2,000 blocks).
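For illustration, here is a minimal sketch of the estimate itself in Python with web3.py. The production widget runs client-side and precomputes its averages; the endpoint URL below is a placeholder, and deriving the average block time from the window’s two boundary timestamps is our simplification, not the exact production code:

    from web3 import Web3

    FORK_BLOCK = 7_280_000
    WINDOW = 2_000  # the averaging window discussed above

    # Placeholder endpoint; point this at any Ethereum JSON-RPC node.
    w3 = Web3(Web3.HTTPProvider("https://mainnet.example-node.invalid"))

    latest = w3.eth.get_block("latest")
    oldest = w3.eth.get_block(latest.number - WINDOW)

    # Average block time over the window from its boundary timestamps
    # (averaging difficulty the same way needs one call per block,
    # which is exactly the page-load lag problem described above).
    avg_block_time = (latest.timestamp - oldest.timestamp) / WINDOW

    eta_seconds = (FORK_BLOCK - latest.number) * avg_block_time
    print(f"Constantinople in ~{eta_seconds / 86_400:.1f} days")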

I would like to thank Julien Bouteloup and Leopold Joy for their valuable feedback.
