What’s new in V8? [I/O 2017]

Ruslan Nigmatulin
4 min readMay 20, 2017


This is a short version of Seth Thompson’s talk “V8, Advanced JavaScript, & the Next Performance Frontier (Google I/O ‘17)”.

As you may (or may not) know, the Chrome browser uses a JavaScript engine called V8, which compiles your JS code into native machine code via JIT (just-in-time) compilation. The generated machine code runs fast, but you pay the price of a longer initial delay while your JS code is compiled.

And the more code V8 optimizes, the more memory it consumes.

Back in 2015, the V8 pipeline looked like this:

In 2016, Google added WebAssembly support (the successor to asm.js).

This year Google reworked V8 again, and we have a new, simplified execution pipeline, which basically consists of Ignition and TurboFan.

The first one, Ignition, is an interpreter, which compiles your JS code to bytecode and saves memory on all platforms (not just low-memory mobile devices).
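You can actually see Ignition’s output yourself: `--print-bytecode` is a V8 flag that Node passes through to the engine (the exact output format depends on the V8 version bundled with your Node build):

```shell
# Ask V8 (through Node) to print the Ignition bytecode it generates
# for each compiled function in the script.
node --print-bytecode -e "function add(a, b) { return a + b; } add(1, 2);"
```

The dump shows register-machine instructions like `Ldar` and `Add` — the compact intermediate form Ignition executes before TurboFan gets involved.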

The second one, TurboFan, is an optimizing compiler, which fully supports ES2015+.

So basically Ignition compiles your JS code to bytecode and, when a function gets hot, hands it to TurboFan for optimization. In the end you get fully optimized machine code for execution.
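As a sketch of why this pipeline matters to your code: TurboFan optimizes functions it has observed running many times with consistent types, and falls back to less specialized code when the types change (the function names here are illustrative, not part of any V8 API):

```javascript
// A "hot" function: called many times with the same argument types,
// so Ignition's type feedback lets TurboFan emit fast machine code.
function add(a, b) {
  return a + b;
}

let sum = 0;
for (let i = 0; i < 100000; i++) {
  sum += add(i, 1); // always (number, number) — stays optimizable
}
console.log(sum); // 5000050000

// Calling the same function with different types forces V8 to handle
// both shapes, which can deoptimize it back to slower generic code.
console.log(add('a', 'b')); // "ab"
```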

But there is also another part (not shown on the chart) called Orinoco.

Orinoco is a parallel and concurrent garbage collector (the previous GC was not always parallel, which is why Orinoco is faster than its predecessor).

So what benefits do we get from the new V8? Obviously, a performance increase. According to tests with Speedometer (a browser benchmark), the improvement reaches 35%.

But let’s talk a little bit about ES2015+ features.

Originally, when features from ES2015 were first implemented in V8, they were often much slower than their transpiled (ES5) equivalents. Fortunately, with the new V8 that’s all in the past, and many ES2015+ features are now as fast as the transpiled versions.
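A concrete example: native spread in a function call used to lag behind the hand-written ES5 pattern a transpiler like Babel emits; in current V8 the native form runs at comparable speed, so there is little reason to avoid it:

```javascript
// ES2015: native spread in a function call.
function maxNative(numbers) {
  return Math.max(...numbers);
}

// ES5: roughly what a transpiler would emit instead of spread.
function maxTranspiled(numbers) {
  return Math.max.apply(Math, numbers);
}

const data = [3, 41, 7, 18];
console.log(maxNative(data));     // 41
console.log(maxTranspiled(data)); // 41
```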

There is also good news for Node.js fans.

Debugging Node apps has also become much easier:

But the most important feature for me was Coverage. It shows you how much of your code is unused and even marks the unused code blocks (available in Chrome Canary).

Finally, a few announcements about WebAssembly. Browser support is now much better:

In the near future, WebAssembly performance will be improved, multithreading will be added, and it will integrate with web platform features like the Response object from fetch.
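The Response integration refers to `WebAssembly.instantiateStreaming`, which compiles a module while its bytes are still arriving over the network. A minimal sketch (the inline bytes below are just the 8-byte header of an empty module, so the example can run without a network; `module.wasm` is a hypothetical file name):

```javascript
// Minimal WebAssembly module: the magic "\0asm" header plus version 1.
// A real app would fetch a compiled .wasm file instead.
const bytes = new Uint8Array([0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00]);

// Synchronous check that the bytes form a well-formed module.
console.log(WebAssembly.validate(bytes)); // true

// The streaming API takes the Response promise from fetch() directly:
// WebAssembly.instantiateStreaming(fetch('module.wasm'), imports)
//   .then(({ instance }) => instance.exports.main());

// Without a network, instantiate from the buffer:
WebAssembly.instantiate(bytes).then(({ instance }) => {
  console.log(instance instanceof WebAssembly.Instance); // true
});
```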

Overall, there are fewer and fewer reasons not to use native ES2015 code, because it is now almost as fast as transpiled code. And the new, even faster version of V8 lets you give the best web experience to your lovely users.
