Photo by frank mckenna on Unsplash

History of a Rewrite (part 2)

Marco Solazzi
9 min read · Mar 19, 2018

This is the second part of History of a rewrite. In the first part I introduced the reasons why I took the journey to rewrite my portfolio from React to Vue.js.

In this second part I will discuss in more detail the steps I took to improve the overall performance of the website. It’s going to be a pretty long article, so grab a coffee and… let’s get started.

Codebase refresh

While I was porting everything to .vue files, I also made some improvements and updates to the codebase. I really didn't want to rewrite the entire HTML and the underlying logic, but a year had passed and some technologies and patterns had changed. Also, Vue.js enables some patterns that are impossible in React, and vice versa.

Intersection Observers

The biggest change was dropping scroll and swipe event handlers in favor of the Intersection Observer API. Browser support is pretty solid (I had to add a polyfill for IE though) and the switch accomplished a double goal: improving scroll performance (scroll events are slow!) and abstracting the interaction from the input mode. Sadly I wasn’t able to remove event listeners for good: the enter/leave transition of the first page — what I call the cover page — is still controlled by a wheel event listener. To mitigate that I used the .passive modifier in order to leverage passive event listeners.
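To give an idea of the pattern, here is a minimal sketch of how a section can be observed and flagged once it enters the viewport; the selector, class name and threshold are made up for illustration, and in the real app the observer lives inside Vue components.

// Minimal IntersectionObserver sketch (illustrative names, not my actual code):
// flag each observed section once it scrolls into view.
const observer = new IntersectionObserver((entries) => {
  entries.forEach((entry) => {
    if (entry.isIntersecting) {
      entry.target.classList.add('is-visible');
    }
  });
}, { threshold: 0.25 });

document.querySelectorAll('[data-observe]').forEach((el) => observer.observe(el));

The wheel listener on the cover page, instead, is declared in the template with something like @wheel.passive="onWheel", so that Vue registers it as a passive listener.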

That said, I’m not an expert on scroll performance measurement, but here are before/after screenshots from Chrome DevTools.

Profile using React and scroll events
Profile using Vue.js and Intersection Observers

Polyfills, polyfills everywhere

Since Vue.js is compatible with IE9+ and I wasn’t using any particular ES6 feature other than Promises, the other significant improvement was removing babel-polyfill and importing just the required modules from core-js.
That saved me around 40KB in size.

Importing just relevant polyfills
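In practice the application entry point ends up importing something along these lines (a sketch: the paths assume core-js 2.x, and the IntersectionObserver polyfill shown is the standalone WICG package):

// Pull in just the polyfills the app actually needs instead of all of babel-polyfill.
import 'core-js/fn/promise';
// The IE polyfill for IntersectionObserver lives in a separate package.
import 'intersection-observer';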

The latest version of my boilerplate uses babel-preset-env, which should replace the babel-polyfill import with just what’s needed based on your browserslist configuration. Anyway, I target IE11 and the preset doesn’t scan for actual feature usage (that’s still an experimental option), so I ended up loading Map and Set even though I wasn’t using them.
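For reference, the relevant .babelrc section looks roughly like this (the targets are indicative, not my exact browserslist):

{
  "presets": [
    ["env", {
      "targets": { "browsers": ["last 2 versions", "ie >= 11"] },
      "useBuiltIns": true
    }]
  ]
}

With useBuiltIns: true the babel-polyfill import gets rewritten into individual core-js imports chosen from the targets, not from what the code actually uses, hence the unwanted Map and Set polyfills.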

Other size-shavings

To improve performance I managed to shave some more weight off the website. First of all I optimized the images I was loading (shame on me for not having done that in the first release). That saved me 80KB.

Images weight before (~200KB) and after (~120KB) optimization

Then I removed or replaced the lodash functions I was importing in v1. Lodash is a great library, but even when loading individual function modules you can end up bundling a rather large amount of code. In my case I was importing debounce, omit, has and isPlainObject.

debounce was used on the window resize event, but that’s a pretty rare event in real-world browsing; debouncing it was a premature optimization. The other functions were either useless (I only needed them in the React components) or easily replaceable with native ES features and ad-hoc utilities, so I ended up removing them for good.
My total saving on the vendors bundle was 26KB, while the application bundle increased by just 0.3KB.
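The replacements are tiny. Something along these lines (illustrative implementations, not necessarily the exact utilities I ended up with):

// Ad-hoc stand-ins for the dropped lodash helpers (illustrative only).
const has = (obj, key) => Object.prototype.hasOwnProperty.call(obj, key);

const isPlainObject = (value) =>
  Object.prototype.toString.call(value) === '[object Object]';

// omit can often be replaced with object rest destructuring:
// const { skipMe, ...rest } = props;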

Assets Preload

For the finishing touch, I implemented the HTML5 <link rel="preload"> functionality for CSS and JavaScript assets. This feature is not supported by all major browsers at the time of writing, but it will be soon, and in any case the markup is simply ignored by non-supporting browsers. I’m not the right person to explain the details of preload (this is a great introductory article), but in short it instructs the browser on how to manage discoverability, fetching and parsing of your external assets in an asynchronous way.
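In markup it boils down to a couple of extra link tags in the document head (file names here are placeholders):

<!-- Hint the browser to fetch critical assets early (placeholder file names). -->
<link rel="preload" href="/assets/app.css" as="style">
<link rel="preload" href="/assets/vendors.js" as="script">
<link rel="preload" href="/assets/app.js" as="script">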

It’s a fairly simple task, but it drastically changes how the browser handles your assets. Here are before/after screenshots of the Chrome network panel:

Plain old assets inclusion
Network waterfall with rel=”preload”

As you can see, the browser discovers the resources I marked with rel="preload" earlier and prioritizes them even if they come later in the document.

For example, while in the first screenshot Google Fonts (css?family=Bree+Serif...) are loaded as soon as they are discovered, in the second screenshot you can see that they are loaded later, because I prioritize the JavaScript assets and my stylesheet in order to render as early as possible.

What about css-modules?

This chapter is going to be pretty short. I like css-modules and have used it on some projects by now. It’s a clean solution for CSS-in-JS interoperability when you cannot or don’t want to use styled-components. Moreover, paired with PostCSS and CSSNext, it virtually allows you to drop Sass in favor of the latest CSS goodness.

Anyway, it needs to store a reference in the application bundle for every component class you write in the CSS; and if you use the camelCase: true option you end up with two references for every dashed class name. Even with a 5-character hash string you can end up adding a few kilobytes to the final bundle (plus the parse overhead).

That said, using the scoped styles feature of vue-loader seems to be a better option: it keeps a fairly stable CSS scoping while keeping your bundle size low.
I’ll be fair with you: I was too lazy to test that…
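In case you’re curious anyway, this is roughly what the scoped approach looks like in a .vue single file component (a made-up example):

<template>
  <figure class="card">
    <img class="card__image" :src="src" alt="">
  </figure>
</template>

<script>
export default {
  props: { src: String },
};
</script>

<style scoped>
/* vue-loader adds a unique data attribute to the component's elements
   and appends it to these selectors (e.g. .card[data-v-xxxxxxx]),
   so no class name map ends up in the JavaScript bundle */
.card { margin: 0; }
.card__image { width: 100%; }
</style>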

Conclusion

So, given my assumptions, what are the outcomes?

Here is the visualization from webpack-bundle-analyzer of the final build:

And here are some metrics collected by Lighthouse with a mid-tier mobile preset on a fast 3G network:

It is pretty clear that the vendor bundle has decreased in size. I was surprised that the application bundle size was more or less the same: vue-loader applies some heavy optimizations to the template file, so I expected something better out of the box. If I had used scoped CSS instead of css-modules I could have saved a couple of KB, but nothing really impressive.

Overall I achieved my goal of improving the website’s performance without compromising the user experience. I was able to port everything I made with React to Vue.js and tested anime.js as an alternative to GSAP.

If you’re wondering whether I’m advocating Vue.js instead of React or anime.js instead of GSAP, that’s not my point. Every time you pick a stack you’re making compromises: choose the ones that hurt you (as a developer) and your project less. And finally, as this story pointed out, don’t write code based on assumptions people might advertise or advocate; be critical of those assumptions and put them to the test whenever you can.

Bonus: what if WebGL…?

The problem when you clean up your house is that all the free space you just created looks so perfect for that shiny new 60 inch TV you saw yesterday at the mall.

Yep, with all that free space I saved during the rewrite I felt like I could add something cool without feeling guilty; so I implemented some small 2D WebGL experiments with pixi.js.

I added an animated background and applied a subtle looping animation to the cover picture. pixi.js and WebGL perform very well: by leveraging the GPU they don’t get in the way of other tasks like, for example, user interactions.

While I like the outcome, I found pixi.js’ syntax a bit too verbose, and it forced me to keep anime.js in place even if, by converting every animation to CSS, I could have removed it from the stack.
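To give an idea of the kind of setup involved, here is a heavily simplified sketch (asset names and values are made up, and the real scene is more involved):

// Simplified pixi.js setup (illustrative only, not my actual scene).
import * as PIXI from 'pixi.js';

const app = new PIXI.Application({
  width: window.innerWidth,
  height: window.innerHeight,
  transparent: true,
});
document.body.appendChild(app.view);

const cover = PIXI.Sprite.from('cover.jpg');
app.stage.addChild(cover);

// Subtle looping movement driven by the shared ticker.
app.ticker.add(() => {
  cover.y = Math.sin(app.ticker.lastTime / 1000) * 4;
});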

Anyway, here are some metrics again:

Lighthouse score 47?! Oh god… I’ve broken everything! Looks like the TV I bought is too large for my living room!

The issue here is obviously pixi.js being a very large library, plus the addition of some more logic to the application. I’d like to stress that adding pixi.js to the stack doesn’t just affect the overall page size but also the script parse/compile phase of the browser.

Even if an obvious solution would be to roll back to the non-pixi.js version, I wanted to experiment a bit and try to optimize it.

Since none of the pixi.js generated visuals are really essential to the layout, my solution was to delay their download and initialization until after the core application has started. To achieve this I used a webpack feature called code splitting with dynamic import(). This way I was able to remove both pixi.js and its related components from the main application file and generate a separate file to be loaded asynchronously.

Achieving this result in Vue.js is a matter of a tiny change to the codebase. Instead of:

import MyComponent from './myComponent';

You write:

const MyComponent = () => 
import(/* webpackChunkName: "async" */ './myComponent');

That’s all!

Note that the comment inside the import call is totally optional; it’s used to tell webpack what name the chunk should have. If you use that name multiple times, those async components will be grouped into a single chunk.
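The async component is then registered and used exactly like a regular one, for example (a generic sketch):

// The lazily loaded component is declared like any other local component;
// webpack fetches the "async" chunk the first time it actually renders.
export default {
  components: {
    MyComponent,
  },
};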

To better understand how to use code-splitting in Vue.js I suggest you check out this awesome talk by Sean Larkin himself.

With that simple change in place webpack generates an additional chunk file, which I called pixi, and here are the updated results:

As you can see, the Lighthouse metrics roughly match those of the static version. What’s changed is the JavaScript size; now I have two chunks: one for the application and another containing both the async components and pixi.js. Note that since it’s used just by the async components, the library ended up in that chunk instead of the vendors file. Another performance improvement we got for free!

Bonus 2: print for the vintage people

One side effect I experienced after applying those changes was that when I tried to print the page in Chrome, the page and its print dialog went crazy: the dialog popped in, popped out after a second, and the page layout got messy.

I just clicked on “Print…”

After a little investigation it turned out that when you issue a print command, Chrome resizes the page to A4 (by default) and renders it as a PDF. This would be totally fine, except that it triggers all CSS breakpoints and JavaScript event handlers!

That’s a problem because in my code I use a v-resize custom directive to attach handlers to the window's resize event. This way I can control the pixi.js canvas size and re-render it whenever the window size changes.

To fix the issue I had to rely on a print flag to filter those resize-on-print events from normal resize events:
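The gist of it is along these lines (a hedged sketch, not necessarily my exact implementation):

// Skip resize handling while the browser is re-rendering the page for print:
// beforeprint/afterprint simply toggle a flag checked by the resize handler.
let printing = false;

window.addEventListener('beforeprint', () => { printing = true; });
window.addEventListener('afterprint', () => { printing = false; });

function onResize() {
  if (printing) {
    return; // ignore the synthetic resize triggered by the print dialog
  }
  // ...resize and re-render the pixi.js canvas here
}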

Yes… I know that code is not so cool, but at least it fixed the problem.

Conclusion (for real)

After I updated the codebase I realized there are more improvements I could experiment with to enhance performance. For example — because I switched to HTTPS — I could leverage HTTP/2 multiplexing by splitting my application into smaller chunks. Then, instead of relying just on client-side code to render the page, I could add a pre-render build step to generate static HTML. There’s a great webpack plugin which performs that task. I could even rewrite the entire app with something lightweight like Hyperapp or Preact, or wait a bit longer to enjoy the latest time slicing / suspense goodness.

Anyway, that’s all for now. It has been an interesting journey, proving — beyond all the technical facts — that we should never forget we are a sort of scientists of the web, and as such we should drop every prejudice and keep experimenting our way through the code.


Marco Solazzi

Frontend Web Developer, technical writer and speaker from Verona (Italy). Co-founder of Frontenders Verona Meetup.