Optimizing Angular application load performance

Ville Lahdenvuo
Published in Grano
Jul 12, 2018 · 8 min read

Optimizing performance is a complex issue. You are taught that premature optimization is bad, and it’s also hard to sell to your Product Owner, since many have an “if it works, don’t fix it” mentality. Here’s how we are tackling the issue.

Using automated tools to measure the impact of your changes is crucial

We’re currently working on a new E-Commerce solution for Grano, aiming to provide a better customer experience by modernizing our platform and improving automation and integration with our production services. We’re working with an enterprise-level product and environment: Node.js, Angular, Redux and AWS, accompanied by high code quality and modern development standards.

Luckily our Product Owner understands the importance of performance and many big companies are making great points about performance and have done actual research to measure its importance:

We found that 53% of mobile site visits are abandoned if pages take longer than 3 seconds to load. — DoubleClick by Google

So, armed with these results and the wish to improve, we have been working on including performance tickets in our sprints instead of leaving them to rot at the end of the backlog.

To understand our challenges and issues you should know a bit about our application and how it works. Running Apache or Nginx is not the same as delivering your application through a distributed CDN, and you could be using server-side rendering or doing everything in the browser.

Our service runs on Amazon Web Services and we take advantage of many tools that AWS offers. The Grano Shop application consists of a Node.js API running on EC2, an Angular storefront application hosted on S3 and CloudFront, an Angular admin interface hosted the same way, several AWS Lambda functions and a bunch of other services.

Measuring

You cannot improve something if you don’t measure it first. Sure, you can make something feel faster, but knowing that you’re actually making the right changes is crucial.

For a long time we were running Google’s Lighthouse tool manually every now and then, but to get proper measurements the best thing is to automate it. To do this we utilised our existing continuous integration infrastructure and added a step called “audit” that runs every time we deploy to production and generates a pretty report like this:

Lighthouse report summary for Grano Shop
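To give an idea of what such an audit step can look like, here is a minimal sketch that runs Lighthouse programmatically from a Node script using the lighthouse and chrome-launcher npm packages. The URL and the score threshold are placeholders, not our actual configuration.

```typescript
// audit.ts: a minimal sketch of an automated audit step (not our exact pipeline).
// Runs Lighthouse against the deployed site and fails the build if the
// performance score drops below a placeholder threshold.
import { writeFileSync } from 'fs';
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

async function audit(url: string): Promise<void> {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const result = await lighthouse(url, {
    port: chrome.port,
    output: 'html',
    onlyCategories: ['performance'],
  });
  await chrome.kill();

  writeFileSync('lighthouse-report.html', result.report as string);

  // Recent Lighthouse versions report category scores on a 0–1 scale.
  const score = result.lhr.categories.performance.score * 100;
  if (score < 80) { // placeholder threshold
    throw new Error(`Performance score ${score} is below the budget`);
  }
}

audit('https://example.com').catch((err) => { // placeholder URL
  console.error(err);
  process.exit(1);
});
```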

An important note, though, is that this report uses mobile device emulation and network throttling. We chose to track mobile performance so that we don’t forget about that smaller share of users, and to ensure the desktop performance will be even better.

Our desktop performance score is 92, and desktop accounts for 75 % of our visits.

Another tool we use to fight the bloat is Bundlesize, which lets us track the size of our code bundles and warns us if we add unexpected bloat. (Also check out BundleWatch, which seems more active nowadays.)

PR check showing that the bundle size is under control. (The “-null” is supposed to be the difference to master, but it doesn’t work for some reason.)
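For context, Bundlesize reads its size budgets from package.json. A configuration looks roughly like the following; the file patterns and limits here are made-up placeholders rather than our actual budgets.

```json
{
  "bundlesize": [
    { "path": "./dist/main.*.js", "maxSize": "300 kB" },
    { "path": "./dist/vendor.*.js", "maxSize": "500 kB" },
    { "path": "./dist/polyfills.*.js", "maxSize": "60 kB" }
  ]
}
```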
Webpack Bundle Analyzer report

Last but not least is the Webpack Bundle Analyzer, which gives a helpful breakdown of the anatomy of your bundles. It lets you see, in a visual manner, what is taking up all those precious bytes, which really helps when figuring out what to remove or lazy load to decrease the initial page load time.

This tool helped us notice an embarrassing mistake: we were including our polyfills in both the main and the polyfills bundle and therefore downloading them twice for no reason!
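If you use the Angular CLI, one way to feed the analyzer is to have the build emit webpack stats and point the analyzer at the resulting file. Here is a rough example as an npm script; note that the exact location of stats.json depends on your CLI version and output configuration.

```json
{
  "scripts": {
    "analyze": "ng build --prod --stats-json && webpack-bundle-analyzer dist/stats.json"
  }
}
```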

Optimization

This section is divided into different optimization techniques and tricks we have found and exploited. Some will hopefully be useful to you; others are very specific to our application, so they might not help you directly, but they might give you ideas you can apply to your own problems.

The history of our Lighthouse scores

Once we started optimizing, a funny thing happened: our performance score actually went down at first! In the first reports some audits had failed to complete because they timed out. Once we started making the site faster, those audits began to run and lowered our score.

TypeScript imports

We noticed that some of our dependencies were pulling in modules and code that we didn’t need. Here are some examples:

  • A validation utility was pulling in libphonenumber-js, which weighs in at about 110 KB, and we weren’t even using it
  • Moment.js is quite big and by default includes all locales
  • A dependency was pulling in crypto-js but using only 2 functions
  • Lodash was importing everything even though we only imported the functions we were using

Our application is written with Angular and TypeScript. This means we have a tsconfig.json file that allows us to tell TypeScript how to compile our code. One immensely useful option is paths. It allows you to add nice shortcuts for imports.
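As a quick illustration, a paths mapping in tsconfig.json looks roughly like this; the aliases below are hypothetical examples, not our actual configuration.

```json
{
  "compilerOptions": {
    "baseUrl": "src",
    "paths": {
      "@app/*": ["app/*"],
      "@env/*": ["environments/*"]
    }
  }
}
```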

But how did this help us reduce the bundle size? What we did was create proxy files for the big dependencies and load those instead. We also switched to lodash-es so that tree shaking works properly, and replaced Moment with moment.min.js, which doesn’t include the locale files.
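A minimal sketch of the idea, with a hypothetical file name and an arbitrary set of re-exported functions: the paths option points the bare import at a small proxy module.

```typescript
// src/app/shared/lodash.ts: a hypothetical proxy module.
// With "paths": { "lodash": ["app/shared/lodash.ts"] } in tsconfig.json,
// existing `import { debounce } from 'lodash'` statements resolve to this file,
// and the lodash-es build lets the bundler tree shake the unused functions.
export { debounce, cloneDeep, uniqBy } from 'lodash-es';
```

The Moment replacement works the same way: a similar paths entry points the moment import at the pre-built moment.min.js file, which ships without the locale files.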

Lambda@Edge

We use AWS Lambda@Edge to run code at the CloudFront CDN edge locations. This allows us to modify the HTTP requests while keeping latency down. Speaking of CloudFront, make sure you have enabled HTTP2 to allow the browser to request multiple resources over the same TCP connection and prioritise them better.

For example, we render the index.html with a Lambda function and embed a JSON payload in the HTML to avoid a round trip to the Grano Shop API before anything can be rendered.
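To give a feel for the approach, here is a simplified sketch of a viewer-request handler along those lines. The template, the state-fetching helper and the window.__INITIAL_STATE__ convention are placeholders, not our production implementation.

```typescript
// A simplified Lambda@Edge sketch: serve a rendered index.html with a JSON
// payload embedded so the SPA can boot without an extra API round trip.
import { CloudFrontRequestHandler } from 'aws-lambda';

// Placeholder template; the real one is the built index.html of the app.
const indexTemplate =
  '<!doctype html><html><head></head><body><!--STATE--><app-root></app-root></body></html>';

// Placeholder helper; the real one would call the shop API.
async function fetchInitialState(): Promise<object> {
  return { categories: [] };
}

export const handler: CloudFrontRequestHandler = async (event) => {
  const request = event.Records[0].cf.request;

  // Let requests for actual assets (JS, CSS, images) pass through to S3.
  if (request.uri !== '/' && request.uri !== '/index.html') {
    return request;
  }

  const state = await fetchInitialState();
  const body = indexTemplate.replace(
    '<!--STATE-->',
    `<script>window.__INITIAL_STATE__ = ${JSON.stringify(state)}</script>`
  );

  return {
    status: '200',
    headers: {
      'content-type': [{ key: 'Content-Type', value: 'text/html; charset=utf-8' }],
    },
    body,
  };
};
```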

Cache Headers

As mentioned earlier we use AWS S3 and CloudFront to host our application. By default CloudFront caches resources, but it doesn’t send HTTP cache headers to the browser.

We have another Lambda function that adds an HSTS security header to responses, telling the browser to always load the site over a secure connection. We modified it to look at the request and, if the file name contains a hash before the file extension, add a cache header with a long expiry time. Another option would have been to set the headers as S3 object metadata, but that would have required more changes to our deployment scripts.
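A simplified sketch of what such an origin-response handler can look like; the trigger type, the hash regex and the max-age values are illustrative rather than our exact configuration.

```typescript
// Simplified Lambda@Edge origin-response sketch: add HSTS to every response
// and a long-lived Cache-Control header to files whose name contains a build
// hash before the extension (e.g. main.3b2f9c1d.js). Values are illustrative.
import { CloudFrontResponseHandler } from 'aws-lambda';

const HASHED_FILE = /\.[0-9a-f]{8,}\.\w+$/i;

export const handler: CloudFrontResponseHandler = async (event) => {
  const { request, response } = event.Records[0].cf;

  response.headers['strict-transport-security'] = [
    { key: 'Strict-Transport-Security', value: 'max-age=31536000; includeSubDomains' },
  ];

  if (HASHED_FILE.test(request.uri)) {
    response.headers['cache-control'] = [
      { key: 'Cache-Control', value: 'public, max-age=31536000, immutable' },
    ];
  }

  return response;
};
```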

With Angular, hashing the file names is a simple configuration option. Keep in mind, though, that when using the Angular CLI the assets folder is copied as-is, so keep your assets in the src folder, preferably next to the component that requires them, to get them hashed and therefore cached properly. For example, we had some vector images that were referenced in components, and once we moved them into the component folders they were hashed properly during the build process.
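For reference, the option in question is outputHashing in angular.json; the CLI’s default production configuration already enables it. Shown here heavily trimmed down.

```json
{
  "configurations": {
    "production": {
      "outputHashing": "all"
    }
  }
}
```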

Lazy Loading

Speaking of Angular and the CLI, make sure to lazy load your routes; that way the initial payload stays small as your application grows more complex and full of features. Since the generated bundles are hashed, they are cached properly thanks to the cache headers from the earlier section.
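A typical lazy-loaded route set-up looks something like the following. The feature modules are hypothetical; the string syntax shown is what the Angular CLI used at the time of writing, while newer Angular versions use a dynamic import() instead.

```typescript
// app-routing.module.ts: hypothetical feature modules, loaded on demand.
import { NgModule } from '@angular/core';
import { RouterModule, Routes } from '@angular/router';

const routes: Routes = [
  // String syntax used by the Angular CLI at the time of writing:
  { path: 'products', loadChildren: './products/products.module#ProductsModule' },
  // Newer Angular versions use dynamic imports instead:
  // { path: 'checkout', loadChildren: () => import('./checkout/checkout.module').then(m => m.CheckoutModule) },
];

@NgModule({
  imports: [RouterModule.forRoot(routes)],
  exports: [RouterModule],
})
export class AppRoutingModule {}
```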

Resource Hints

Another useful browser feature is Resource Hints. They allow you to use HTTP headers or HTML link tags to tell the browser ahead of time that it will need a specific resource. The most common example is web fonts.

In the index.html renderer Lambda we load a list of web font files from S3 and add a preload tag for each, so the browser starts downloading the fonts right away instead of discovering them only after downloading and parsing the CSS. This allows the browser to render the text as soon as the CSS has been processed instead of waiting for the fonts to load. Note that web fonts always require the crossorigin attribute!

One interesting case in which this applies even if the fetch is not cross-origin is font files. Because of various reasons, these have to be fetched using anonymous mode CORS (see Font fetching requirements if you are interested in all the details). — MDN
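In practice the hints end up as link tags in the document head. Here is a small sketch of how the renderer Lambda might build them; the file names are placeholders.

```typescript
// Hypothetical helper used by the index.html renderer: turn a list of font
// files (read from S3 at render time) into <link rel="preload"> tags.
function addFontPreloads(indexHtml: string, fontFiles: string[]): string {
  const preloadTags = fontFiles
    .map(
      // Fonts must always carry the crossorigin attribute, even on the same
      // origin, because browsers fetch them in anonymous CORS mode.
      (file) => `<link rel="preload" href="/${file}" as="font" type="font/woff2" crossorigin>`
    )
    .join('\n');
  return indexHtml.replace('</head>', `${preloadTags}\n</head>`);
}

// Example usage with placeholder file names:
const html = addFontPreloads('<html><head></head><body></body></html>', [
  'fonts/brand-regular.woff2',
  'fonts/brand-bold.woff2',
]);
```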

Here you can see that the fonts are downloaded immediately and the hashed resources are cached.

Icons

We use Font Awesome icons on our site, and we were on version 4, loaded as a web font. To reduce the load time we upgraded to Font Awesome 5 and started using the Angular library it provides to load the icons. This way our build only embeds the required icons into our JavaScript bundle instead of making the browser download all icons, as the web font did earlier.
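With the Angular library you register only the icons you actually use, roughly like this; the module layout and icon choices below are just examples.

```typescript
// icons.module.ts: register only the icons the application actually uses,
// so they are bundled as SVG data instead of shipping the whole icon font.
import { NgModule } from '@angular/core';
import { FontAwesomeModule } from '@fortawesome/angular-fontawesome';
import { library } from '@fortawesome/fontawesome-svg-core';
import { faSearch, faShoppingCart } from '@fortawesome/free-solid-svg-icons';

library.add(faSearch, faShoppingCart); // example icons

@NgModule({
  imports: [FontAwesomeModule],
  exports: [FontAwesomeModule],
})
export class IconsModule {}
```

Templates then reference the registered icons by name, for example <fa-icon icon="shopping-cart"></fa-icon>.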

Future ideas

As you saw in the Lighthouse report we still have plenty of work to do and we have some ideas we want to look into.

Better lazy loading

We can lazy load more. We added lazy loading afterwards; that is, we didn’t have it when we started building the application, so now we have to reorganise the application and move things around to make it lazier.

Using a Service worker

A service worker won’t help with the initial page load, but with one you can take full control of the browser cache and make sure the application works offline. In our case we could let you browse products and prepare your order even if you’re offline.

Smaller Polyfills

We are currently loading polyfills to support IE11. However, most of our users are on modern browsers that have built-in support for the features we need. There are ways to download only the necessary polyfills, for example by checking the browser’s user agent string. There are even services like Polyfill.io that do it for you.

The takeaway

Performance is an uphill battle, but one that is always worth it. Investing in performance and using the right tools can make a big difference in the user experience. Please 👏 and 💬 if you found this interesting and would like a second part!

  • Keep performance in mind when developing — perhaps you don’t need all those 5000 npm modules
  • Use the right tooling and measure everything
  • Take advantage of new browser features such as HTTP2 and Resource Hints
P.S. We’re hiring!

Ville works as a Lead Developer at Grano, the most versatile content service provider in Finland and the leading graphic industry company in the Nordics. Currently he is working on Grano’s E-Commerce solutions.
