Debugging Ember-cli Build Times

Brian Runnells
4 min read · Dec 11, 2015


Earlier this summer we noticed the build times on our Ember.js app had gotten, well, slow. Our deployment process would get to the ember build step, which would then take upwards of 2–3 minutes to complete. This is not a pressing problem if you are deploying infrequently, but at Heroku we are building and deploying dozens of times a day as part of our development process using Heroku’s indispensable Review App feature. This was a real bottleneck in our feedback loop.

The problem, though, beyond the slow speed itself, was the lack of information about what exactly ember build was spending its time doing. It wasn’t until a few weeks ago that I was able to address this; the results of my efforts were surprising and seemed worth sharing.

Debug Tools

The real breakthrough came when I saw this pull request in the ember-cli repo that outlined steps one could take to debug build problems.

There is a lot of good information in the PR, but the tool I found to provide quick results was broccoli-viz. To use it you can follow these steps:

# 1. install graphviz
brew install graphviz
# 2. generate visualization data of a production build
BROCCOLI_VIZ=true ember build --environment=production
# 3. generate a graph based on the data from the previous step
dot -Tpng graph.<version>.dot > out.png

The result of this process is a very large graph which visualizes the work of the ember build.

Visualization of an ember build

The amount of data can be overwhelming, but my eye was quickly drawn to one of the final steps at the bottom of the visualization.

For some reason our build was spending a whopping 90 seconds in the AssetRewrite step. Something was not right.

Finding The Bottleneck

The next step was to determine where the asset rewrite step was spending its time. The easiest way to do so was to drill down in my project’s node_modules folder until I got to the module I was interested in. In this case that was broccoli-asset-rewrite, the library responsible for rewriting fingerprint information in our ember builds.

To start, I utilized node’s console.time/console.timeEnd helpers to get a high-level view of which files the asset rewrite step was spending its time on.

/node_modules/broccoli-asset-rev/node_modules/broccoli-asset-rewrite/index.js
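The instrumentation looked roughly like this (a self-contained sketch, not the plugin’s actual source — `rewrite` is a stand-in for broccoli-asset-rewrite’s per-file processing, and the file names are made up):

```javascript
// Stand-in for the real rewriting work done on a single output file.
function rewrite(contents) {
  return contents.replace(/app\.js/g, 'app-abc123.js');
}

// Wrap each file's processing in console.time/timeEnd, labeled by file name.
['assets/app.css', 'assets/app.js'].forEach(function (file) {
  console.time(file);                        // start a timer for this file
  rewrite('<script src="app.js"></script>'); // the work being measured
  console.timeEnd(file);                     // logs e.g. "assets/app.js: 0.05ms"
});
```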

Running ember build --environment=production at this point yielded some interesting results:

assets/app-27a814a2bec8a8b1b2de7f84608fd1db.css: 7048ms
assets/app-ea7566323b2c73a234fcb0174fbf8014.js: 41971ms
assets/vendor-5ca1d4f8c0663308018088753e1c9024.css: 142ms
assets/vendor-fceb84c3ddd7df85f756a2727ba5d329.js: 29400ms

This timing information confirmed that rewriting these files was slow but didn’t identify why. To get a better sense for where each individual source file was spending its time I added another pair of timers inside the for loop that iterates over assetMapKeys — the list of individual files that were fingerprinted and need to be rewritten into our source files.

/node_modules/broccoli-asset-rev/node_modules/broccoli-asset-rewrite/index.js
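Inside the loop, the per-asset timers looked roughly like this (again a sketch; `assetMap` here is made-up sample data standing in for the map broccoli-asset-rev produces):

```javascript
// Sample asset map: original path -> fingerprinted path.
var assetMap = {
  'images/sprite/1x/foo.png': 'images/sprite/1x/foo-abc123.png',
  'assets/app.js': 'assets/app-def456.js'
};

// Rewrite one source file, timing each assetMap key individually.
function rewriteAll(contents) {
  Object.keys(assetMap).forEach(function (key) {
    console.time(key);                              // per-asset timer
    contents = contents.split(key).join(assetMap[key]);
    console.timeEnd(key);                           // e.g. "images/sprite/1x/foo.png: 19ms"
  });
  return contents;
}

console.log(rewriteAll('<img src="images/sprite/1x/foo.png">'));
```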

This yielded a very long list of individual files being processed as part of the larger asset rewrite.

images/sprite/1x/activity-small-config-var-remove.png: 19ms
images/sprite/1x/activity-small-initial-release.png: 18ms
images/sprite/2x/activity-small-initial-release.png: 50ms
...

The vast majority of the files being iterated through were icons we use to generate our sprite image. This is when the lightbulb went off.

The Problem

We use ember-sprite to generate our sprite, and one of the things it does (because it has to) is copy all our source icons to the /dist folder during the build so that it can use them to generate the sprite. A side effect is that all these icons are then fingerprinted along with the rest of our assets, even though we won’t actually be serving them to users. Because each icon gets fingerprinted, each one is added to the asset map used by broccoli-asset-rev, which means broccoli-asset-rewrite has to iterate over every one of them as it rewrites our source files to point to fingerprinted assets.

With a small number of source icons this might not be a noticeable issue, but since we have ~500 icons between our retina/non-retina sprite the ~80ms penalty per file added up to create a painful bottleneck.
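The back-of-the-envelope math (numbers from above; the per-icon cost is approximate) shows how this adds up for a single large source file:

```javascript
// ~500 sprite icons in the asset map, ~80ms of find-and-replace
// work per icon when rewriting one large source file.
var iconCount = 500;
var costPerIconMs = 80;
var totalSeconds = (iconCount * costPerIconMs) / 1000;
console.log(totalSeconds + 's'); // 40s — on the order of the app.js rewrite time above
```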

The Fix

As is often the case, the fix was actually quite easy. In our ember-cli-build.js file I explicitly excluded the folder that contains our sprite source icons like this:

fingerprint: {
  exclude: ['images/sprite']
}
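In full context, the option goes on the EmberApp constructor in ember-cli-build.js (other options elided; this assumes the standard ember-cli build file layout):

```javascript
// ember-cli-build.js
var EmberApp = require('ember-cli/lib/broccoli/ember-app');

module.exports = function (defaults) {
  var app = new EmberApp(defaults, {
    fingerprint: {
      // Skip the sprite source icons: they only feed sprite generation
      // and are never served to users directly.
      exclude: ['images/sprite']
    }
  });

  return app.toTree();
};
```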

Now when I reran the production build I saw a big difference in time spent in asset rewrite for each source file:

assets/app-466cad4b02539ee10116c32084f6689b.js: 943ms
assets/app-cd807f9bb7282df53cad45783254d773.css: 168ms
assets/vendor-5ca1d4f8c0663308018088753e1c9024.css: 4ms
assets/vendor-fc9e083f2b1cc5f5a20a45658e81e97b.js: 645ms

That’s a ~98% reduction in time spent in asset rewrite from changing one line of configuration!

The Takeaway

While the underlying issue in this case was related to a very specific scenario, the broad approach for dealing with slowness in an ember build should be helpful for debugging any number of issues.
