Scalable Static Ecommerce with Hugo

Chris Marshall
Dec 12, 2017 · 11 min read

Can Static Site Generators offer a viable solution for online stores at scale?

Static site generators (SSGs) have been getting a lot of attention over the past few years. As someone who started building websites in the late 90’s, the idea of a return to pure, raw web building has a certain nostalgic appeal (let’s just forget the horrible frames, rollover gifs, server-side includes and tables eh?).

But it’s not nostalgia that’s driving the uptake in interest. SSGs have been positioned as a simpler alternative to database-driven CMSs such as WordPress and Drupal for small, content-driven websites. There are also indications that the idea can work for larger websites.


The pros of static sites are well-documented, but by way of a summary:

  • Performance: no application, no database, all pre-rendered HTML. Time to first byte (TTFB) is minimal — static sites are like one lovely, big, warm cache.
  • Scalability: by the same token, the reduced server load and the ability to serve the entire site from a CDN means that spikes in traffic can be easily managed. Fewer overworked CPUs, no database bottlenecks, simpler and cheaper infrastructure.
  • Security: with no application and no database to attack, there’s nothing beyond view source.

These are all good reasons to use an SSG for your content site. But — for someone who has worked exclusively in ecommerce for the past 10 years — these sound like amazing reasons to use static site generators for online shops.

There are plenty of tutorials and guides out there showing how to use SSGs for basic ecommerce stores — with a few simple products. The prevailing view, however, is that SSGs are not suited to larger, more complex, dynamic online stores. Is that so? Considering this led me to think about how one could develop a scalable, static ecommerce architecture (henceforth referred to as SSEA).


Now, not all online stores have the same requirements. I am not suggesting for one second that Amazon should consider going static. Nor any store with a large catalogue of tens of thousands of products.

Most stores however, are not nearly so large. We work with a number of well-known fashion brands. They usually have a few hundred products — one or two thousand at most. I did some research beyond our own portfolio and found it was the same case with many popular brands (note here that I’m talking about products — once variations such as sizes etc. are taken into account, 1,000 products can translate into, say, 5,000 SKUs).

And do you know what? These sites are not that dynamic. New products tend to appear seasonally. Categories are largely fixed. In fact the only time a product detail page (PDP) generally needs to change is if a product goes out of stock or its price changes — and these things don’t happen that often.

If a simple product (one with no size/colour variations) goes out of stock, the add-to-bag button needs disabling and — optionally — the product needs removing from the product list. For a configurable product, the size/colour/whatever option that has gone out of stock needs removing from whatever UI allows the user to choose it.
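In template terms, the simple-product case can be sketched as follows. This is a sketch only, assuming a hypothetical `in_stock` front-matter flag maintained by whatever process pushes stock data into the site:

```go-html-template
{{/* "in_stock" is a hypothetical front-matter flag, not from the article */}}
{{ if .Params.in_stock }}
<button class="add-to-bag">Add to bag</button>
{{ else }}
<button class="add-to-bag" disabled>Out of stock</button>
{{ end }}
```

When the flag flips, the regenerated page carries the disabled button; nothing needs to happen client-side.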

Dynamic features

But what about dynamic features such as recommendations and search? In a surprising number of cases — even on powerful, popular Enterprise platforms such as Magento — these features are delivered asynchronously by 3rd party technologies. Nosto might deliver personalised product recommendations. Algolia might power the search. Optimisation platforms such as Optimizely work in a similar way — static sites can easily integrate these technologies.

Shopping cart

Cart functionality can also be integrated in a similar way. ‘Headless’ platforms such as moltin and JS-based carts such as Snipcart can bring promotions, checkout, payments, shipping, user authentication and all the other functionality required to offer a full transactional experience. And don’t be fooled into thinking that you’d be trading away flexibility either. These systems are capable, extendable and can integrate with 3rd party ERP/WMS/OMS systems as required — just like enterprise platforms that cost thousands per month and hundreds of thousands to build.
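As an illustration of how light this kind of integration can be, a Snipcart-style buy button is just markup decorated with data attributes; the cart’s JavaScript handles the rest. The attribute names below follow Snipcart’s documented pattern, but the product values are illustrative and the exact attributes should be checked against their current docs:

```html
<!-- illustrative product values; attribute names per Snipcart's documented pattern -->
<button class="snipcart-add-item"
        data-item-id="9000-6"
        data-item-name="Sample Derby Shoe"
        data-item-price="295.00"
        data-item-url="/product/sample-derby-shoe/">
  Add to bag
</button>
```

Because the button is plain HTML, a Hugo template can generate it per-variant at build time, exactly like the rest of the page.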

Putting it together

So we have the functionality we need to deliver a transactional site. How do we build it and how do we keep it up-to-date? How do we reflect the changes in stock described above?

In order to do this, our SSEA must meet certain criteria:

  • Speed: it should be able to generate a site build quickly. Even though changes to files may not be that frequent, when they do happen they need to be reflected on the live site ASAP to avoid shopper frustration.
  • Data: it must have a flexible data model that can support product information such as product details, stock levels and prices, but also content pages, blogs and menu systems.


When it comes to SSGs that run at speed, Hugo is the last word. I’d read about this and was keen to put it to the test. How long would it take to generate a site’s worth of product detail pages?

To measure this, I created some sample product data based on one of our clients’ sites. This site features content-rich PDPs, with content blocks dynamically included on the page based on the product’s attributes. On the site this is achieved through some Magento/Wordpress integration jiggery-pokery, but I found it was quite simple to replicate this using Hugo’s templating language.

Pulling through the product data into the template is simple enough:

Fitting: <strong>{{ .Params.fitting }}</strong><br />
Last: <strong>{{ .Params.last }}</strong><br />
English size: <strong>{{ .Params.uksize }}</strong><br />
Sole: <strong>{{ .Params.sole }}</strong>
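For reference, the front matter driving this template might look something like the following (field values here are illustrative, not real product data):

```json
{
  "title": "Sample Derby Shoe",
  "fitting": "F",
  "last": "9000",
  "uksize": "8",
  "sole": "Leather"
}
```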

Adding logic to pull in includes/partials dynamically is a little more involved, but even a lapsed coder like myself can manage it:

{{ $f1path := (print "shoefeatures/" $.Params.feature1 ".html") }}
{{ partial $f1path }}
{{ $f2path := (print "shoefeatures/" $.Params.feature2 ".html") }}
{{ partial $f2path }}
{{ $f3path := (print "shoefeatures/" $.Params.feature3 ".html") }}
{{ partial $f3path }}
{{ $f4path := (print "shoefeatures/" $.Params.feature4 ".html") }}
{{ partial $f4path }}

All we’re doing here is building paths to partials (includes) to bring in content in a (semi) dynamic fashion.
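One refinement worth noting: if a product defines fewer than four features, the bare partial call would try to include a file that doesn’t exist and fail the build. Wrapping each lookup in `with` skips empty params. This is a sketch of my own, not from the original setup:

```go-html-template
{{/* "with" skips the include entirely when the feature param is empty */}}
{{ with .Params.feature1 }}{{ partial (print "shoefeatures/" . ".html") $ }}{{ end }}
{{ with .Params.feature2 }}{{ partial (print "shoefeatures/" . ".html") $ }}{{ end }}
```

Passing `$` hands the page context through to the partial, in case it needs other product attributes.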


So — fairly involved, but how does it perform? I created 1,000 product content files (duplicates) — these are JSON files containing name, price, attributes, variant data etc. as ‘front matter’. These are piped into the templates and partials that transform them into HTML pages.

In addition to the 1,000 product pages, Hugo also generated 50 list pages (20 products per page), an index page and an XML sitemap.

When I ran the Hugo build job, it knocked these out in less than a second — typically around 720ms on a MacBook Pro:

Built site for language en:
0 draft content
0 future content
0 expired content
1000 regular pages created
8 other pages created
0 non-page files copied
50 paginator pages created
0 tags created
0 categories created
total in 724 ms

I found this very impressive. Though there are some caveats:

  • Complexity — whilst there are complications around these templates, the HTML itself is very basic — certainly not enough to deliver the kind of rich UI you’d find on a real store. That said, I did increase the complexity of the HTML iteratively whilst experimenting, and generation times rose only minimally with each change.
  • Assets — unlike other SSGs, Hugo does not generate and compile CSS and JS etc. For this you’d need to use Gulp, Grunt or similar. This is fine for our purposes here as development deploys would not happen very often: most updates will be data such as content, inventory or pricing.

That said, these generation times are well within what I would consider tolerable. Even if we doubled the catalogue size and added two additional languages — 2,000 pages across three languages is roughly six times the current build, so around 4.3 seconds at 724ms per build — it would still come in at under 5 seconds.


Now that the performance box is well and truly ticked, what about data? The product data model is no problem, and neither is any editorial or blog-type content we may wish to add: this stuff is meat and drink to Hugo. There is also native support for hierarchical menu systems.

What about the more ecommerce-type data: pricing, for example, or inventory updates? We need a way to push this information up to the site, as it may result in changes to the HTML.

Consider the scenario discussed earlier: if a size goes out of stock, this needs reflecting on the product page. The dropdown or swatch needs amending. In your typical ecommerce setup this information would be pushed in from an ERP system.

Ideally, we’d want to only send pricing or stock information up: we don’t want to push all the product data each time. Hugo has a solution for this — data files.

The sample products I’m using are shoes. In the front matter data for each I’d set up details of size variants as follows:

"variants": [
  { "title": "6", "sku": "9000-6" },
  { "title": "7", "sku": "9000-7" },
  { "title": "8", "sku": "9000-8" },
  { "title": "9", "sku": "9000-9" }
]

Hugo’s data files functionality allows you to store additional data that is pulled in when generating the site. In my scenario, I needed to create price files and inventory files. These files could be generated and pushed in by an ERP system. I created a simple inventory file with stock levels for individual SKUs stored in the following JSON:

"stock": {
  "9000-1": { "allocation": 3 },
  "9000-2": { "allocation": 9 },
  "9000-3": { "allocation": 3 },
  "9000-4": { "allocation": 1 }
}

In total I have 6,000 records: 6 sizes for each of the 1,000 products, so a fairly sizeable file. I put this at data/inventory/gb.json (in Hugo, data files live under the ‘data’ directory).
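The resulting layout, and how it maps to template lookups, is worth spelling out:

```
data/
└── inventory/
    └── gb.json    (available in templates as .Site.Data.inventory.gb)
```

Hugo derives the lookup path from the directory structure: data/inventory/gb.json becomes .Site.Data.inventory.gb.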

In the product page template I found I could access the ‘allocation’ value for each SKU using the following:

{{ $stock := .Site.Data.inventory.gb.stock }}
{{ range .Params.variants }}
<li>{{ .sku }}, {{ (index $stock .sku).allocation }}</li>
{{ end }}

This pulls through the inventory level next to each size. Not precisely how you’d do it on the front end of a store, but you could do something similar and build a UI from that data.
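A sketch of what that UI-building might look like: a size dropdown that only offers in-stock variants, assuming the variant and inventory structures used throughout this post. The `with` guard also protects against SKUs missing from the inventory file:

```go-html-template
{{/* only render <option>s for sizes with positive allocation */}}
<select name="size">
  {{ range .Params.variants }}
    {{ $v := . }}
    {{ with index $.Site.Data.inventory.gb.stock .sku }}
      {{ if gt .allocation 0 }}
      <option value="{{ $v.sku }}">{{ $v.title }}</option>
      {{ end }}
    {{ end }}
  {{ end }}
</select>
```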

So 6,000 additional pieces of data to look up and display in the templates. Surely this would push the generation time out a lot? Well, it did a bit:

Built site for language en:
0 draft content
0 future content
0 expired content
1000 regular pages created
8 other pages created
0 non-page files copied
50 paginator pages created
0 tags created
0 categories created
total in 826 ms

…but this is still workable, still well under a second for a 1,000-page site.

Sample architecture

So it seems that the building blocks for developing a Scalable Static Ecommerce Architecture are there. A simplified system overview could look as follows:

ERP — system of record for orders, stock and pricing. Would send stock and price updates to the ecommerce platform and also to Hugo.

CMS — such as Drupal, or possibly a headless solution such as Contentful or Directus. Would be used to provide a friendly UI for authoring, and storing enriched product data, editorial and static content.

Hugo — combined with Grunt, Gulp or similar to manage asset creation.

Ecommerce platform — the precise flow here would vary depending on the capabilities of the platform. If the platform could trigger the generation of stock or price updates when they changed, then the ERP would probably not need to talk to Hugo.


For maximum efficiency, only updated files should be deployed after a change. Most changes to inventory levels would not result in a change to the HTML — if the stock level of an item goes from 6 to 3 then nothing changes on the front end. If the stock level goes to 0, or from 0 to any positive integer, then a change will be triggered. It is only these files that need to be deployed.
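This property falls out naturally if the templates only ever consume stock as a boolean rather than printing the raw number. A sketch, for a given variant’s .sku inside a range over the variants:

```go-html-template
{{/* render a boolean state: 6 → 3 produces identical HTML; only 0-crossings change the page */}}
{{ with index $.Site.Data.inventory.gb.stock .sku }}
  {{ if gt .allocation 0 }}In stock{{ else }}Sold out{{ end }}
{{ end }}
```

Any stock movement that doesn’t cross zero regenerates byte-identical HTML, which the deploy step can then skip.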

Changes to prices and content, as well as development deployments would always trigger a change, but these would happen less frequently.

For the purposes of my proof of concept, I tried out Netlify to see if it could manage this. I was heartened to discover that this is how Netlify manages deployments by default.

By way of a test I updated the inventory levels for a handful of products and pushed the updated file to Git. Netlify duly ran the Hugo build process, identified the changed files and uploaded only those. The whole process took around 20 seconds:

Built site for language en:
0 draft content
0 future content
0 expired content
1000 regular pages created
8 other pages created
0 non-page files copied
50 paginator pages created
0 tags created
0 categories created
total in 834 ms
9:10:18 PM: Build complete: exit code: 0
9:10:19 PM: Cleaning up docker container
9:10:19 PM: Starting to deploy site from 'public'
9:10:19 PM: Deploying to CDN
9:10:22 PM: Uploading 5 files
9:10:22 PM: Uploading file product/cadogan-tobacco-calf-16/index.html
9:10:22 PM: Uploading file product/cadogan-tobacco-calf-340/index.html
9:10:22 PM: Uploading file product/cadogan-tobacco-calf-1/index.html
9:10:22 PM: Uploading file product/cadogan-tobacco-calf-267/index.html
9:10:22 PM: Uploading file product/cadogan-tobacco-calf-102/index.html
9:10:23 PM: Starting post processing
9:10:28 PM: Finished uploading cache in 473.751738ms
9:10:28 PM: Post processing done
9:10:28 PM: Site is live
9:10:28 PM: Finished processing build request in 20.097144616s

To be resolved

So we’re getting there, but there are areas yet to be fully resolved, to name but a few:


Source control

Services such as Netlify manage code deployments through Git. In the setup I’m proposing here, there would be a new commit for every update to stock levels, which may or may not be ideal. Development and content could run as separate branches and merge back in when required, but it might get messy.

Delta changes

Inventory updates usually run as a full update overnight, with deltas applied through the day to reflect stock movements in warehouses. Here, we’ve only looked at the former — I’m not sure how we could handle the latter.

Preview and staging

This may present a problem, especially when you have developers, content editors and automated processes all updating the code.

Moving it forward

This is as far as I’ve got, but there’s enough here to make me think that this is possible, and offers enough advantages to be of interest. This goes especially for the medium-sized brand stores I mentioned earlier — reasonably-sized product catalogues and a fairly complex content requirement.

On top of the security, performance and scalability factors mentioned earlier, I would add:

Content management: Hugo can support many different types of content. Ecommerce platforms are generally lousy at handling content. At a time when retailers and brands are putting more and more emphasis on integrated content, this is a definite plus.

Flexibility: the data models in Hugo are very flexible, and the templating language offers enough power to handle many different scenarios — I’ve only scraped the surface of what is possible here.

Simplicity: store owners are often beholden to developers with knowledge of the arcane ecommerce platforms upon which they run. This setup offers a much lower barrier to entry — knowledge of the data model and the relatively simple templating language is all you really need to get going.

I will continue to develop this proof of concept as time permits, but I’d be very interested to hear from anyone who has attempted/considered/written off the approach described above.

Chris Marshall is a Director at Onstate. Father of two, husband of one.