Sustainable Web Design — How to reduce the carbon footprint of your website

António Silva
Published in tb.lx insider · 9 min read · Aug 13, 2020

Code splitting and image optimization can reduce your website’s impact on the planet

Sustainability has become a top priority for many governments, companies, and individuals as concerns about life as we know it grow. As a tech company whose vision is To Live in a World of Sustainable and Connected Transportation, we are constantly asking ourselves how we can effectively implement a sustainable approach towards our products, our people, and our technology.

When it comes to our daily interactions with the web, most people aren’t aware that every click they make, every file they download, and every standby light they leave on consumes energy and produces greenhouse gases.

So, in the name of sustainability, our Frontend Team took up the challenge of reducing the carbon footprint of our website (https://tblx.io/). And because sharing is caring, here are our key takeaways, so that you can also decrease the impact of your website on the planet.

A Screenshot of our website tblx.io

Current stack

Our website was bootstrapped with Create React App, using the latest React version. We use react-router to manage routing; the app is client-side rendered, runs an Express server under the hood, and is hosted on Azure.

Where we saw the problem

At our company, we have a continuous improvement mindset. So, a few months back, we had the idea of redesigning our website to make it more complete, detailed, and appealing.

One of the major changes was the addition of a considerable amount of new graphic assets, which, left unoptimized, would make the website very sluggish and inefficient. This would be a recipe for disaster in terms of performance, user experience, and energy consumption, and therefore carbon footprint.

The solution

Since our carbon footprint is a very important issue for us, as a company and as individuals, we decided to dedicate some time to analyzing different solutions to make our website as light and efficient as possible. Optimizing our media assets was an obvious and mandatory step that’s both simple and very effective; the other solution we chose was code splitting, which is described below.

1. Code Splitting

Before we implemented code splitting, webpack, an open-source JavaScript module bundler, generated one single large bundle that included every page and resource of the website, even those that weren’t necessary. This increases the time a website takes to load, wasting energy and bandwidth.

This is where dynamic imports come into play - and React.lazy() simplifies their use.

“React.lazy() takes a function as its argument that must return a promise by calling import() to load the component. The returned Promise resolves to a module with a default export containing the React component.” -https://blog.logrocket.com/lazy-loading-components-in-react-16-6-6cea535c0b52/

With this functionality in place, webpack creates several smaller chunks that will only be loaded when needed.

In our case, we added “lazy-loading” to both the routes and the components. This way, we have separate chunks for each page and each component, or block of components.
Since the content will now be loaded on demand, we added the Suspense component to provide a fallback, our spinner animation, while the required chunks are loaded and rendered.

Below you can see how simple it is to adapt your existing code:

Before:

import { Quote } from '../components/quote'

<Quote (…) />

After:

const Quote = React.lazy(() => import('../components/quote'))

<Suspense fallback={<LoadingAnimation />}>
  <Quote (…) />
</Suspense>

You can find a detailed explanation, including how to configure webpack for code splitting, in the React documentation [4].
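As mentioned above, we also lazy-load at the route level, so each page becomes its own chunk. A minimal sketch of what that can look like, assuming react-router v5’s Switch/Route API and hypothetical page and fallback components (the file paths are illustrative, not our actual project layout):

```jsx
import React, { lazy, Suspense } from 'react';
import { BrowserRouter, Switch, Route } from 'react-router-dom';
// Hypothetical spinner component used as the Suspense fallback
import { LoadingAnimation } from './components/loading-animation';

// Each page becomes a separate chunk; webpack only fetches it
// when the matching route is actually visited.
const Home = lazy(() => import('./pages/home'));
const About = lazy(() => import('./pages/about'));

export const App = () => (
  <BrowserRouter>
    <Suspense fallback={<LoadingAnimation />}>
      <Switch>
        <Route exact path="/" component={Home} />
        <Route path="/about" component={About} />
      </Switch>
    </Suspense>
  </BrowserRouter>
);
```

A single Suspense around the Switch is enough to cover every route, since any pending chunk bubbles up to the nearest Suspense boundary.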

2. Image Optimization

Images and videos are some of the heaviest resources you can have, so their impact on your website’s performance is significant. That’s why it’s paramount to optimize them and rethink their usage. If something doesn’t bring value, remove it! The users and the planet will thank you.

The size of our website’s media folder was over 36 MB(!). Now it’s down to 2.8 MB, and that’s with a whopping 96 images, including their WebP and LQIP (Low Quality Image Placeholder) versions.

If you are wondering how we managed to achieve a 92% reduction in the overall size, here’s how:

Next-generation image formats

WebP is an image format that’s lighter than the usual ones, like JPEG or PNG. It’s supported by most modern browsers, and if you have to support older ones, just include a fallback image tag in one of the universally supported formats.

We found that the simplest way to convert images to this format was to install the webp package. If you use brew, just run “brew install webp”. After the installation is complete, you can start converting your images with the following bash command:

cwebp images/flower.jpg -o images/flower.webp

You can either convert images one at a time (see above) or point it at a whole directory and let the loop handle every single item for you (see below). How awesome is that!?

for file in images/*; do cwebp "$file" -o "${file%.*}.webp"; done

Additionally, cwebp lets you choose the quality percentage you want, possibly making the output even lighter.

cwebp -q 50 images/flower.jpg -o images/flower.webp

The command below iterates through every image in the specified directory and converts each one with a quality level of 50%:

for file in images/*; do cwebp -q 50 "$file" -o "${file%.*}.webp"; done

The implementation is also pretty simple! The code will look something like this:

<picture>
  <source type="image/webp" srcset="flower.webp">
  <img src="flower.jpg" />
</picture>

Note: the <img> tag is the fallback, and it will only be rendered if the other types are not supported by the browser.

With this step, we actually ended up with more image files in the project, since a .webp version is created for every image we choose to convert. It’s totally worth it, though: the WebP versions are lighter than their JPEG/PNG counterparts, so the website loads and runs faster.

You can learn more about serving WebP images in [5].

Image file size and dimensions reduction

One of the best ways to significantly cut an image’s file size is to resize its dimensions according to its intended usage. For example, if you have an image with a 1920×1080 resolution, but it will be used in a 200×200-pixel container, you can resize it accordingly and save a lot of space.

At the same time, since it’s lighter, it will be loaded quicker, which means that your website’s performance, bandwidth usage, and user experience will improve. Also, you can use tinypng to reduce your recently resized image’s size even more.

Note: if the resized image’s quality is not good enough, then, using the example above, instead of resizing it to 200×200, try 400×400, and so on. Just remember to run it through tinypng, or another similar tool, afterwards.
Bear in mind, though, that while you can apply tinypng’s “magic” several times to the same image, at some point you’ll start to notice a significant loss of quality, so try to find the best balance between quality and size. With heavy images, you’ll probably be able to reduce the size by more than 70% on the first iteration, with basically no loss in perceived quality.

“On demand” lazy-loading

We use react-slick for our two sliders, and all the images were already being lazy-loaded, but progressively: they were loaded one after the other, even if they were never “called” to the viewport, and this causes lag.

With “on demand” lazy-loading, images are only loaded when they’re rendered in the viewport. For example, imagine you have an animated slider with 3 images. The second image will only be loaded when the transition occurs, or when the user clicks the arrows. If the user leaves your website or navigates to another page, the remaining images are never loaded.
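In react-slick, this is a one-line change in the slider settings: the lazyLoad option accepts "ondemand" instead of the progressive behavior. A sketch (the other flags are illustrative, not our exact configuration):

```javascript
// react-slick settings object: "ondemand" fetches a slide's image only
// when that slide is about to be shown, instead of loading all images
// progressively in the background.
const sliderSettings = {
  dots: true,
  infinite: true,
  autoplay: true,
  lazyLoad: "ondemand", // the key change for on-demand loading
};

// The object is then spread onto the slider component:
// <Slider {...sliderSettings}> ... </Slider>
```

With this in place, an image that is never scrolled or clicked into view is never requested at all.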

This is a significant performance and user experience improvement on any website, especially one with many users accessing it on mobile devices over mobile data.

In addition to this, we used lazysizes (you can also check the new native image lazy-loading [8]) to implement the LQIP feature, which shows a low-quality version of an image as a placeholder while the “real” one loads. This creates an immediate first impression for the user and reserves the viewport space needed. You can get the low-quality version by simply resizing the original image to 70% of its pixel size, as explained above. We used the Simple Image Resizer tool for this, but you can use the one you prefer.
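The markup for this pattern is small. The lazyload class and data-src attribute are lazysizes’ documented hooks; the file names below are just examples:

```html
<!-- src holds the tiny LQIP placeholder, shown immediately.
     lazysizes swaps in data-src once the image nears the viewport. -->
<img class="lazyload"
     src="flower-lqip.jpg"
     data-src="flower.jpg"
     alt="A flower" />
```

Because the placeholder has the same aspect ratio as the full image, the layout doesn’t jump when the swap happens.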

Further implementations

CDN

The concept is simple to understand: the closer you are to the hosting server, the faster you get the resources you need. But if you have a global target group, being close to everyone is easier said than done. This is the problem CDNs solve.

A content delivery network (CDN) consists of a group of servers, spread around the world, that cache a website’s assets (images, for example) and serve each user from the server closest to them. This way, assets are delivered faster, improving the website’s performance and user experience, while the hosting server’s load and bandwidth usage are reduced.

Improve SEO

“When optimizing a website for search engine rankings, we are helping people find the information they want quickly and easily. When SEO is successful, it results in people spending less time browsing the web looking for information, and visiting less pages that don’t meet their needs. This means that less energy is consumed and the energy that is consumed is used to deliver real value to the user.” — https://www.wholegraindigital.com/blog/website-energy-efficiency/

If you don’t need SEO, you can also block all bots. This way, they won’t crawl your website or waste resources indexing it.
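For reference, a robots.txt at the root of your site that blocks all well-behaved crawlers is just two lines:

```
User-agent: *
Disallow: /
```

Keep in mind this only stops crawlers that respect the robots.txt convention, which the major search engines do.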

In our case, we already made some improvements regarding SEO and accessibility, but there are still some tweaks we can and will definitely do.

Tools we used (all free):

Lighthouse

There’s no one better to describe it than its creators:

Screenshot of the example of an audit on our homepage, concerning performance, accessibility, best practices, and SEO

“Lighthouse is an open-source, automated tool for improving the quality of web pages. It has audits for performance, accessibility, progressive web apps, SEO and more. You can run it in Chrome DevTools, from the command line, or as a Node module.” — https://developers.google.com/web/tools/lighthouse/

This tool gives you tips on how to improve your website and lets you compare results from before and after your changes, since it stores the audits for the session. If you follow the “learn more” link next to each suggestion, you’ll be taken to a much more detailed page, which sometimes includes a step-by-step guide on how to implement the improvement.

You can see an example of an audit on our homepage, concerning performance, accessibility, best practices, and SEO in the picture above.

Website Carbon Calculator

“How is your website impacting the planet?” — https://www.websitecarbon.com/

From their powerful and straightforward headline, you can probably guess what the purpose of this project is. It allows you to estimate the carbon footprint of each page of your website by providing the URL. It also presents you with tips on how to improve its performance, and you can even add their badge to your page, showing its impact on the planet.

Hurrah! This web page is cleaner than 75% of the web pages tested

Tinypng

This free, magical tool comes in handy when you have some heavy PNG/JPEG files and want to reduce their size. Even if you think an image is already light and doesn’t need improvement, give it a try; you might be surprised!

In their own words: “(…) By selectively decreasing the number of colors in the image, fewer bytes are required to store the data. The effect is nearly invisible but it makes a very large difference in file size!” — https://tinypng.com/

Simple Image Resizer

This tool is also free and allows you to resize your images by percentage or to specific dimensions, among other operations. The only downside is that you have to import images one by one.

http://www.simpleimageresizer.com/

Final note

With these small improvements, we were able to reduce the carbon footprint of our website. Now every time someone clicks on https://tblx.io/, fewer greenhouse gases will be produced. Small steps are important towards a more sustainable future, and everyone can take action.

So always remember when coding: processing power = energy usage = greenhouse gases.

How did these tips work for you? Do you have more ideas on how to improve the carbon footprint of a website? Then, feel free to share your tips and thoughts.

Acknowledgements

I want to thank my colleagues António Freire and Benjamim Alves, for this great team effort and for reviewing and contributing to this article.

References

[1] https://www.wholegraindigital.com/blog/website-energy-efficiency/

[2] https://web.dev/use-lazysizes-to-lazyload-images/

[3] https://blog.logrocket.com/lazy-loading-components-in-react-16-6-6cea535c0b52/

[4] https://reactjs.org/docs/code-splitting.html

[5] https://web.dev/serve-images-webp/

[6] https://tinypng.com/

[7] https://github.com/aFarkas/lazysizes

[8] https://web.dev/native-lazy-loading/

[9] http://www.simpleimageresizer.com/

António Silva works as a Frontend Engineer for tb.lx in Lisbon, Portugal.
