Saving 10 Bora Bora Islands a Day with Image Optimisation

By optimising the Google logo images with proper image compression, a rainforest the size of 10 Bora Bora islands (30 km² each) could be saved daily.

When designing for User Experience, every superfluous bit sent over a (mobile) internet connection counts as a loss: it hurts performance and therefore UX.

A few years ago I discovered that one of the most downloaded images on the web lacks good optimisation; in other words, it could do with a smaller file size.

The Google logo, and the image sprite used across the Google domains, can be served with better optimisation (PNG compression) and therefore fewer bytes. I wondered what impact this could have on wasted energy, given the number of visitors Google receives. So please bear with me…

Compression

There is a smart way of compressing PNGs (lossy rather than lossless) that is not yet widely adopted. Even web design tools such as Sketch and Adobe Photoshop have yet to implement this technique.

This technique, based on quantisation, uses smart lossy compression to reduce the file size of your PNG files. By selectively decreasing the number of colors in the image, fewer bytes are required to store the data. The effect is nearly (if not completely) invisible, but it makes a very large difference in file size. By reducing the number of colors, 24-bit PNG files can be converted to much smaller 8-bit indexed-color images. All unnecessary metadata is stripped too. The result: better PNG files with 100% support for transparency.
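To make the idea concrete, here is a minimal pure-Python sketch of color quantisation. It is not what pngquant or tinypng.com actually run (real tools use median-cut or k-means over the image histogram); it simply snaps each channel to a few levels so that pixels can be stored as 1-byte palette indices instead of 3-byte RGB triples:

```python
# Minimal sketch of the idea behind lossy PNG compression: reduce the
# number of distinct colors so pixels fit an 8-bit indexed palette.

def quantize(pixels, levels=4):
    """Snap each 0-255 channel to one of `levels` values.

    Naive uniform quantisation; real tools (pngquant, TinyPNG) pick the
    palette adaptively with median-cut or k-means.
    """
    step = 256 // levels
    return [tuple((c // step) * step for c in px) for px in pixels]

pixels = [(13, 200, 37), (14, 201, 36), (250, 250, 250), (12, 199, 40)]
reduced = quantize(pixels)
palette = sorted(set(reduced))                   # distinct colors that remain
indexed = [palette.index(px) for px in reduced]  # 1 byte/pixel instead of 3
```

Note how the three near-identical greens collapse into a single palette entry: visually almost nothing changes, but the data shrinks.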

As you can imagine, a lot of superfluous bytes are spent on suboptimal image compression, and not only for the Google images. But for the sake of this investigation, we will keep the scope narrowed down to these images.

So, we are focusing on the logo image (shown above) and the sprite:

When we optimise these images using quantisation, we see 65% of bytes saved for the sprite image and 51% for the logo image (I used tinypng.com for this).

tinypng.com
logo image — 51%
sprite image — 65%

This comes down to 30 + 40 kB = 70 kB. So far so good; that wasn't too difficult. Here is an example of the original and the compressed version:

Original
Compressed

Do you spot any difference? The colors are the same; only the file size differs. (Right-click to 'Inspect' the element.) Now for the trickier parts…

Calculations

So far I have come up with the following calculations. I'd love to hear where I miscalculated or where my assumptions were bollocks.

# 1 : Number of downloads per day

To find the number of times these images are downloaded from the web server (not from the browser cache), we have to find out how many unique visits the search engine receives per month. This is where it gets tricky: you don't know exactly how often an image is served from the cache and when it is refreshed. For this we will use the 'max-age' of the served image (cache-control; 8.76h) in combination with the number of unique visitors.

As it appears, Google's search traffic is spread over several domains. We will only consider the 10 most popular domains in this investigation to get an idea of the number of unique visitors.

If we simply sum up the top visited domains such as:

  • google.com : 15,325,000,000
  • google.fr : 1,495,000,000
  • google.de : 1,226,000,000
  • google.co.uk : 964,500,000
  • google.co.jp : 752,000,000

and, with the other domains, reach a total of at least 20,000,000,000 unique visits for November, we arrive at a respectable number of at least 670,000,000 unique visitors per day (dividing by 30).
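As a sanity check, the per-day estimate can be reproduced in a few lines (the monthly figures are the ones quoted above; the exact division gives roughly 667 million, which the text rounds to 670 million):

```python
# November visit figures for the top Google domains (from the list above).
monthly_visits = {
    "google.com":   15_325_000_000,
    "google.fr":     1_495_000_000,
    "google.de":     1_226_000_000,
    "google.co.uk":    964_500_000,
    "google.co.jp":    752_000_000,
}

top5 = sum(monthly_visits.values())   # ~19.76 billion for the top 5 alone
monthly_total = 20_000_000_000        # rounded up with the remaining top-10 domains
per_day = monthly_total // 30         # ≈ 667 million unique visitors per day
```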

# 2 : Number of bytes wasted

Next up is calculating the number of extra bytes wasted every day. This one is simple: we take the 670M and multiply it by the number of wasted bytes (32,436).

670,000,000 × 32,436 bytes / 1,000 = 21,732,120,000 kB ≈ 21,700 GB

This boils down to roughly 21 TB (terabytes) a day. Compared to the global numbers, that is not all that impressive:

By the end of 2016, global IP traffic will reach 1.1 ZB per year, or 88.7 EB per month, and by 2020 global IP traffic will reach 2.3 ZB per year, or 194 EB per month. ~ cisco.com

But if you take into account that at least 1% of that traffic could be saved with optimisation, we are still making a strong point. Data centres have already mushroomed from virtually nothing 10 years ago to consuming about 3 per cent of the global electricity supply and accounting for about 2 per cent of total greenhouse gas emissions, giving them the same carbon footprint as the airline industry (according to The Independent). So let's continue and keep focusing on the logo.

# 3 : Wasted bandwidth in terms of energy

This might be the hardest part of the calculations: converting the wasted bandwidth into energy. There are many views and articles on this subject, all with different ideas about it. To be on the safe side, we will stick to an older study that calculated that transferring 1 GB accounts for 13 kWh of energy. Since energy consumption has only grown over the last few years, this should be a conservative number.

So, in terms of energy, what are we talking about?
21,700 GB × 13 kWh/GB = 282,100 kWh
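The two calculations so far chain together as follows (note that working from the unrounded byte count gives ≈ 21,732 GB and ≈ 282,500 kWh; the text rounds the gigabytes down first):

```python
# Daily waste in bytes, then energy, using the figures from the text.
downloads_per_day = 670_000_000
wasted_bytes_per_download = 32_436   # bytes saved on logo + sprite combined
kwh_per_gb = 13                      # older, conservative estimate

wasted_gb = downloads_per_day * wasted_bytes_per_download / 1e9  # ≈ 21,732 GB
wasted_kwh = wasted_gb * kwh_per_gb                              # ≈ 282,518 kWh
```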

# 4 : Wasted energy in terms of rainforest

For this we are relying on the calculations of the United States Environmental Protection Agency. They have a calculator where we can enter our kWh and find the corresponding effect on (US) forest.

In the results we then find:

188 acres of forest need one year to sequester the related amount of carbon; that means you need 188 × 365 = 68,620 acres (277 km²) to make up for our energy waste per day. This is the equivalent of roughly 10 Bora Bora islands.
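The final conversion can be checked too (the acre-to-km² factor of 0.0040469 is standard; with Bora Bora at about 30 km², the exact ratio lands between 9 and 10 islands, which the text rounds up):

```python
# Convert the EPA forest figure into area and Bora Bora equivalents.
acres_per_year = 188                  # EPA: acres sequestering one day's carbon in a year
acres_for_one_day = acres_per_year * 365      # acres needed to offset every single day
km2 = acres_for_one_day * 0.0040469           # acres → square kilometres
bora_bora = km2 / 30                          # Bora Bora ≈ 30 km²
```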

Final word

To say the least, I am aware my calculations are not 100% trustworthy, since many of the underlying investigations and numbers are contradictory and questionable.

The point of this article was to show how we, as web engineers, can on some occasions have a great impact on energy loss when building for the web. Besides that, optimisation has a huge impact on User Experience. Enough reason to start optimising today, perhaps beginning with Google's logo.

— disclaimer

That said, Google may well be a forerunner in improving and optimising the web, and they already do a lot about wasted bytes in general; one example is their newer image format, WebP. But they are a fair starting point for optimisation if you look at the number of servers they run.

Optimise the web, improve performance, 
save a rainforest and delight your users!

Sources

https://googleblog.blogspot.nl/2009/01/powering-google-search.html

https://theoverspill.wordpress.com/2015/10/19/searches-average-mobile-google-problem/

https://en.wikipedia.org/wiki/List_of_Google_domains

http://www.similarweb.com/website/google.com

http://alistapart.com/article/sustainable-web-design

http://www.mnn.com/green-tech/computers/stories/your-virtual-carbon-footprint-may-be-bigger-than-you-think

http://evanmills.lbl.gov/commentary/docs/carbonemissions.pdf

http://www.theguardian.com/environment/green-living-blog/2010/oct/21/carbon-footprint-email

https://www.google.com/analytics/optimize/capabilities/

http://www.npo.nl/keuringsdienst-van-waarde/31-03-2011/NPS_1175158

http://designingforperformance.com/performance-is-ux/

http://store.saveyourworld.com/Preserve-The-Rainforest-s/34.htm

http://www.rainforestconservation.org/rainforest-primer/rainforest-primer-table-of-contents/k-rainforest-role-in-climate/

https://en.wikipedia.org/wiki/Bora_Bora

https://en.wikipedia.org/wiki/List_of_Caribbean_islands_by_area

https://en.wikipedia.org/wiki/Crooked_Island,_Bahamas

http://gizmodo.com/5517041/googles-insane-number-of-servers-visualized

http://www.co2logic.com/home.aspx/nl/news/bomen+planten+klimaat+CO2+compensatie.html