A better and leaner internet

Micael Nussbaumer
Published in ImageEngine · 9 min read · Apr 30, 2021

Content delivery and asset optimisation

Disclosure

When it comes to the internet, I've always had a certain aversion to bloated websites. I'm not sure if it's because I studied photography and worked for a couple of years as an image editor, nerding out on image optimisation, or because at fourteen or so I had to use a dial-up modem to connect to the then-new internet.

I still remember the buzzing and beeps the modem emitted while its front panel lit with coloured circles dancing to its noisy tune.

It had a whopping 56 kbit/s connection over a shared telephone line. Connecting to the internet meant nobody could use the phone at the same time, so you would yell from your room letting everybody in the house know it was internet time: the original form of multicast.

Some things haven't changed since then. You would open a browser, Netscape, and type in addresses to visit. There were search engines, but no Google yet. There were forums, messaging applications and (m)IRC for talking with people in real time.

Websites had to be small. People got creative about cramming the most functionality, styling and interesting content into the smallest possible payload so it would load in a reasonable time. Download managers were judged on whether they could resume interrupted downloads, given how regularly a connection would drop somewhere between your computer and wherever the file you wanted lived.

Other things have definitely changed. Today a country like Afghanistan has an average mobile internet speed of 2.9 Mbit/s. That modem, a top-of-the-line consumer model, managed 0.056 Mbit/s. Looking at fixed broadband, speeds range from a minimum of 30 Mbit/s up to 240 Mbit/s.

We can now download huge files almost instantaneously, and while that is good, it also masks extreme inefficiencies in the assets we access online.

To add to that, we no longer hear the continuous beeping of a modem reminding us it's connecting, or see the blinking lights telling us it's exchanging information with some other computer somewhere else. Instead we see an icon on our devices that, at most, shows the quality of our Wi-Fi signal.

It all fades into the background; the things we see on our device screens seem to materialise out of thin air. I guess that's why I was interested in writing about this topic. All of it concerns the web, and that's why I accepted when Jon Arne Sæterås, Vice President of Product at ImageEngine, asked me to write about ImageEngine. Even though this article is sponsored, the ideas in it are mine, and I try to apply them in my day-to-day work.

What is the size of a microsecond? Or a bit?

In Carmen Mitchell's dissertation, "The Contributions of Grace Murray Hopper to Computer Science and Computer Education", she tells how Hopper would bring 11.8-inch (roughly 30 cm) pieces of wire to her talks and hand them out to the audience. She used them to illustrate the maximum distance electricity can travel in a billionth of a second, as a way of explaining how computer circuits work. A week after that first talk, she added a 984-foot (300 m) length of wire, representing the distance electricity travels in a microsecond.

She's quoted as saying, "Here's a microsecond, 984 feet. I sometimes think we ought to hang one over every programmer's desk, or around their neck — so they know what they're throwing away when they throw away microseconds."[1] She also used to retell the story of an admiral asking her "why it took 'so damn long' to send a message via satellite", to which she pointed out that between the admiral and the satellite there were a great many nanoseconds.[2]

This is still true, and important. Nowadays we use, or should use, Content Delivery Networks (CDNs) for our websites and applications. By having networks of nodes that keep copies of our content spread geographically, we allow users to take the shortest path possible when accessing our content.

This applies to your website and to all the additional resources it requires, be it images, fonts, stylesheets, scripts or videos. It provides other benefits as well: for instance, your web server no longer has to serve these static assets, reducing the load it has to handle.

Besides distance, the other obvious aspect that impacts speed of delivery is the size of the assets being requested.

If you have two images and one is double the size of the other, then given the same distance and transmission channel, the larger one will take twice as long to transmit fully as the smaller one.
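To make that concrete, here's a back-of-the-envelope sketch (ignoring latency, protocol overhead and on-the-wire compression) using the mobile average mentioned earlier:

```typescript
// Back-of-the-envelope transfer time: payload size divided by bandwidth,
// ignoring latency, protocol overhead and on-the-wire compression.
function transferTimeSeconds(sizeKB: number, mbitPerSecond: number): number {
  const bits = sizeKB * 1024 * 8;            // payload in bits
  return bits / (mbitPerSecond * 1_000_000); // seconds spent on the wire
}

// Illustrative numbers: a 68 KB image vs. a 34 KB one at 2.9 Mbit/s.
console.log(transferTimeSeconds(68, 2.9).toFixed(2)); // "0.19"
console.log(transferTimeSeconds(34, 2.9).toFixed(2)); // "0.10"
```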

In most cases this is just as important as the geographical accessibility of the data, but a traditional CDN can't help you with it. ImageEngine combines a CDN with an asset optimisation pipeline, and it helps significantly on both fronts.

Image Optimisation

ImageEngine does this by automatically processing your assets once you define a distribution.

Let's say you host your website along with its assets somewhere; in S3 parlance, an Origin. With ImageEngine you create an Engine for this Origin (you can have many), and without any further input it produces optimised versions of your assets: regular GZIP/Brotli compression for ordinary assets, and state-of-the-art image compression for your image resources. It then creates a CDN distribution for these optimised assets that you can link to instead of the original, unoptimised ones.
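In practice, the only change on your side is pointing asset URLs at the Engine's delivery address instead of your Origin's host. As a minimal sketch, with both hostnames made up for illustration:

```typescript
// Illustrative only: both hostnames are made up. After you create an Engine
// for an Origin, ImageEngine gives you a delivery address; pointing your
// asset URLs at it is enough for them to be served optimised.
const ORIGIN_HOST = "assets.example.com";    // where the originals live (hypothetical)
const ENGINE_HOST = "example.cdn.imgeng.in"; // the Engine's delivery address (hypothetical)

function throughEngine(assetUrl: string): string {
  const url = new URL(assetUrl);
  if (url.host === ORIGIN_HOST) {
    url.host = ENGINE_HOST;
  }
  return url.toString();
}

console.log(throughEngine("https://assets.example.com/images/hero.jpg"));
// -> "https://example.cdn.imgeng.in/images/hero.jpg"
```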

Images make up the bulk of the data transferred on the internet, and they are ImageEngine's main focus. Compared to other similar solutions, the gains in compression are significant, without losing any perceptual quality. On top of that, it also serves next-gen formats such as JPEG 2000 and automatically creates dimension-aware variations of your original assets.

This is what the baseline functionality offers without any customisation, and it will cover most of the cases one could care about. The ImageEngine CDN is also smart enough to answer with the best format a browser or application can understand, like JPEG 2000 or WebP. And because ImageEngine generates variations based on the display size of your images, you can leverage this in your code to request exactly the minimally sized assets you need, using responsive images syntax, CSS media queries and even client hints.
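As a sketch of what that can look like in markup, here is a small helper that builds a srcset from width variants. The delivery host is made up, and the ?imgeng=/w_<width> resize directive is my reading of ImageEngine's URL syntax, so verify it against the current documentation:

```typescript
// Sketch of responsive-image markup built in TypeScript. The delivery host is
// hypothetical, and the "?imgeng=/w_<width>" resize directive should be
// checked against ImageEngine's current documentation.
const HOST = "https://example.cdn.imgeng.in";

function responsiveImg(path: string, widths: number[], sizes: string): string {
  const srcset = widths
    .map((w) => `${HOST}${path}?imgeng=/w_${w} ${w}w`)
    .join(", ");
  return `<img src="${HOST}${path}" srcset="${srcset}" sizes="${sizes}" alt="">`;
}

console.log(
  responsiveImg("/images/hero.jpg", [400, 800, 1200], "(max-width: 600px) 100vw, 50vw")
);
// The browser picks the smallest candidate that satisfies the layout, and the
// Engine answers in the best format it can negotiate (WebP, JPEG 2000, ...).
```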

Setting this up is as easy as signing up, clicking through a couple of screens and changing your application to point to your new distribution domain.

We can, nonetheless, improve our CDN distribution further using the Engine system provided by ImageEngine.

Improving your Distribution further

To make use of it, we create groups of Settings that are applied to an Engine. These Settings can cover the whole Origin, or only specific paths or file types. As an example, imagine you have two types of assets under two different folders. One holds high-resolution images that, for whatever reason, you want to keep at very high quality; these might be detailed views of a room or a product, where there is marketing value in displaying them in all their glory. The other holds similar images intended to be displayed only as thumbnails and previews, where you don't need full fidelity and instead want the fastest possible time to first meaningful paint.

Using two sets of Settings, you can create one for the path /images-hi-res, where you specify a strict quality/size ratio: the images are still optimised, still take advantage of the dimension variants and the available formats, but do so while respecting the quality constraints you set.

And another set for the path /images-regular, where you not only lower the acceptable quality aggressively but also apply a sharpening filter to counterbalance the loss of perceptual resolution. These images are again made available in varied dimensions, but with their sizes reduced significantly.
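The Settings themselves live in the ImageEngine dashboard, but the same intent can be sketched as per-path URL directives. The directive names below (cmpr_ for compression, s_ for sharpening) are my recollection of ImageEngine's syntax and the host is hypothetical, so treat this as an illustration rather than a reference:

```typescript
// Rough sketch of the two treatments expressed as ImageEngine-style URL
// directives. Directive names and the host are assumptions; in practice the
// same rules can live in the Engine's Settings so plain paths get them
// automatically.
const HOST = "https://example.cdn.imgeng.in";

function assetUrl(path: string): string {
  if (path.startsWith("/images-hi-res/")) {
    // Gentle compression: keep the "full glory" views close to the original.
    return `${HOST}${path}?imgeng=/cmpr_5`;
  }
  if (path.startsWith("/images-regular/")) {
    // Aggressive compression plus a sharpening pass to mask the quality loss.
    return `${HOST}${path}?imgeng=/cmpr_60/s_10`;
  }
  return `${HOST}${path}`; // everything else gets the Engine's defaults
}

console.log(assetUrl("/images-regular/thumb-01.jpg"));
// -> "https://example.cdn.imgeng.in/images-regular/thumb-01.jpg?imgeng=/cmpr_60/s_10"
```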

This gives you full control over how a given distribution is optimised. To drive the point home, I want to share my first experience with ImageEngine. I've studied photography and worked with image optimisation before, so I feel I'm not totally clueless when it comes to it.

A Real World Sample

When talking with Jon Arne about writing for ImageEngine, I said something like: "Yeah, I actually don't use these kinds of things because I know how to optimise my assets, I only need a CDN." Blah blah blah, hubris.

I saw the value of it, I believed it was great for most people and businesses, and I was still interested in writing about it, but honestly, I doubted my own manually prepared images would see a significant improvement in size. Look at these two 400px by 250px samples:

(© Ana Fernandes, Oak-Bark Druid, original illustrations)

Original: https://www.jottacloud.com/s/0103dc38c4994554c6ca91c05d1c3455a32

ImageEngine: https://www.jottacloud.com/s/010a18b368a5d3844638c5fcb0c5d7ca869

Can you see any difference between them?

I can't. But one is a 33 KB image and the other is a 68 KB image.

That's half the size. The originals were painstakingly resized, edited, optimised and exported using the latest Adobe Photoshop, and ImageEngine still cut them in half.

I didn't have to do anything other than add an Origin to an Engine. The bigger the dimensions of the original images, and the less optimised they were to begin with, the wider the savings gap became: 800px by 500px images went from around 500 KB (already exported at around 80% JPEG quality) to less than 100 KB. That's pretty significant.

If I apply specific Engine Settings with a more aggressive quality/size ratio, I can take these sizes down even further. ImageEngine also respects the Save-Data header: if a visitor to your website has that setting enabled, it will serve them a significantly smaller image without you having to do anything.
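Save-Data arrives at the Engine as a plain request header (Save-Data: on). If you want your own front-end code to adapt to the same preference, a rough sketch using the browser's Network Information API (support varies, so feature-detect) could look like this:

```typescript
// Save-Data reaches servers as the "Save-Data: on" request header. In the
// browser, the same preference is exposed (where supported) through the
// Network Information API; feature-detect before relying on it.
const connection = (navigator as any).connection;
const prefersReducedData: boolean = Boolean(connection && connection.saveData);

if (prefersReducedData) {
  // For example: skip preloading large hero images or video posters.
  console.log("User prefers reduced data usage");
}
```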

If you research the offerings in this space, you'll see that ImageEngine's prices are quite competitive, even more so when you account for the fact that it's better at what it does, that it includes a CDN service, and that the size savings effectively translate into a much higher bandwidth allowance (with the same bandwidth quota, content that is 30% smaller means 30% more included bandwidth). And that's before taking into account the dimension-aware scaling, which, when used properly, drives usage down even further. It's quite a good deal for what it does. I wasted more time optimising those images by hand than it would have cost me to simply pass them through ImageEngine, and the results were still worse.

I believe that if you care about your users' experience, if you want your website, store or platform to perform the best it can, and if you want the smallest possible footprint on the internet, you should give it a try.

When it comes to interactions with websites, everything else being equal, nothing retains users better than a snappy, fast-loading experience, and there's no excuse not to use the best solution to that problem when it's a few clicks away, even more so when that solution comes from people aligned with those objectives in many different ways. If you want to learn more, check out imageengine.io.

[1] Grace Hopper, "The Captain Is a Lady", interview by Morley Safer, 15 min., 60 Minutes, 1983.

[2] Henry S. Tropp, "Grace Hopper: The Youthful Teacher of Us All", ABACUS 2 (Fall 1984).
