Image resizing made simple (and free)

Stephen Mckellar
Published in MPB Tech · Nov 22, 2023

[Image: four identical photos of a Sony camera, each at a different size, against a dark blue background. Image: MPB]

For as long as we’ve had images on the web, we’ve been dealing with the problem of finite infrastructure and infinite demand. Image files are generally large, and without optimisation can become the digital equivalent of fatbergs.

As a result we've built up a battery of image compression algorithms, machine learning to help us apply them, responsive images in HTML and ever-faster pipes to deliver optimally crunched pixels to the end-user's screen.

It’s fair to say that creating an automated system to serve up right-sized web graphics has never been more possible. So why wouldn’t you just go ahead and do it?

To which the answer is: because maybe you can do it for free instead, without even using much developer time.

What follows, then, is not ‘how to knit your own cropping algorithm’; instead we’ll be creating a Minimum Viable Product using the minimum possible code.

Is image size still important?

It’s fair to ask why, in an age of 5G and FTTP connections, anyone still needs to shrink image files.

But even if you discount those without access to the latest gadgets and ultrafast networking — and of course you shouldn’t — image download frustration can still affect us all.

Take product listings pages, for instance. At MPB we buy, sell and trade used photo and video gear. A key part of our offer is to let potential buyers view high-resolution imagery of the individual item they are considering.

No problem on a page devoted to a single product. There are plenty of ways to provide an uninterrupted user experience with a high-res gallery. But on a listings page containing hundreds of high-res images rendered as thumbnails, things can start to become distinctly stodgy.

Users are left waiting for images to load, while we require extra bandwidth (and use more energy as a result). It’s not as if all that fine detail is even visible at postage-stamp size.

Of course, the answer is not to get into that situation — after all, responsive images have been around a long time now. We can use the srcset attribute and the browser will load only the right-sized file for the device.
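
As a rough sketch of what that looks like in markup (the pre-cut file names here are invented for illustration, not real MPB assets):

```tsx
// A minimal sketch of a responsive product thumbnail. The pre-cut file
// names below are hypothetical stand-ins.
import * as React from 'react';

// Each srcSet entry pairs a file with its intrinsic width; `sizes` tells the
// browser how wide the image will actually render, so it can pick the
// smallest file that still looks sharp at that size.
export function ProductThumb() {
  const variants = [
    '/assets/canon-5d-400.jpg 400w',
    '/assets/canon-5d-800.jpg 800w',
    '/assets/canon-5d-1600.jpg 1600w',
  ];

  return (
    <img
      src="/assets/canon-5d-800.jpg"
      srcSet={variants.join(', ')}
      sizes="(max-width: 600px) 45vw, 200px"
      alt="Canon 5D product photo"
    />
  );
}
```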

But to do that we must cut, upload and store each image multiple times. Doing so manually is a drain on human resources. The process needs automation.

So this is where we gather a Scrum team and work out how best to deliver a bells-and-whistles tailor-made image resizing service … isn’t it?

Cutting in the middleman

Yes, it is. But also no, it isn’t.

Building in-house is ultimately going to provide an optimal, cost-effective solution, but of course it’s also perfectly viable to outsource the problem. If you’re prepared to pay the price, there are services available to do the heavy lifting for you.

Cloudflare, the CDN that delivers our pages, has just such a solution. It will resize images on the fly, then cache them for next time.

We could just use Cloudflare and let our engineers work on other priorities, while customers enjoy not waiting for graphics to load.

After all, implementation is a breeze. For each image variant we’d just need to add a prefix to the file path:

Before
/assets/canon-5d.jpg

After
/resize-service/width=100/assets/canon-5d.jpg

We could combine this with Next.js's next/image component to generate all the srcset variants we might need.
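
One low-code way to wire that up is a custom loader, which next/image calls for every width it wants. A minimal sketch, assuming the /resize-service/ prefix shown above and Cloudflare-style comma-separated options; your exact route and option names may differ:

```ts
// image-loader.ts — a sketch of a custom next/image loader that rewrites
// asset paths through the resizing prefix. '/resize-service' mirrors the
// path used in this article; your Cloudflare setup may expose another route.
interface LoaderParams {
  src: string;
  width: number;
  quality?: number;
}

export default function resizeLoader({ src, width, quality }: LoaderParams): string {
  const options = [`width=${width}`, `quality=${quality ?? 75}`].join(',');
  return `/resize-service/${options}${src}`;
}
```

Recent Next.js versions can register a loader like this globally via images.loaderFile in next.config.js; it can also be passed per-component through the Image loader prop.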

So that’s the first thing we tested. And it was darned near perfect. The prototype used 1/5th of the bandwidth and served pages 40 seconds faster over 3G.

The only problem was, when we crunched the numbers we were looking at additional Cloudflare fees of $1,000+ a month. Back to the drawing board, then.

Revise, retest, refine

Our enterprise account with Cloudflare includes a limited free usage allowance for its image-cutter. Could we find a way to stay within the free zone while maintaining the performance improvement?

We looked at the default server settings and found the cache time set to one hour. In other words, every image was being reloaded into the CDN every hour.

Changing cache time requires caution. If you later need to replace something, you have to wait for the timer to expire before you’ll see any changes on live. News or social media sites might find even a one-hour cache time unacceptable.

Fortunately, image files rarely change once published, and that was certainly true of our product images. We picked a still-very-conservative 24-hour cache time for retesting.
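
Whether you set the lifetime at the CDN or at the origin, the intent is the same. As a sketch of the origin-side version, assuming product images sit under /assets/ and a 24-hour lifetime is acceptable:

```js
// next.config.js — a sketch of sending a 24-hour Cache-Control header for
// product images. Cloudflare respects origin cache headers by default, but
// the edge cache TTL can equally be set in the Cloudflare dashboard.
module.exports = {
  async headers() {
    return [
      {
        source: '/assets/:path*', // illustrative path pattern
        headers: [
          { key: 'Cache-Control', value: 'public, max-age=86400' }, // 24 hours
        ],
      },
    ];
  },
};
```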

Trying time

We developed our Cloudflare resizing service behind a feature flag so we could turn it on and off without deploying any code. Initially we ran it on live for an hour to prove the system worked as expected, and to gather information about real-world traffic and performance.
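
As a sketch of what that gate might look like (the getFlag helper is a hypothetical stand-in for whatever flag service you use, and the fallback simply returns the original asset URL):

```ts
// A sketch of switching the resizing loader on and off behind a runtime
// feature flag, reusing the loader sketched earlier. `getFlag` is a
// stand-in for a real flag provider's client, not part of the article.
import resizeLoader from './image-loader';

type Loader = (params: { src: string; width: number; quality?: number }) => string;

// Stand-in flag lookup; in practice this would call your flag service.
function getFlag(name: string): boolean {
  return name === 'cloudflare-image-resizing';
}

// When the flag is off, serve the untouched original, so rolling back needs
// no deploy.
export function pickLoader(): Loader {
  return getFlag('cloudflare-image-resizing') ? resizeLoader : ({ src }) => src;
}
```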

We were surprised by the results, and not in a good way. As expected, demand for images peaked as the cache “warmed up”, then declined sharply. But the cache hit ratio settled at around 50% — in other words we were still serving one live image for every cached one.

That wasn’t really viable. Maybe we should seriously consider building in-house?

But the traffic logs turned up an anomaly — a huge amount of bandwidth being used by a single client. On investigation it turned out to be a bot, one of our own outsourced analytics processes. It was bypassing the cache entirely and downloading all 16 variants of each image.

We decided to have another go. We increased cache time even further, reduced the number of image variations from 16 to two, and redirected the bot to bypass Cloudflare.
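
When next/image generates the srcset, trimming the variants is a configuration change. A sketch with illustrative widths rather than the exact values we settled on:

```js
// next.config.js — a sketch of cutting the generated srcset down to two
// widths. The numbers are illustrative; choose breakpoints that match your
// own design. The smaller imageSizes list can be trimmed in the same way.
module.exports = {
  images: {
    deviceSizes: [640, 1920], // only two variants will ever be requested
  },
};
```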

A further test on live and suddenly we were in business. Now we were seeing a near-constant 97% of image requests returning a cached version. As we rolled out to all markets we saw similar results across the board, all well within our free image-resizing allowance.

Final thoughts

Depending on your circumstances, the moral of this story might be “don’t give up too soon”, or “sometimes it pays to circumvent the issue” or even “this is a neat way to leverage Cloudflare”.

For us at MPB the outcome has been out of proportion to the resources used. We’re using 95% less data for images — a reduction of 500TB a year — equating to 85% less data for the platform as a whole. We’re saving money because fewer requests hit our cloud-based image store. We’re even reducing our energy and carbon usage as a result.

Images are no longer a bottleneck for mobile users. And we’re so far inside our Cloudflare free tier that we can now look at optimising more pages, fine-tuning image quality and adjusting the number of variants.

Of course, we’re relying on a third-party service, and maybe we’ll still need to develop an in-house solution one day. We’ll also need to consider what happens if we need to add more image sizes in future designs.

But at the very least we have bought ourselves breathing space to tackle other projects. Meanwhile, time is money and we’re saving both.

Stephen Mckellar is a Senior Software Engineer at MPB, the largest global platform to buy, sell and trade used photo and video gear. www.mpb.com
