Uploading and serving images from Redis with Node.js

Kyle
3 min readJan 11, 2016


This is part 18 of my Node / Redis series. The previous part was Dodge the busy work: Using lodash and async to write tighter Node.js / Redis apps.


With Redis, you think small: store what you need in memory and keep bulk data elsewhere. Clients often don’t have a sense of scale when it comes to data. To them, a phone with only 16 GB of storage is unthinkable, yet a web app that fits comfortably in 1 GB seems very strange. It’s all about what you’re storing: your iPhone likely isn’t holding large amounts of text, but rather large amounts of HD video and photos.

Recently, I had a project that required extremely quick uploads and downloads of images. Designing an app with uploads that can be deployed across multiple servers can be a problem, since many of the Node.js upload examples floating around the web save files to a folder on the server. To avoid having uploaded files scattered across multiple servers, I usually use a cloud storage service (S3) and have the Node process pipe the data to and from it. It works, but the extra latency of connecting to yet another server is a real drag on performance.

My wacky idea is to build a system that uploads images directly into Redis. In my case the images are small, under 400×400, but that still results in an image of 8–12 KB each. So, what do you need to do to make this happen?

Multipart, Binary and Buffer oh my!

Redis will gladly accept anything you throw at it, since strings are binary safe. On the retrieval end you’ll be dealing with binary data too, so you’ll want to make the client return buffers instead of strings. This likely also means you’ll need a couple of connections: one for buffer work and one for string work.

```javascript
const redis = require('redis');
const client = redis.createClient({ return_buffers: true });
```

Web file uploads are multipart, which means our standard req.body.file won’t cut it. For this we’ll use busboy, which works from stream events.
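The code embed from the original post didn’t survive this export, so here is a minimal reconstruction of the approach: it assumes the busboy 0.x ‘file’ event signature and a node_redis-style client, and the makeFileHandler name and image:&lt;filename&gt; key scheme are illustrative choices of mine, not from the post.

```javascript
// Returns a callback for busboy 0.x's 'file' event, whose signature is
// (fieldname, file, filename). `client` is a node_redis-style client;
// `done(err, key)` fires once the image has been written.
function makeFileHandler(client, done) {
  return function (fieldname, file, filename) {
    const chunks = [];
    // Accumulate the whole upload in memory...
    file.on('data', (chunk) => chunks.push(chunk));
    // ...then write it to Redis in a single SET.
    file.on('end', () => {
      const key = 'image:' + filename;
      client.set(key, Buffer.concat(chunks), (err) => done(err, key));
    });
  };
}

// Wiring it up inside a POST route would look roughly like:
//   const busboy = new Busboy({ headers: req.headers });
//   busboy.on('file', makeFileHandler(client, (err, key) => res.end(key)));
//   req.pipe(busboy);
```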

Now, this is cheating a bit: it loads the whole file into a variable and then saves it to Redis. With a huge file, you could run into problems. For my use case it’s fine, but I’m thinking you could use APPEND instead of concat in your on('data') callback to effectively stream the data in one chunk at a time.
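That APPEND idea could be sketched like this. Again illustrative: streamToRedis is a hypothetical helper name, `file` is assumed to be the busboy file stream, and `client` a node_redis-style client.

```javascript
// Streaming variant: instead of concatenating chunks in memory, APPEND
// each one to the key as it arrives. DEL first so a re-upload starts
// from an empty value rather than appending onto the old image.
function streamToRedis(client, file, key, done) {
  client.del(key, (err) => {
    if (err) { return done(err); }
    file.on('data', (chunk) => client.append(key, chunk));
    file.on('end', () => done(null, key));
  });
}
```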

Serving the image isn’t too complex either.
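The serving gist is also missing from this export. A minimal sketch, assuming the buffer-returning client from above and a hard-coded JPEG content type (a real server would store or sniff the MIME type alongside the image):

```javascript
// Fetch the whole image from Redis and send it as the response body.
// `client` must have been created with return_buffers: true so GET
// hands back a Buffer rather than a mangled string.
function serveImage(client, key, res) {
  client.get(key, (err, imageBuffer) => {
    if (err || !imageBuffer) {
      res.writeHead(404);
      return res.end('not found');
    }
    res.writeHead(200, { 'Content-Type': 'image/jpeg' });
    res.end(imageBuffer);
  });
}
```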

Again, we’re loading the file entirely into a variable and then serving it. If you’re going to be dealing with large files, you could likely use Redis pseudo-streaming to mitigate the problem.
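Pseudo-streaming here means walking the value with GETRANGE in fixed-size slices rather than one big GET. A sketch, with the chunk size left as an arbitrary parameter:

```javascript
// Pull the stored value down in slices with GETRANGE and write each
// slice to the response as it arrives. GETRANGE past the end of the
// value (or on a missing key) returns an empty value, which ends the walk.
function streamImage(client, key, res, chunkSize, offset) {
  offset = offset || 0;
  client.getrange(key, offset, offset + chunkSize - 1, (err, slice) => {
    if (err) { res.writeHead(500); return res.end(); }
    if (!slice || slice.length === 0) { return res.end(); }
    res.write(slice);
    streamImage(client, key, res, chunkSize, offset + chunkSize);
  });
}
```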

Next Steps

This could be the basis for a lot of fun Node / Redis shenanigans. Imagine using this as the first step in your upload process: you accept the file upload via Redis for performance, then lazily move it over to a slower blob storage. Or you could put a TTL on the keys and create an ephemeral image store. You could substitute hget / hset for get / set and create a time-stamped photo album. Maybe use this technique with gm to decentralize image processing. Or you can just build on this to create a lightning-fast image server. So many possibilities.
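The TTL idea, for instance, is nearly a one-liner: write with SETEX instead of SET and Redis deletes the key itself. A sketch; the helper name and the one-hour TTL in the usage note are my own choices.

```javascript
// Ephemeral image store: same write as before, but with an expiry so
// Redis removes the key on its own after ttlSeconds.
function saveEphemeral(client, key, imageBuffer, ttlSeconds, done) {
  client.setex(key, ttlSeconds, imageBuffer, done);
}

// e.g. saveEphemeral(client, 'image:temp.jpg', buf, 3600, cb);
```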

