Instagram filters in python

Travis Hoppe
Published in Analytics Vidhya
Sep 1, 2020

A weekend pytorch project to recreate the iconic filters from the app.

Willow. Perpetua. Lo-Fi. Cream. Toaster. Whimsical names designed to evoke emotion. Dramatic effects of desaturation and wild swings of hue. I’m talking about Instagram filters and their near-ubiquitous use on the app. It’s not just for show, either: using a filter correlates with more views. It’s no wonder the app puts them front and center before every post, encouraging a soft glow or a retro, faded look for engagement. Can we recreate the effects outside of the app? Yes! This article explores how I did that to create a python module named instafilter.

A rose by any other name

What does a filter look like? Let’s take a look at a few examples:

Clockwise from upper left: Untouched, Toaster, Willow, and Walden.

In the upper left we see the untouched, colorful rose; the other three panels show what separate filters do to it. Each one dramatically modifies the coloring, brightness, and saturation. One way we could learn a filter is through a 3D LUT (Look Up Table), that is, to explicitly write down how every RGB pixel transforms. While that would work, covering the full 8-bit space for each color channel requires 3*(2⁸)³ = 50,331,648 bytes (about 48 MB) to store the whole table for every filter! If we shrink the LUT by keeping only some of the mappings and interpolating between them, the resulting image often shows banding from the quantization. Instead, let’s learn a transform: a function that takes in a pixel’s color and maps it to another color. It pays off, too: the models instafilter uses are only about 9K bits each, roughly 300 32-bit numbers, or a little over a kilobyte! To get started, we need to build a training set and a model.
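To make that size comparison concrete, here is the back-of-the-envelope arithmetic as a quick python snippet (it assumes the LUT stores one 8-bit output value per channel):

# Every possible 8-bit RGB color needs an entry in the full 3D LUT,
# and each entry stores three 8-bit output channels.
n_colors = (2 ** 8) ** 3        # 16,777,216 input colors
lut_bytes = 3 * n_colors        # 50,331,648 bytes, about 48 MB

# The learned transform is roughly 300 32-bit numbers instead.
model_bytes = 300 * 32 / 8      # about 1.2 KB

print(f"LUT: {lut_bytes / 2**20:.0f} MB, model: {model_bytes / 1024:.1f} KB")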

Building a training set

To build the training set, I took the “Untouched” image as a reference photo. It comes from a project by József Fejes whose goal was to create an image with every possible RGB color represented. In theory we could have used a boring grid of pixels, but this colorful rose lets us visualize the results. I ran the high-resolution photo through each filter on Instagram and saved the result. The pairs aren’t perfect, since Instagram downsamples the image and saves it as a JPEG. In practice, though, the difference is small enough that the end result looks as good as the app.
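Concretely, the training data is just a table of pixel pairs: each pixel’s color in the reference photo and the color the filter turned it into. A minimal sketch of that pairing with OpenCV (the file names here are stand-ins, not the ones from the project):

import cv2
import numpy as np

# Stand-in file names: the reference photo and the same photo run through a filter
reference = cv2.imread("rose_untouched.jpg")
filtered = cv2.imread("rose_toaster.jpg")

# Both images must line up pixel-for-pixel; flatten them into rows of
# color values scaled to [0, 1].
X = reference.reshape(-1, 3).astype(np.float32) / 255
y = filtered.reshape(-1, 3).astype(np.float32) / 255

# Each row of (X, y) is one training example: original color -> filtered color
print(X.shape, y.shape)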

My first thought for the training set was to train a transform:

f(R, G, B) -> (R’, G’, B’)

that is, given the Red, Green, and Blue color channels, learn a function that maps them to the new RGB colors. This didn’t work as well as I wanted, so I had to help the network out. It could be trained with just RGB, but it had to be much larger than I wanted before it would converge. Instead I used

f(R, G, B, L, S) -> (R’, G’, B’)

where L and S are the lightness and saturation. These values are simple to calculate from RGB, but doing some of the work upfront helped the network tremendously. This is the art of machine learning: I could see that Instagram filters modify lightness and saturation, so I fed them in as input features. With that extra information, the network didn’t have to waste capacity learning those features itself.
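As a sketch of that feature engineering, lightness and saturation can be pulled from OpenCV’s HLS conversion. This mirrors the idea rather than quoting instafilter’s exact code, and the pixels are in OpenCV’s BGR order:

import cv2
import numpy as np

def add_lightness_saturation(pixels):
    # pixels: (N, 3) float32 array of BGR values in [0, 1]
    hls = cv2.cvtColor(pixels.reshape(1, -1, 3), cv2.COLOR_BGR2HLS)
    hls = hls.reshape(-1, 3)
    lightness = hls[:, 1:2]   # L channel, in [0, 1] for float input
    saturation = hls[:, 2:3]  # S channel, in [0, 1] for float input
    return np.hstack([pixels, lightness, saturation])

# One orange-ish pixel becomes a row of five input features
pixel = np.array([[0.2, 0.4, 0.9]], dtype=np.float32)
print(add_lightness_saturation(pixel))   # shape (1, 5)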

Training the network

The model itself is really simple: a 4-layer fully-connected neural network.
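The original listing isn’t reproduced here, but a minimal pytorch sketch of that shape could look like the following. The layer widths are illustrative, picked so the parameter count lands near the ~300 numbers mentioned earlier, and are not necessarily the widths instafilter ships with:

from torch import nn

# Four fully-connected layers: five input features (R, G, B, L, S) in,
# three color channels out. Widths are illustrative.
model = nn.Sequential(
    nn.Linear(5, 10), nn.ReLU(),
    nn.Linear(10, 10), nn.ReLU(),
    nn.Linear(10, 10), nn.ReLU(),
    nn.Linear(10, 3), nn.Sigmoid(),   # squash outputs into [0, 1]
)

print(sum(p.numel() for p in model.parameters()))   # a few hundred parameters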

This was my first time really using pytorch, and I enjoyed how natural it seemed. It was freeing to observe the variables and interact with them like regular python objects. If you’ve only ever used tensorflow, give it a try for your next project. To train the model, I borrowed ideas from fast.ai and used one-cycle training with a learning rate finder. If you’re interested, you can see the training code directly.
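The real training code lives in the repository; as a rough sketch of the one-cycle part, pytorch ships a OneCycleLR scheduler that steps once per batch. Everything below (the random stand-in data, batch size, learning rates, and the L1 loss) is a placeholder rather than the project’s actual settings, and it reuses the model sketched above:

import torch
from torch import nn, optim

# Stand-in training data: (N, 5) input features and (N, 3) target colors
X = torch.rand(10_000, 5)
y = torch.rand(10_000, 3)

loss_fn = nn.L1Loss()                         # mean absolute error
opt = optim.Adam(model.parameters(), lr=1e-3)

epochs, batch_size = 20, 256
steps_per_epoch = len(X) // batch_size
sched = optim.lr_scheduler.OneCycleLR(
    opt, max_lr=1e-2, epochs=epochs, steps_per_epoch=steps_per_epoch
)

for epoch in range(epochs):
    for i in range(steps_per_epoch):
        xb = X[i * batch_size:(i + 1) * batch_size]
        yb = y[i * batch_size:(i + 1) * batch_size]
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        opt.step()
        sched.step()                          # one-cycle schedule advances per batch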

Training took about 3 minutes per model, and I reached an average loss of about 0.2. It’s not perfect, mostly due to the imperfections of my training set, but it doesn’t need to be. As far as I can tell, the output is indistinguishable from the real filters.

Using instafilter

It’s easy to use the python module yourself! First install it with pip

pip install instafilter

and now you can use it in your own code like this:

from instafilter import Instafilter

model = Instafilter("Lo-fi")
new_image = model("myimage.jpg")

# To save the image, use cv2
import cv2
cv2.imwrite("modified_image.jpg", new_image)

I’ll close with a few more examples:

Filters from left to right: 1977, XPro-II, Lo-Fi
Original images sourced from Instagram (1, 2, 3).

That’s it! Let me know if you like this or have any comments on the twitter thread.
