WebGL Image Processing

Carlos Eduardo González Alvarez
Apr 27, 2018

Image processing… sounds like a complicated term, right? Well, in WebGL it is fairly simple to build a working example of it. Let's see how EvanW did it in his example:

Working Example

We are going to use this WebGL Filter implemented by EvanW as our visualization example. Unfortunately, the source code is minified and uglified, so it is hard to follow at a glance; we will take our chances and work out what is going on without nicely formatted code.

In the example's source code, we see that he just sets up a UI for the image filters and calls a few functions from a library. He doesn't handle any WebGL code directly.

Now… that's pretty much all of the code. He created a nice-looking UI that calls the function associated with each effect, and defined a fragment shader for every effect. He then put every effect, together with its fragment shader code, into a library that we will look at later. But first we have to understand… how did he do it?

We are now going to explain the logic behind his magic.

Drawing images

In order to do image processing, we need to draw an image first. To do this in WebGL we need to use textures (thank God we already explained this topic here). We explained in the textures example that WebGL expects texture coordinates when reading a texture, and that texture coordinates go from 0.0 to 1.0 no matter the dimensions of the texture. Like we did in the textures example, we need to add an attribute to pass in texture coordinates and then forward them to the fragment shader.

This is an example of how you could define the texture coordinates and use them to look up the color output by the fragment shader. Finally, we need to add an image, wait for it to load, and render it.
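A minimal sketch of those shaders, assuming illustrative names like a_texCoord, v_texCoord and u_image, could look like this:

```glsl
// Vertex shader: receive texture coordinates as an attribute
// and pass them on to the fragment shader through a varying.
attribute vec2 a_position;
attribute vec2 a_texCoord;   // 0.0 to 1.0 coordinates for each vertex
varying vec2 v_texCoord;

void main() {
  gl_Position = vec4(a_position, 0.0, 1.0);
  v_texCoord = a_texCoord;
}
```

```glsl
// Fragment shader: look up the texture color at the interpolated
// coordinate and output it unchanged.
precision mediump float;
uniform sampler2D u_image;
varying vec2 v_texCoord;

void main() {
  gl_FragColor = texture2D(u_image, v_texCoord);
}
```

On the JavaScript side, the image has to finish loading before it can be uploaded as a texture and drawn (again, just a sketch; the path and the render function are illustrative):

```js
var image = new Image();
image.src = "image.jpg"; // illustrative path
image.onload = function () {
  var texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // Clamp and use nearest filtering so any image size works.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  render(); // issue the draw call once the texture is ready
};
```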

Modification of pixels

What if we want to do image processing that actually looks at other pixels? Since WebGL references textures with texture coordinates that go from 0.0 to 1.0, we can calculate how far to move for one pixel with the simple math onePixel = 1.0 / textureSize.

So, for example, if we wanted to blur the image we could take the average of the surrounding pixels and output that average as the color of the current fragment.
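A minimal fragment shader sketch of that idea, averaging a pixel with its left and right neighbours (u_textureSize is an assumed uniform name we have to fill in from JavaScript):

```glsl
precision mediump float;
uniform sampler2D u_image;
uniform vec2 u_textureSize;   // texture dimensions in pixels, set from JavaScript
varying vec2 v_texCoord;

void main() {
  // One pixel expressed in texture-coordinate space (0.0 to 1.0).
  vec2 onePixel = vec2(1.0, 1.0) / u_textureSize;

  // Average the current pixel with its left and right neighbours
  // to get a very simple horizontal blur.
  gl_FragColor = (
      texture2D(u_image, v_texCoord) +
      texture2D(u_image, v_texCoord + vec2(onePixel.x, 0.0)) +
      texture2D(u_image, v_texCoord - vec2(onePixel.x, 0.0))) / 3.0;
}
```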

Finally, we need to pass the size of the texture in from JavaScript.
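Assuming the uniform is called u_textureSize as above, that could look something like this:

```js
// Look up the uniform once, after linking the program...
var textureSizeLocation = gl.getUniformLocation(program, "u_textureSize");

// ...and set it before drawing, using the image's dimensions in pixels.
gl.useProgram(program);
gl.uniform2f(textureSizeLocation, image.width, image.height);
```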

Handling multiple effects

In order to set different effects in the image, we need to be able to alter the pixels around the current pixel with some sort of parameter. We know how to reference other pixels, so let’s use a convolution kernel to do a bunch of common image processing. In this case we’ll use a 3x3 kernel.

A convolution kernel is just a 3x3 matrix where each entry represents how much to multiply the corresponding pixel by: the pixel we are rendering and the 8 pixels around it.

We then divide the result by the weight of the kernel (the sum of all values in the kernel) or 1.0, whichever is greater.
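Putting those two rules together, a convolution fragment shader could look roughly like this (u_kernel and u_kernelWeight are assumed uniform names we supply from JavaScript):

```glsl
precision mediump float;
uniform sampler2D u_image;
uniform vec2 u_textureSize;
uniform float u_kernel[9];      // the 3x3 kernel, row by row
uniform float u_kernelWeight;   // sum of the kernel values, or 1.0
varying vec2 v_texCoord;

void main() {
  vec2 onePixel = vec2(1.0, 1.0) / u_textureSize;

  // Multiply every pixel in the 3x3 neighbourhood by the matching kernel entry.
  vec4 colorSum =
      texture2D(u_image, v_texCoord + onePixel * vec2(-1, -1)) * u_kernel[0] +
      texture2D(u_image, v_texCoord + onePixel * vec2( 0, -1)) * u_kernel[1] +
      texture2D(u_image, v_texCoord + onePixel * vec2( 1, -1)) * u_kernel[2] +
      texture2D(u_image, v_texCoord + onePixel * vec2(-1,  0)) * u_kernel[3] +
      texture2D(u_image, v_texCoord + onePixel * vec2( 0,  0)) * u_kernel[4] +
      texture2D(u_image, v_texCoord + onePixel * vec2( 1,  0)) * u_kernel[5] +
      texture2D(u_image, v_texCoord + onePixel * vec2(-1,  1)) * u_kernel[6] +
      texture2D(u_image, v_texCoord + onePixel * vec2( 0,  1)) * u_kernel[7] +
      texture2D(u_image, v_texCoord + onePixel * vec2( 1,  1)) * u_kernel[8];

  // Divide by the kernel weight and keep the result fully opaque.
  gl_FragColor = vec4((colorSum / u_kernelWeight).rgb, 1.0);
}
```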

In JavaScript we need to supply a convolution kernel and its weight:
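For instance, a sketch using an edge-detection kernel (the kernel values and variable names are illustrative):

```js
// A 3x3 edge-detection kernel as a flat array, row by row.
var edgeDetectKernel = [
  -1, -1, -1,
  -1,  8, -1,
  -1, -1, -1
];

// The weight is the sum of all kernel values, or 1.0, whichever is greater.
function computeKernelWeight(kernel) {
  var sum = kernel.reduce(function (prev, curr) { return prev + curr; }, 0);
  return Math.max(sum, 1.0);
}

var kernelLocation = gl.getUniformLocation(program, "u_kernel[0]");
var kernelWeightLocation = gl.getUniformLocation(program, "u_kernelWeight");

gl.uniform1fv(kernelLocation, edgeDetectKernel);
gl.uniform1f(kernelWeightLocation, computeKernelWeight(edgeDetectKernel));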

For a clearer understanding of the convolution kernel you can see this explanation:

The GLFX Library

Now that we understand the logic behind the working example, we can look at the library that EvanW wrote to back his nice-looking UI. It handles pretty much all of the WebGL logic for us (the canvas, buffers, textures, and so on), so we just have to apply the filter we want to our image.

So… how did he do it? Well, apart from the primitives for texture, canvas and buffer creation, he simply gave each filter its own function and applied the corresponding WebGL fragment shader according to the selected filter and its parameters. Let's see it with the triangleBlur example:
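Going by the glfx.js documentation, using it looks roughly like this: the library creates its own canvas, wraps the image in a texture, and exposes each filter as a chainable call (the blur radius here is just an example value):

```js
window.onload = function () {
  var image = document.getElementById("image");
  var canvas = fx.canvas();              // throws if WebGL is not supported

  // Upload the image into a texture the library manages for us.
  var texture = canvas.texture(image);

  // Apply the filter and push the result onto the canvas.
  canvas.draw(texture).triangleBlur(20).update();

  // Swap the original image for the WebGL canvas.
  image.parentNode.insertBefore(canvas, image);
  image.parentNode.removeChild(image);
};
```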

How about multiple filters at the same time?

Unfortunately, the example doesn't provide this, but it should be fairly simple if you take the following flow into consideration:

A flexible way to do it is to use two more textures and render to each texture in turn, ping-ponging back and forth and applying the next effect each time.
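A rough sketch of that ping-pong loop, where createFramebufferWithTexture and drawWithEffect are hypothetical helpers standing in for the usual setup and draw code:

```js
// Two extra framebuffers, each with a texture attached.
var framebuffers = [createFramebufferWithTexture(), createFramebufferWithTexture()];
var effectsToApply = ["blur", "sharpen", "edgeDetect"]; // hypothetical effect names

// Start from the original image texture.
gl.bindTexture(gl.TEXTURE_2D, originalImageTexture);

effectsToApply.forEach(function (effect, i) {
  // Render with this effect into one of the two framebuffers...
  var target = framebuffers[i % 2];
  gl.bindFramebuffer(gl.FRAMEBUFFER, target.framebuffer);
  drawWithEffect(effect);

  // ...then use its texture as the input for the next effect.
  gl.bindTexture(gl.TEXTURE_2D, target.texture);
});

// Finally, draw the last result to the canvas.
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
drawWithEffect("normal");
```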
