WebGL and Image Filter 101

David Guan
Aug 19, 2017

WebGL is the foundation of most 3D applications on the Web; beyond that, it can also be used to create image effects.

For inspiration, AWWWARDS has published an article about websites with excellent image effects.

The reason behind WebGL’s wide adoption is that it lets us use the GPU through a JavaScript API and GLSL.

Get Started

To simplify things, we can treat WebGL code as
Initial setup + Vertex Shader + Fragment Shader + Data -> Graphics

Because the GPU can’t understand JavaScript, we have to compile the shader code and link the compiled results into programs on the JavaScript side, then configure the GPU to run them. That’s what makes the code hard to follow (and the reason many people choose WebGL frameworks like three.js).
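The compile-and-link step looks roughly like this. This is a minimal sketch: the helper names compileShader and createProgram are my own, while the gl.* calls are the standard WebGL API.

```javascript
// Compile one shader (vertex or fragment) from its GLSL source string.
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader));
  }
  return shader;
}

// Link a vertex and a fragment shader into a program the GPU can run.
function createProgram(gl, vertexSource, fragmentSource) {
  const program = gl.createProgram();
  gl.attachShader(program, compileShader(gl, gl.VERTEX_SHADER, vertexSource));
  gl.attachShader(program, compileShader(gl, gl.FRAGMENT_SHADER, fragmentSource));
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    throw new Error(gl.getProgramInfoLog(program));
  }
  return program;
}
```

After linking, gl.useProgram(program) makes the program active for subsequent draw calls.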

Let’s start by drawing a square. (If you are new to WebGL, keep in mind that you don’t have to understand all the code; in this article we focus only on fragmentShaderSource.)
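The vertex-shader side can be as small as the sketch below: it forwards a 0–1 coordinate to the fragment shader as v_coord. The attribute name a_position (fed from a buffer holding the square’s corner positions) is my assumption; only v_coord matters for the fragment shaders that follow.

```javascript
// Vertex shader: positions arrive in clip space (-1..1); v_coord remaps
// them to 0..1 so the fragment shader gets normalized coordinates.
const vertexShaderSource = `
  attribute vec2 a_position;
  varying vec2 v_coord;

  void main() {
    v_coord = a_position * 0.5 + 0.5;
    gl_Position = vec4(a_position, 0, 1);
  }
`;
```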

gl_FragColor is a built-in variable available in the shader for specifying a fragment’s color (in RGBA format).

By using v_coord, a varying passed in from the vertex shader, we can implement a gradient effect.

const fragmentShaderSource = `
  precision mediump float;
  varying vec2 v_coord;

  void main() {
    gl_FragColor = vec4(v_coord.rg, 1, 1);
  }
`;

Or use length and mod to draw circles.

const fragmentShaderSource = `
  precision mediump float;
  varying vec2 v_coord;

  void main() {
    vec2 pos = mod(v_coord * 5.0, 1.0);
    float smoothResult = smoothstep(0.5, 0.46, length(pos - 0.5));
    gl_FragColor = vec4(pos.rg, 1, 1) * smoothResult;
  }
`;
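To see what smoothstep(0.5, 0.46, d) does here, the same math can be written in plain JavaScript (a sketch of the GLSL built-in). Because the first edge is larger than the second, the ramp is inverted: 1 well inside the circle, 0 well outside, with a thin smooth band in between. (Strictly, GLSL leaves reversed edges undefined, but the usual clamp-and-interpolate formula produces this inversion.)

```javascript
// Plain-JS version of GLSL's smoothstep: clamp, then Hermite interpolation.
function smoothstep(edge0, edge1, x) {
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0), 1);
  return t * t * (3 - 2 * t);
}

console.log(smoothstep(0.5, 0.46, 0.3)); // distance well inside the circle -> 1
console.log(smoothstep(0.5, 0.46, 0.7)); // distance well outside the circle -> 0
```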

We could keep experimenting endlessly, so let me stop at this point.
glslsandbox.com is a website where people upload animated graphics implemented with fragment shaders.

Image Filter

By binding image (texture) data to the GPU, we can sample the image’s colors in shader code to display it or to apply filters and effects.
Below is the code for displaying an image with WebGL:
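A minimal sketch of the texture-binding part: the helper name uploadTexture is my own, while the gl.* calls are the standard WebGL 1 texture API. The resulting texture is what the sampler2D uniform in the shaders below reads from.

```javascript
// Upload an image (e.g. an <img> element) to the GPU as a texture
// so the fragment shader can sample it through a sampler2D uniform.
function uploadTexture(gl, image) {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // Non-power-of-two images require CLAMP_TO_EDGE and non-mipmap filtering.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  return texture;
}
```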

So, to make the image look brighter, we can start by adding a constant value to each fragment’s RGB components:

const fragmentShaderSource = `
  precision mediump float;
  varying vec2 v_coord;
  uniform sampler2D u_texture;

  void main() {
    vec4 sampleColor = texture2D(u_texture, vec2(v_coord.x, 1.0 - v_coord.y));
    sampleColor.rgb += 0.1;
    gl_FragColor = sampleColor;
  }
`;

const fragmentShaderSource = `
  precision mediump float;
  varying vec2 v_coord;
  uniform sampler2D u_texture;

  void main() {
    vec4 sampleColor = texture2D(u_texture, vec2(v_coord.x, 1.0 - v_coord.y));
    sampleColor.rgb += 0.4;
    gl_FragColor = sampleColor;
  }
`;

Implementing brightness this way washes out the image’s contrast. Instead of rgb += 0.4, we can try rgb *= 1.4.

rgb *= 1.4
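The difference is easy to check with plain numbers (a sketch on two grayscale intensities): adding a constant shrinks the ratio between bright and dark pixels, which is what reads as lost contrast, while multiplying preserves the ratio exactly.

```javascript
// Two pixel intensities in the 0..1 range: one dark, one bright.
const dark = 0.2, bright = 0.8; // original bright/dark ratio: 4

// Additive brightness: the ratio collapses from 4 to 2.
const addRatio = (bright + 0.4) / (dark + 0.4); // 1.2 / 0.6 = 2
// Multiplicative brightness: the 1.4 factors cancel, ratio stays 4.
const mulRatio = (bright * 1.4) / (dark * 1.4);

console.log(addRatio, mulRatio);
```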

Kernel Filters

To apply a kernel filter, we have to sample the colors of the points adjacent to each fragment, so I added another uniform, imageResolution, to the shader to achieve that goal.

Below is the code for applying an edge-detection filter (jsfiddle link):

const fragmentShaderSource = `
  precision mediump float;
  varying vec2 v_coord;
  uniform vec2 imageResolution;
  uniform sampler2D u_texture;

  void main() {
    vec2 pos = vec2(v_coord.x, 1.0 - v_coord.y);
    vec2 onePixel = vec2(1, 1) / imageResolution;
    vec4 color = vec4(0);
    mat3 edgeDetectionKernel = mat3(
      -1, -1, -1,
      -1, 8, -1,
      -1, -1, -1
    );
    for (int i = 0; i < 3; i++) {
      for (int j = 0; j < 3; j++) {
        vec2 samplePos = pos + vec2(i - 1, j - 1) * onePixel;
        vec4 sampleColor = texture2D(u_texture, samplePos);
        sampleColor *= edgeDetectionKernel[i][j];
        color += sampleColor;
      }
    }
    color.a = 1.0;
    gl_FragColor = color;
  }
`;
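For intuition, the same convolution can be written on the CPU (a sketch over a single-channel image stored as rows of numbers): since the kernel’s entries sum to zero, flat regions produce 0 and only edges produce a nonzero response.

```javascript
// The same 3x3 edge-detection kernel as in the shader.
const kernel = [
  [-1, -1, -1],
  [-1,  8, -1],
  [-1, -1, -1],
];

// Convolve a grayscale image (array of rows) at pixel (x, y), borders ignored.
function convolve(image, x, y) {
  let sum = 0;
  for (let i = 0; i < 3; i++) {
    for (let j = 0; j < 3; j++) {
      sum += image[y + j - 1][x + i - 1] * kernel[i][j];
    }
  }
  return sum;
}

// A flat region: every pixel equal, so the kernel's response cancels to 0.
const flat = [
  [5, 5, 5],
  [5, 5, 5],
  [5, 5, 5],
];
console.log(convolve(flat, 1, 1)); // -> 0
```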

Thank you for reading this article!
