Getting creative in the Pixel Shader World.

Antoine Fortin
9 min read · May 26, 2023


Note: Lately, I have been diving into a lot of P5.js and into how creative coding helps you understand math by seeing it in action. Yet there is something a bit darker out there named shaders! These pieces of software are wild animals, ruling under their own set of rules and workarounds. This will not be a tutorial about shaders, but a collection of ideas for 2D creative work inside a pixel shader. I want to point out that pixel shaders can also be used for 3D effects (think of the raymarching algorithm), but here we take a step back: a basic canvas, two triangles, UVs, and how to be creative with those.

Shadertoy and the why of it:

We will use Shadertoy for its ease of use! I tend to find that a lot of graphics programming tutorials rely on heavy setup just to explain how to do x or y. The reason I use Shadertoy is that we get, out of the box, a fragment shader ready to be painted with code and math: two triangles covering the screen, useful variables passed from the CPU, and a lean editor.

Shadertoy runs in the browser, so there is nothing to install. The techniques still transfer to engines such as Unity and Unreal. Shadertoy also makes sampling textures quite simple: you assign an image to a channel from the get-go by simply selecting it.
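For reference, here are a few of the variables Shadertoy passes in for us (a partial list written from memory; the editor shows the full set):

// Some of the uniforms Shadertoy provides out of the box:
// uniform vec3  iResolution;  // viewport resolution in pixels (z = pixel aspect ratio)
// uniform float iTime;        // playback time in seconds
// uniform float iTimeDelta;   // render time of the last frame in seconds
// uniform int   iFrame;       // current frame number
// uniform vec4  iMouse;       // xy = current drag position, zw = click position
// uniform vec4  iDate;        // year, month, day, seconds
// iChannel0..iChannel3        // the textures/buffers you assign in the channel slots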

Step 1: UV

When creating a new shader, there are a few assumptions you can make (carefully): most of the time, your working area will be 0–1 on both the x and y axes.

You can visualize uv.x:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord/iResolution.xy;
    vec3 col = vec3(uv.x);
    fragColor = vec4(col,1.0);
}

You can visualize uv.y:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord/iResolution.xy;
    vec3 col = vec3(uv.y);
    fragColor = vec4(col,1.0);
}

And combining the x and y axes gives us this:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord/iResolution.xy;
    vec3 col = vec3(uv.y + uv.x);
    fragColor = vec4(col,1.0);
}
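If you want to see both axes at once, a little extra of mine (not strictly needed for the rest): put uv.x in the red channel and uv.y in the green channel, the classic UV debug view.

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    // Normalize pixel coordinates to the 0-1 range
    vec2 uv = fragCoord/iResolution.xy;
    // Red grows left to right, green grows bottom to top
    vec3 col = vec3(uv.x, uv.y, 0.0);
    fragColor = vec4(col,1.0);
}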

Step 2: Texture sampling

Just a quick review of how to query data in shaders. You could also animate your vertices in the vertex shader, but we will not go into that here. Instead, let's just query a simple texture and apply it to the surface we are working on.

When we query a texture, we use the UV as the lookup coordinate.

Code for this shader:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord/iResolution.xy;
    // Sample the texture bound to iChannel0 at this uv
    vec3 textureRGB = texture(iChannel0, uv).rgb;
    fragColor = vec4(textureRGB,1.0);
}

You can assign anything you want to iChannel0. But let's get back to the UV: in the code above, the UV goes from 0 to 1 as it samples the texture, making it fill the whole canvas. What if we scaled the UV up?

Thing is: we can fuck around with the UVs a lot when querying a texture, or any procedural buffer defined inside our fragment program.

Let’s just scale the UV by 12

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord/iResolution.xy;
    // Scale the uv so the lookup runs past 1.0 and the texture repeats across the canvas
    vec3 textureRGB = texture(iChannel0, uv*12.).rgb;
    fragColor = vec4(textureRGB,1.0);
}

We can play with texture sampling: offset the UVs, rotate them, project them, and query them to get different results (see the rotation sketch below). But what we really want is a more procedural 2D approach.
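As a quick sketch of the "rotate them" idea (my own addition, with rot2 being a helper name I'm introducing here), we can spin the UVs around the center of the canvas before sampling:

// Hypothetical helper: a 2D rotation matrix for an angle in radians
mat2 rot2(float a)
{
    float c = cos(a);
    float s = sin(a);
    return mat2(c, -s, s, c);
}

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord/iResolution.xy;
    // Rotate around the center (0.5, 0.5) instead of the bottom-left corner
    vec2 rotatedUV = rot2(iTime) * (uv - 0.5) + 0.5;
    vec3 textureRGB = texture(iChannel0, rotatedUV).rgb;
    fragColor = vec4(textureRGB,1.0);
}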

Step 3: Getting creative with Polar.

At this point, you might ask: but how do we use this?

Sharp question. One answer is to remap the UVs into a polar coordinate sampler.

Let's first recenter the uv around the origin (roughly -0.5 to 0.5 on the y axis, with x corrected for aspect ratio):

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    //vec2 uv = fragCoord/iResolution.xy;
    // Recenter: (0,0) is now the middle of the screen
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    vec3 textureRGB = texture(iChannel0, uv*12.).rgb;
    fragColor = vec4(textureRGB,1.0);
}

Then we can reason about it in 2D, with the origin in the middle.

Let's now remap the UV into polar coordinates and query the texture with those:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    //vec2 uv = fragCoord/iResolution.xy;
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    // Polar coordinates: x = angle around the center, y = distance from the center
    vec2 polarUV = vec2(atan(uv.x, uv.y), length(uv));
    vec3 textureRGB = texture(iChannel0, polarUV).rgb;
    fragColor = vec4(textureRGB,1.0);
}

Everything now revolves around (0,0) as the center. We can carry on and query another texture the same way.

Already, we have a more creative image than the simple UV mapping.
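One variation I like on top of this (again, an extra of mine, and it assumes the channel's wrap mode is set to repeat): scale the angle before sampling so the texture repeats around the center, kaleidoscope style.

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    // Multiply the angle so the texture wraps around the center several times
    float angle  = atan(uv.x, uv.y) * 6.0;
    float radius = length(uv);
    vec3 textureRGB = texture(iChannel0, vec2(angle, radius)).rgb;
    fragColor = vec4(textureRGB,1.0);
}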

Step 4: Looping with time.

In Shadertoy, we get time through the iTime variable, a linear value that keeps incrementing as time goes on. One could blindly think this is all we need: we have a linear time variable, so we just plug it into functions to make things “animated”. The thing is, in the shader world we really love to stay between 0 and 1, where everything composes nicely.

Using time as a function argument:

I would like to introduce the concept of time as a function, or simply, how we will use time to create patterns for animation. We have iTime, representing the seconds since our shader started, but for more power it is crucial to transform time rather than use its raw linear value.

Imagine plotting two curves: time itself (blue), and time multiplied by a number smaller than 1 (orange). Both are linear, but the orange one grows more slowly because we reduced how fast it increments.

In our shader, we can write it like this:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    float time = iTime;             // 1 second = full white
    float timeSlower = iTime * .03; // slower version, used in the next example
    fragColor = vec4(vec3(time),1.0);
}

If I use time directly as the color, the canvas reaches full white after 1 second.

Then if we use timeSlower as the pixel color instead, the fade to white is much slower.

So, a lot of words just to express time, but I think it was worth writing that time is simply a value we can manipulate. In the two previous examples, I used time as a linearly growing value, and I also altered how fast it grows by multiplying it by a small number.

The idea of time in shader programming is sometimes a bit weird, as my examples were. But let's just recap: time comes in as a linear value, and we can alter how it grows by manipulating it.

Manipulating time with a function

Let’s assume that we want a shader that goes from black to white, and when it reaches white it goes back to black.

There is a very useful function in GLSL that allows us to do this, called fract(). We pass it a number and it returns only the fractional part, defined as fract(x) = x - floor(x); for positive numbers, that is simply what comes after the decimal point.

fract(1.05)      // -> returns .05
fract(12344.15)  // -> returns .15
fract(9945.55)   // -> returns .55
fract(-44.56585) // -> returns .43415 (careful: fract(x) = x - floor(x), so negatives wrap upward)

Then if we feed time through it, time keeps growing but always wraps back into the 0 to 1 domain. Hacky, but it works.

float alterTime(float time)
{
    return fract(time);
}

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    float tt = alterTime(iTime);

    fragColor = vec4(vec3(tt),1.0);
}

You can reduce the speed at which the canvas fades back to white by passing a factor lower than 1 into the function. Let's say we use a factor of 0.5, which makes the loop twice as slow.

float alterTime(float time, float factortime)
{
    return fract(time * factortime);
}

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    float tt = alterTime(iTime, 0.5);

    fragColor = vec4(vec3(tt),1.0);
}
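fract() snaps back to 0 at every wrap, which reads as a visible jump. If you prefer a loop that glides back down instead of jumping, a ping-pong built on abs(sin()) works too; pingPongTime is just a name I'm making up here, and we will lean on the same abs(sin()) trick in the next step.

// Ping-pong variant: glides from 0 up to 1 and back down, no hard jump
float pingPongTime(float time, float factortime)
{
    return abs(sin(time * factortime));
}

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    float tt = pingPongTime(iTime, 0.5);
    fragColor = vec4(vec3(tt),1.0);
}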

Step 5: Creative with the tools

In the last part, we discussed time and how we can use it to create patterns using math. Let's now bring it into our shader code.

We can offset the length of the uv over time, keeping the offset in the 0 to 1 range with fract().

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    //vec2 uv = fragCoord/iResolution.xy;
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    vec2 polarUV = vec2(atan(uv.x, uv.y), length(uv + fract(iTime)));
    vec3 textureRGB = texture(iChannel0, polarUV).rgb;
    fragColor = vec4(textureRGB,1.0);
}

We offset the uv by the fractional part of iTime before taking its length.

vec2 polarUV = vec2(atan(uv.x, uv.y), length(uv + fract(iTime)));

At this point, we can get creative with sampling the texture. It's wild how much a simple + or * can change the flow.
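To make that concrete, here is a small variation of mine on the code above with the + swapped for a *, so the radius is scaled by time instead of offset by it:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    // Same polar lookup, but the radius is now scaled by time instead of offset
    vec2 polarUV = vec2(atan(uv.x, uv.y), length(uv * fract(iTime)));
    vec3 textureRGB = texture(iChannel0, polarUV).rgb;
    fragColor = vec4(textureRGB,1.0);
}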

I love to dive into reverse proceduralism, meaning I remove instead of adding: build a layer, then build on top of that layer.

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    //vec2 uv = fragCoord/iResolution.xy;
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    vec2 polarUV = vec2(atan(uv.x, uv.y), length(uv + fract(iTime)));
    vec3 textureRGB = texture(iChannel0, polarUV).rgb;

    vec3 outstuff = textureRGB;
    fragColor = vec4(outstuff,1.0);
}

We do not have flawless looping yet, so we are going to turn this texture sampling to our advantage by simply removing the RGB values we do not need or want.

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    //vec2 uv = fragCoord/iResolution.xy;
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    vec2 polarUV = vec2(atan(uv.x, uv.y), length(uv + fract(iTime)));
    vec3 textureRGB = texture(iChannel0, polarUV).rgb;

    vec3 outstuff = textureRGB;
    // Subtract the sample from a base tint to strip out the colors we do not want
    fragColor = vec4(vec3(1.0, .5, .12) - outstuff,1.0);
}

What I did here is simply subtract the sampled texture from a base tint, removing the colors we do not want, and already we have a more loop-ready material.

Let's now use these ideas to query a texture with something more interesting than a simple lookup.

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    vec2 polarUV = vec2(atan(uv.x, uv.y), length(uv * abs(sin(iTime))));
    // Note: dividing by abs(cos(iTime)) blows up as cos crosses zero, which adds to the chaos
    vec2 polarUV2 = vec2(atan(uv.x, uv.y), length(uv / abs(cos(iTime))));

    vec3 textureRGB = texture(iChannel0, polarUV).rgb;
    vec3 textureRGB2 = texture(iChannel0, polarUV2).rgb;
    vec3 outstuff = textureRGB + textureRGB2;
    fragColor = vec4(vec3(1.0, .5, .12) - outstuff,1.0);
}

The thing to look at is the texture sampling and how we fucked around with the UVs. We add those two samples together.

vec3 textureRGB = texture(iChannel0, polarUV).rgb;
vec3 textureRGB2 = texture(iChannel0, polarUV2).rgb;
vec3 outstuff = textureRGB + textureRGB2;

Then we can strip it back down to the root texture by commenting out the second sample:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    vec2 polarUV = vec2(atan(uv.x, uv.y), length(uv * abs(sin(iTime))));
    vec2 polarUV2 = vec2(atan(uv.x, uv.y), length(uv / abs(cos(iTime))));

    vec3 textureRGB = texture(iChannel0, polarUV).rgb;
    // vec3 textureRGB2 = texture(iChannel0, polarUV2).rgb;
    vec3 outstuff = textureRGB /*+ textureRGB2*/;
    fragColor = vec4(vec3(1.0, .5, .12) - outstuff,1.0);
}

I commented out the second layer.

Layering is now the journey. Let's just play with the layers. From the previous code, we now have a sandbox to explore:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = (fragCoord - .5 * iResolution.xy) / iResolution.y;
    vec2 polarUV = vec2(atan(uv.x, uv.y), length(uv * abs(sin(iTime))));
    vec2 polarUV2 = vec2(atan(uv.x, uv.y), length(uv / abs(cos(iTime))));

    vec3 textureRGB = texture(iChannel0, polarUV).rgb;
    vec3 textureRGB2 = texture(iChannel0, polarUV2).rgb;
    vec3 outstuff = textureRGB + textureRGB2;
    fragColor = vec4(vec3(1.0, .5, .12) - outstuff,1.0);
}

Here is the 2D layer of what we have been writing.

Layers and colors.
