Constructing a Pixel Art Shader

Goal: Construct a 2D Shader that can be used by Unity’s TrailRenderer

I’d recently been wondering how to create dynamic pixel texture effects, such as beams or trails done in a pixel style. Shortly afterwards, I saw a post on Twitter that demonstrated a really cool thing: a dynamically generated pixel trail behind a projectile!

Wow! When I asked, the author said it was all done with a Trail Renderer in Unity and a custom shader. This wasn’t the first time I’d heard of the dark magic of shaders, so I decided to try my hand at this technique, and I came up with the following result:

Final Trail Shader

Not bad! It fits all the criteria I was looking for: it snaps to the pixels in the scene, it has an outer and an inner trail, and its pixels gradually become transparent. So how does it work? Read on to find out my method for stumbling into something that looks kinda neat.

Basics of a Shader: Different Parts

So what exactly is a shader? Simply put, a shader is a program that alters an image by procedurally affecting the lighting, underlying color, or other factors to produce a desired visual effect. The name “shader” comes from one of their most popular uses: introducing gradual light and shadow to simulate light falling on a surface. Since Unity treats 2D objects as 3D surfaces under the hood, we’ll have to go a bit more in depth on how they’re handled in the engine. We’ll talk about two kinds of shaders: Vertex Shaders and Fragment (or Pixel) Shaders.

The Vertex Shader runs once for each vertex of the polygons we’re shading, and executes before the fragment shader. In the vertex shader we gather the information that will be sent to the Fragment Shader, which can include the position relative to the camera, how much fog is applied, and the location on the texture.

Our Fragment Shader will largely care about the UVs given to it. UVs are texture coordinates interpolated from the vertices they are defined on. The fragment shader does its calculations once (or more, for shaders with multiple passes) for each pixel.

For more explanation I highly recommend visiting Michal Piatek’s blog entry here:

Basic Color Shader

Alrighty, let’s get ourselves a basic one-color shader we can do some simple manipulation on. Here’s the code for this simple shader, piece by piece.

Shader "Tutorial/SingleColor"
{
    Properties
    {
        // Color property for material inspector, default to white
        _Color("Main Color", Color) = (1,1,1,1)
    }
    SubShader
    {
        Tags
        {
            "Queue" = "Transparent"
        }

“Properties” defines a color that we can change in the Inspector of whatever material we attach the shader to, and it will default to white. Internally we will refer to it as “_Color”. Setting the render queue to Transparent will help our Trail Renderer display on top of other objects in the scene. If you are using Sorting Layers, you will have to set the Trail Renderer’s Sorting Layer separately.

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

All of our logic will live within the Pass scope. First we declare the vertex and fragment shader functions, and include the library that lets us convert to camera coordinates, which we’ll use later.

The appdata struct is where we determine what information will be passed to the vertex shader. Here we define “vertex” with the semantic POSITION as the vertex position in object space, and “uv” with the semantic TEXCOORD0, which grabs us the UV. The v2f struct is what the vertex shader passes to the fragment shader, and we’ll be passing some similar stuff, though we’ll transform the object coordinates to camera coordinates with UnityObjectToClipPos(v.vertex).

            v2f vert(appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                return o;
            }

            // externally defined color
            fixed4 _Color;

            // pixel shader
            fixed4 frag(v2f i) : SV_Target
            {
                return _Color; // just return it
            }
            ENDCG
        }
    }
}

Whoo, that’s the last of it. Here we grab those camera coordinates, declare the color from the Inspector for use in the shader with “fixed4 _Color;”, and make a simple fragment shader that just returns whatever color the Inspector says instead of doing anything fancy.

Pixelization

Real quick I want to talk about my settings, and why they produce a pixelized image. All the pixelized trails you see are zoomed in on the Game window, and all smooth trails are from the Scene view. My settings treat one pixel as one unit in Unity’s object space, and I have not found any shader settings that prevent the pixelization from occurring in the Game view as intended. I may go back and update this later if I need to create larger pixel blocks, but for now this is what works for me.

UV color alteration

Simply put, a UV is a texture coordinate. In our trail renderer, UVs will range from 0 to 1 on both the X and Y axes. The coordinate system for the Trail Renderer looks like the following:

UV coordinates. Note that our shader’s colors supersede the Trail Renderer’s Color.

So, the front-to-back x coordinate of our trail renderer texture is 0 at the front and 1 at the back. In the Trail Renderer, the total distance this covers is determined by the “Time” variable in the Inspector and the object’s own movement. The y coordinate, which may appear top-to-bottom, is in fact clockwise-to-counterclockwise, and y = 0 will appear at the top instead of the bottom if the projectile is traveling to the left. Since our end goal is symmetrical with respect to the X axis we won’t have to worry about this, but other projectiles might. The total distance the y variable covers is determined by the “Width” variable, which can be made to vary over the length of the trail.

Great, so let’s get some cool colors in there! This is where the exciting math (wait, please don’t leave) comes in that lets us really determine what the final effect looks like. This will be done in the fragment shader part of the program. To start off, let’s make a gradual color difference between the center and extents of the Y axis. To do this, we’ll only fade between colors over a fraction N of the UV space, then multiply that range by 1/N so it spans the full 0-to-1 blend.

fixed4 frag(v2f i) : SV_Target
{
    float distY = abs(i.uv.y - .5) * 2;
    // _CenterColor is a second color property, declared like _Color above
    return _Color * distY + _CenterColor * (1 - distY);
}
Neato!
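To see what that math is doing, here is the same computation sketched on the CPU in Python, with hypothetical scalar stand-ins for the color channels (this is an illustration, not shader code):

```python
def dist_y(v):
    """Map UV y in [0, 1] to distance from the trail's center line, in [0, 1]."""
    return abs(v - 0.5) * 2

def blend(edge, center, t):
    """The shader's mix: t = 1 gives the edge color, t = 0 the center color."""
    return edge * t + center * (1 - t)

# Center line (y = 0.5) is pure center color; the edges are pure edge color.
print(blend(1.0, 0.2, dist_y(0.5)))   # 0.2 at the center
print(blend(1.0, 0.2, dist_y(1.0)))   # 1.0 at the edge
```

The `abs(... - 0.5) * 2` trick is what makes the gradient symmetric: both y = 0 and y = 1 map to full edge color.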

So far so good. But we should make these two colors more distinct. Additionally, let’s make the center part fade to a darker color. We’ll shrink the range of the color change even more and introduce a new color shift if we’re in the middle.

fixed4 frag(v2f i) : SV_Target
{
    float distY = abs(i.uv.y - .5) * 2;
    fixed4 c;
    if (distY > .7)
    {
        c = _OutsideColor;
    }
    else
    {
        if (i.uv.x > 0.8)
            c = _InsideFarColor;
        else if (0.8 >= i.uv.x && i.uv.x > 0.6)
            c = _InsideFarColor * ((i.uv.x - 0.6) * 5) + _InsideNearColor * (1 - (i.uv.x - 0.6) * 5);
        else
            c = _InsideNearColor;

        if (distY <= .7 && distY > 0.5)
            c = _OutsideColor * ((distY - 0.5) * 5) + c * (1 - (distY - 0.5) * 5);
    }
    return c;
}
Aw yeah. Those are some fine UV colors.
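The band logic is easier to test on the CPU. Here is a Python sketch of the same branching, again with scalar stand-ins for the colors (illustration only):

```python
def band_color(x, y, outside, far, near):
    """CPU sketch of the banded fragment logic above."""
    dist_y = abs(y - 0.5) * 2
    if dist_y > 0.7:               # outer band of the trail
        return outside
    if x > 0.8:                    # far (old) end of the inner trail
        c = far
    elif x > 0.6:                  # remap (0.6, 0.8] to (0, 1] and blend
        t = (x - 0.6) * 5
        c = far * t + near * (1 - t)
    else:                          # near (fresh) end of the inner trail
        c = near
    if dist_y > 0.5:               # soften the inner/outer boundary
        t = (dist_y - 0.5) * 5
        c = outside * t + c * (1 - t)
    return c
```

Each `* 5` is the 1/N renormalization from earlier: the blends happen over 0.2 of UV space, and 1/0.2 = 5.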

Alpha

We’ll have to add a couple of extra lines toward the top of the program to enable alpha blending:

SubShader
{
    Tags
    {
        "Queue" = "Transparent"
    }
    Blend SrcAlpha OneMinusSrcAlpha
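That Blend line tells the GPU to combine each fragment with what’s already in the frame buffer as out = srcAlpha * src + (1 - srcAlpha) * dst, the classic “over” blend. A quick Python sketch of the per-channel math (an illustration, not shader code):

```python
def alpha_blend(src, dst):
    """out.rgb = src.a * src.rgb + (1 - src.a) * dst.rgb (the 'over' blend)."""
    a = src[3]  # src is (r, g, b, a); dst is the (r, g, b) already on screen
    return tuple(a * s + (1 - a) * d for s, d in zip(src[:3], dst))

# A fully transparent fragment leaves the background pixel untouched,
# which is what lets our alpha-0 pixels "punch holes" in the trail.
```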

Great! Now, what we want is to gradually increase the proportion of fully transparent (alpha = 0) pixels as we approach x = 1 in UV space. First, we’ll need a pseudorandom number function that returns values between 0 and 1. Here’s what I use in the example:

float nrand(float2 uv)
{
    return frac(sin(dot(uv, float2(12.9898, 78.233))) * 43758.5453);
}
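This is a common one-liner hash seen in many shader tutorials. A direct Python port (for illustration only; desktop doubles are more precise than GPU floats, so exact values will differ) shows its key properties:

```python
import math

def nrand(u, v):
    """Python port of the shader hash: frac(sin(dot(uv, k)) * 43758.5453)."""
    s = math.sin(u * 12.9898 + v * 78.233) * 43758.5453
    return s - math.floor(s)  # frac(): keep only the fractional part

# Deterministic: the same input always yields the same value in [0, 1),
# while nearby inputs produce wildly different outputs.
```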

And when we call it, we’ll pass it the v2f.vertex value that we set to the camera-space position earlier. But wait, if we’re using the camera-space coordinate, isn’t that going to be completely deterministic? Yes it will! And this means that for any given pixel on screen, there will be a single point along the trail’s passage where it becomes transparent, giving the effect of many particles that each fade out on their own.

In order to make this transition faster and a bit more constrained, we’re going to raise uv.x to a power rather than using the raw value. This gives us the final file:

Shader "Tutorial/Final"
{
    Properties
    {
        // Color properties for material inspector
        _InsideFarColor("Inside Far Color", Color) = (0.2,0.2,1,1)
        _InsideNearColor("Inside Near Color", Color) = (0.5,1,1,1)
        _OutsideColor("Outside Color", Color) = (1,1,1,1)
        _MainTex("pixel", 2D) = "white" {}
    }
    SubShader
    {
        Tags
        {
            "Queue" = "Transparent"
        }
        Blend SrcAlpha OneMinusSrcAlpha
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            fixed4 _OutsideColor;
            fixed4 _InsideFarColor;
            fixed4 _InsideNearColor;
            sampler2D _MainTex;
            float4 _MainTex_ST;

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
                float4 color : COLOR;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
                float4 color : COLOR;
            };

            // vertex shader
            v2f vert(appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                o.color = float4(0, 0, v.uv.y, 1);
                return o;
            }

            // 0-to-1 pseudorandom hash
            float nrand(float2 uv)
            {
                return frac(sin(dot(uv, float2(12.9898, 78.233))) * 43758.5453);
            }

            // pixel shader
            fixed4 frag(v2f i) : SV_Target
            {
                float distY = abs(i.uv.y - .5) * 2;
                fixed4 c;
                if (distY > .7)
                {
                    c = _OutsideColor;
                    if (nrand(i.vertex.xy) > 1 - (i.uv.x * i.uv.x * i.uv.x))
                        c = float4(0, 0, 0, 0);
                }
                else
                {
                    if (i.uv.x > 0.8)
                        c = _InsideFarColor;
                    else if (0.8 >= i.uv.x && i.uv.x > 0.6)
                        c = _InsideFarColor * ((i.uv.x - 0.6) * 5) + _InsideNearColor * (1 - (i.uv.x - 0.6) * 5);
                    else
                        c = _InsideNearColor;

                    if (distY <= .7 && distY > 0.5)
                        c = _OutsideColor * ((distY - 0.5) * 5) + c * (1 - (distY - 0.5) * 5);

                    if (nrand(i.vertex.xy) > 1 - (i.uv.x * i.uv.x))
                        c = float4(0, 0, 0, 0);
                }
                return c;
            }
            ENDCG
        }
    }
}
Huzzah!
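One last bit of intuition: if nrand is roughly uniform on [0, 1), then the test `nrand(...) > 1 - pow(uv.x, n)` drops a pixel with probability uv.x to the nth power. The outer band uses the cube and the inner band the square, so at any given distance down the trail the inner region has the higher dropout chance. A small Python sketch of those dropout curves:

```python
def drop_chance(x, n):
    """Probability a pixel at trail position x (0 = head, 1 = tail) goes
    transparent, when the shader tests nrand(...) > 1 - x**n."""
    return x ** n

# Outer band uses n = 3, inner band n = 2: nothing drops at the head,
# everything drops at the tail, and the inner band thins out sooner.
```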

Thanks for reading!

If you thought this was cool, let me know somehow, maybe I’ll do more! Feedback of course is always welcome. What was great? What was terrible?

— Stephen Schroeder