Pixel Shaders in Android App Development

Nezih Yılmaz
Published in Mobillium · 10 min read · Nov 15, 2023

Pixel Shaders, also referred to as Fragment Shaders, are specialized programs that execute on the GPU (Graphics Processing Unit). Their primary function is to manipulate and adjust the color of each pixel efficiently before these pixels are rendered on the screen. These shaders are a staple in game development, where they are employed to create a variety of visual effects, ranging from simple color changes in a View to implementing complex lighting environments. In this article, we will explore the application of Shaders in regular Android applications, demonstrating their versatility beyond gaming contexts.

Android 13 RuntimeShader

Shaders and hardware acceleration are not new concepts in the Android ecosystem. Renderings such as a gradient color or a ripple click effect make use of them under the hood, and we have been using them for quite some time. However, these are predefined rendering operations: they are limited, and each of them serves only a specific use case. With the addition of RuntimeShader in Android 13, we have gained the ability to inject our own custom shader functions directly from our high-level application code. Let’s take a look at how we can do this.

Using RuntimeShader in App Development

RuntimeShader works equally well in both the traditional View system and Jetpack Compose. For its conciseness, we will use Jetpack Compose in the following examples.

Using a RuntimeShader is pretty simple. The RuntimeShader constructor takes a String as input: the actual shader function. Once you create a RuntimeShader object, you can apply it to your Views / Composables as a RenderEffect, or anywhere else a Shader object is accepted.

We are going to start with a straightforward shader function that applies a solid blue color:

val SolidColor = RuntimeShader("""
    vec4 main(vec2 fragCoord) {
        return vec4(0.0, 0.0, 1.0, 1.0);
    }
""")

Shader functions are written in AGSL (Android Graphics Shading Language), which is almost identical to GLSL except for some minor differences. Let’s see what’s going on in this shader String.

We have the main function that will be called for each pixel in the affected area.

The input parameter vec2 fragCoord is a 2D float vector that represents the position of a pixel on the screen. In this context, the pixel coordinate system’s origin aligns with that of the Android Canvas, located at the top left corner of the screen.

The return type vec4 is a 4D float vector that denotes the color of the pixel. This vector’s four components correspond to the red, green, blue, and alpha (transparency) channels of the color.
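For illustration, here is a small plain-Kotlin sketch (not from the article’s codebase; the helper name argbToVec4 is hypothetical) that unpacks a packed ARGB color Int into the four normalized floats such a vec4 is made of:

```kotlin
// Hypothetical helper: unpack a packed ARGB Int into the normalized
// [red, green, blue, alpha] floats that a vec4 color consists of.
fun argbToVec4(color: Int): List<Float> {
    val a = ((color ushr 24) and 0xFF) / 255f
    val r = ((color ushr 16) and 0xFF) / 255f
    val g = ((color ushr 8) and 0xFF) / 255f
    val b = (color and 0xFF) / 255f
    return listOf(r, g, b, a)
}
```

For example, argbToVec4(0xFF0000FF.toInt()) yields [0.0, 0.0, 1.0, 1.0], the same fully opaque blue our shader returns.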

In this shader example, the return value vec4(0.0, 0.0, 1.0, 1.0) consistently represents a fully opaque blue color for every pixel. Let’s apply this shader to a Box Composable and observe the outcome.

Box(modifier = Modifier
    .size(200.dp)
    .background(
        brush = ShaderBrush(SolidColor),
        shape = RoundedCornerShape(30.dp)
    )
)

The shader is applied to the Box as its background brush. When we run this code, the result is a Box painted in a solid blue color, as expected.

Solid Color Shader Output

Uniform Values

A RuntimeShader can be given constant values as input before it runs. These are called uniforms. We will use a uniform value representing the solid color output in our next example.

val SolidColor = RuntimeShader("""
    uniform vec4 color;

    vec4 main(vec2 fragCoord) {
        return color;
    }
""")

Uniform constants in shader code are declared at the top level, outside the main function. We have now declared a uniform of the type vec4, which represents a color. Unlike before, where our shader code outputted a hardcoded blue color for each pixel, it now returns the color defined by this vec4 uniform. This color must be supplied at runtime.

To set uniform values, we use the appropriate RuntimeShader setter function, which varies depending on the type of the uniform. Since our uniform is of the vec4 type (a set of four Float values), we will use RuntimeShader::setFloatUniform(uniformName: String, value1: Float, value2: Float, value3: Float, value4: Float) in this case:

Box(modifier = Modifier
    .size(200.dp)
    .background(
        brush = ShaderBrush(
            SolidColor.apply {
                // Set the uniform vec4 to a value that represents the color red.
                setFloatUniform(
                    "color", // The uniformName parameter has to match the declared uniform name.
                    1f,
                    0f,
                    0f,
                    1f
                )
            }
        ),
        shape = RoundedCornerShape(30.dp)
    )
)
Solid Color Shader Output

A Real Use Case

To keep things simple, we have been working with solid color outputs and avoiding complex calculations, so using a shader for these cases would not be practical in the real world. Now, let’s look into something a bit more complex. Next, we will develop a Shader that generates a horizontal gradient using two distinct colors.

val GradientColor = RuntimeShader("""
    uniform vec4 startColor; // start color of gradient
    uniform vec4 endColor;   // end color of gradient
    uniform float width;     // total width of affected area

    vec4 main(vec2 fragCoord) {
        // Normalized horizontal position of the current pixel.
        float horizontalFraction = fragCoord.x / width;

        // Linearly interpolate between the two colors based on the
        // normalized horizontal position.
        vec4 gradientColor = mix(startColor, endColor, horizontalFraction);

        // Set the output pixel color.
        return gradientColor;
    }
""")

The horizontalFraction ranges from 0.0 to 1.0, where 0 represents the leftmost position and 1 the rightmost position. It is calculated by dividing the absolute X position by the total width. This process, known as position normalization, is essential for the subsequent steps.

The mix() function is a built-in method that performs linear interpolation between two values, based on a third parameter that represents the progress percentage. By using the current horizontalFraction as the progress parameter for linear interpolation, the mix() function will yield varying colors for each point along the horizontal axis. This results in the creation of a gradient color effect.
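To make the interpolation concrete, here is a plain-Kotlin sketch (illustrative only, not part of the article’s codebase) of what mix() computes, applied per color channel:

```kotlin
// Plain-Kotlin sketch of AGSL's mix(): linear interpolation between
// start and end, where t = 0.0 yields start and t = 1.0 yields end.
fun mix(start: Float, end: Float, t: Float): Float = start + (end - start) * t

// Applied to each of the four channels, this produces the gradient color
// for a given normalized horizontal position t.
fun mixColor(start: List<Float>, end: List<Float>, t: Float): List<Float> =
    start.zip(end) { s, e -> mix(s, e, t) }
```

At the horizontal midpoint (t = 0.5), red (1, 0, 0, 1) and blue (0, 0, 1, 1) blend into [0.5, 0.0, 0.5, 1.0], a purple, which is exactly what you see halfway across the gradient.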

The shader is then applied to a Composable, similar to previous instances, but this time it incorporates more than one uniform value:

//...
.background(
    brush = ShaderBrush(GradientColor.apply {
        setFloatUniform(
            "startColor",
            1f, 0f, 0f, 1f
        )
        setFloatUniform(
            "endColor",
            0f, 0f, 1f, 1f
        )
        setFloatUniform(
            "width",
            with(LocalDensity.current) { 200.dp.toPx() }
        )
    })
)
Gradient Shader Output

RenderEffect

Previously, we have applied RuntimeShader to Composables as their background brush, which then rendered the background for the Composable. Additionally, RuntimeShader can be used as a Render Effect on a Composable. In this application, the shader post-processes the entire graphic layer of the Composable, including its children.

Box(modifier = Modifier
    .graphicsLayer {
        renderEffect = RenderEffect
            .createRuntimeShaderEffect(
                HighlightEffect,
                "inputTexture"
            )
            .asComposeRenderEffect()
    }
) {
    // Child composables declared in this Box scope will also be affected
    // by the Render Effect along with the main Box itself.
}

A RenderEffect can be created with the RenderEffect::createRuntimeShaderEffect(shader: RuntimeShader, uniformShaderName: String) function. It is then assigned to the renderEffect field within a Composable’s graphicsLayer Modifier scope.

Let’s take a look at the createRuntimeShaderEffect function parameters:

The first parameter, HighlightEffect, is a RuntimeShader that we will develop in the next step. This shader is designed to convert the entire texture of the Composable to grayscale, except for a specifically defined rectangular area. By excluding this region from the grayscale conversion, it effectively becomes ‘highlighted’, which is the rationale behind the name.

The second parameter, "inputTexture", refers to the uniform name that the Android System will provide to our shader, as declared in the first parameter. Within the shader code, this uniform will represent the Composable’s original texture before any processing. It is included to enable manipulation of the existing textures.

Highlight Effect Shader:

val HighlightEffect = RuntimeShader("""
    // Input texture. Its name has to match the one declared during
    // creation of the render effect.
    uniform shader inputTexture;
    uniform vec2 rectPosition; // Origin coordinate of the excluded rectangle
    uniform vec2 rectSize;     // Size of the excluded rectangle
    uniform float grayscaleIntensity;

    vec4 main(vec2 fragCoord) {
        // Get the color from the original input texture
        vec4 originalColor = inputTexture.eval(fragCoord);

        // Check if the current fragment is inside the defined rectangle
        if (fragCoord.x >= rectPosition.x &&
            fragCoord.x <= rectPosition.x + rectSize.x &&
            fragCoord.y >= rectPosition.y &&
            fragCoord.y <= rectPosition.y + rectSize.y) {
            // If inside the rectangle, use the original color
            return originalColor;
        } else {
            // If outside the rectangle, blend toward grayscale
            float gray = dot(originalColor.rgb, vec3(0.299, 0.587, 0.114));
            vec3 grayscaleColor = mix(originalColor.rgb, vec3(gray), grayscaleIntensity);
            return vec4(grayscaleColor, originalColor.a);
        }
    }
""")

In the shader code provided above, we have the inputTexture uniform. As previously mentioned, this uniform represents the original texture of the Composable being affected, and its value is automatically set by the Android System. It’s important to note that the type of inputTexture is also a shader.

We also encounter a new function: eval(vec2 coordinates). This function is a method of the shader type. When given a vec2 coordinate, it returns the color value at that coordinate as a vec4. We use it to obtain the color of the input texture.

The processing logic in the code is structured as follows:

· Retrieve the original color at the current coordinates.
· Check whether the current coordinates fall within the bounds of the excluded rectangle.
· If the coordinates are within the excluded bounds, return the original color; there is no grayscale conversion at this coordinate.
· If the coordinates are outside the excluded rectangle, convert the original color to grayscale. Here the grayscaleIntensity uniform controls the strength of the conversion: a value of 0.0 indicates no grayscale effect, while 1.0 represents full grayscale. We will vary this value on each render cycle to animate the grayscale effect.
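The grayscale branch can be sketched in plain Kotlin (illustrative only, not part of the article’s codebase): compute the luminance with the same 0.299 / 0.587 / 0.114 weights used in the dot product, then blend each channel toward it by the intensity.

```kotlin
// Plain-Kotlin sketch of the shader's grayscale step: a weighted sum
// gives the luminance, and each channel is mixed toward that gray by
// the intensity factor.
fun toGrayscale(rgb: List<Float>, intensity: Float): List<Float> {
    val gray = rgb[0] * 0.299f + rgb[1] * 0.587f + rgb[2] * 0.114f
    return rgb.map { channel -> channel + (gray - channel) * intensity }
}
```

At intensity 0f the color passes through unchanged; at 1f every channel collapses to the luminance value, producing a pure gray. Intermediate intensities give the partially desaturated frames seen during the animation.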

Now, let’s set the remaining uniform values in our RuntimeShader and observe the outcome:

// Animate intensity value.
val intensity = animateFloatAsState(
    targetValue = if (highlightOn.value) 1f else 0f,
    animationSpec = tween(durationMillis = 1000, easing = LinearEasing)
)

//...
Box(modifier = Modifier
    .graphicsLayer {
        HighlightEffect.setFloatUniform(
            "rectPosition",
            rectPos.value.x,
            rectPos.value.y
        )
        HighlightEffect.setFloatUniform(
            "rectSize",
            rectSize.value.width,
            rectSize.value.height
        )
        HighlightEffect.setFloatUniform("grayscaleIntensity", intensity.value)
        renderEffect = RenderEffect
            .createRuntimeShaderEffect(
                HighlightEffect,
                "inputTexture"
            )
            .asComposeRenderEffect()
    }
) {
    // Child composables declared in this Box scope will also be affected
    // by the Render Effect along with the main Box itself.
}

As anticipated, the entire graphic layer of the Composable is converted to grayscale, except the floating action button area.

Next, we’ll explore another RenderEffect example: the Pixelation effect. Pixelation is a common graphic effect used to obscure the original input for various reasons, such as concealing spoilers or sensitive information.

In technical terms, pixelation involves transforming the input color matrix into larger blocks of solid color. These blocks, while obscuring finer details, should still retain a resemblance to the original image. Therefore, the color of these blocks is not chosen randomly. Additionally, we aim to provide a mechanism to reverse the effect when necessary and reveal the original image through a horizontal animation. Below is the shader code for the Pixelation effect, accompanied by comments on each line to elucidate their function.

val PixelateEffect = RuntimeShader("""
    // Original texture of the Composable
    uniform shader inputTexture;
    // Configurable size of the solid color pixelation blocks
    uniform float blockSize;
    // Fraction of the total width that ignores the pixelation effect.
    // This value drives the horizontal reveal.
    uniform float horizontalProgress;
    // Total width and height of the affected Composable, stored in a 2D vector
    uniform vec2 resolution;

    vec4 main(vec2 fragCoord) {
        // Normalized horizontal position of the current coordinate
        float horizontalFraction = fragCoord.x / resolution.x;

        // If the current position is to the left of the reveal progress,
        // skip the rest of the calculation and return the original color
        if (horizontalFraction < horizontalProgress) {
            return inputTexture.eval(fragCoord);
        }

        // Snap the coordinate to the top-left corner of its pixel block
        vec2 blockOrigin = floor(fragCoord / blockSize) * blockSize;

        // Sample the input texture at the center of the block
        vec4 color = inputTexture.eval(blockOrigin + vec2(blockSize) * 0.5);

        return color;
    }
""")
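The block-snapping step is the heart of the effect, and it can be sketched in plain Kotlin (illustrative only, not part of the article’s codebase):

```kotlin
import kotlin.math.floor

// Plain-Kotlin sketch of the shader's block snapping:
// floor(coord / blockSize) * blockSize maps every coordinate inside a
// block to the block's origin, so all pixels in that block end up
// sampling the same texture location and share one solid color.
fun snapToBlock(coord: Float, blockSize: Float): Float =
    floor(coord / blockSize) * blockSize
```

With an 18-pixel block, every coordinate from 0 up to (but not including) 18 snaps to 0f, and every coordinate from 18 up to 36 snaps to 18f. This many-to-one mapping is what collapses fine detail into uniform blocks.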

Let’s set the uniform values of this RuntimeShader, apply it as a RenderEffect on an Image Composable, and see the result:

val pixelate = remember { mutableStateOf(false) }
val resolution = remember { mutableStateOf(Size(0f, 0f)) }
val blockSize = 18.dp
val horizontalProgress = animateFloatAsState(
    // Reveal progress: 0f is fully pixelated, 1f is fully revealed.
    targetValue = if (pixelate.value) 0f else 1f,
    animationSpec = tween(durationMillis = 3000)
)
//...
Image(modifier = Modifier
    .fillMaxWidth()
    .height(300.dp)
    // Capture the measured size so the shader knows the affected resolution.
    .onSizeChanged { resolution.value = it.toSize() }
    .graphicsLayer {
        PixelateEffect.setFloatUniform(
            "resolution",
            resolution.value.width,
            resolution.value.height
        )
        PixelateEffect.setFloatUniform("blockSize", blockSize.toPx())
        PixelateEffect.setFloatUniform(
            "horizontalProgress",
            horizontalProgress.value
        )
        renderEffect = RenderEffect
            .createRuntimeShaderEffect(
                PixelateEffect,
                "inputTexture"
            ).asComposeRenderEffect()
        // Clipping is required to limit the shading area to the bounds
        // of this Composable.
        clip = true
    }
    .clickable { pixelate.value = !pixelate.value }
)

//...

When we run this, the result is a smooth pixelation effect made of 18.dp blocks.

In conclusion, RuntimeShader is a significant step forward for Android thanks to the flexibility it provides, and it’s an invaluable asset for any detail-oriented product.

While AGSL/GLSL and shader coding might initially seem daunting to an Android app developer, it’s important to remember that shaders have been a staple in graphics for a long time. Consequently, there is a wealth of inspiration and assistance available online for those looking to create custom effects.

The code related to this article is available on GitHub.
