Pushing the Boundaries of Compose Multiplatform with AGSL Shaders
Creating stunning graphics across platforms with Compose and Shaders
Compose Multiplatform has been gaining significant attention, especially since its official support was announced at the last Google I/O. However, it’s important to continue exploring new challenges in this evolving field. This led me to investigate the feasibility of using shaders in this context. In this article, I will demonstrate how to create shaders with AGSL and run them seamlessly across various platforms.
To illustrate this, I developed a simple app where shaders play a central role. The project, called Photo-FX, is available on GitHub (there is also a web-based live version here). The app allows you to select a photo and apply various effects to it, with the ability to adjust specific parameters of the chosen effect:
Shader’s anatomy 🧩
No need to worry — this isn’t a discussion about Shonda Rhimes’ award-winning TV show. Instead, we’re diving into the fascinating world of shaders and how to use them. Buckle up!
In simple terms, shaders are snippets of code that can be plugged into different stages of the graphics pipeline, where they are executed by the GPU. Depending on where they are inserted, their names and functions may vary slightly. There are several types, including fragment shaders, vertex shaders, geometry shaders, and tessellation shaders, among others.
We’ll focus on the first type: fragment shaders (also known as pixel shaders). These are particularly relevant because, as of Android 13, we can easily work with them and integrate them into Compose.
Fragment Shaders
To understand fragment shaders, it’s essential to first grasp the context in which they operate. Fragment shaders are GPU programs that run in parallel for every pixel in the screen buffer. Their primary function is to calculate the final colour of each pixel.
You can think of a fragment shader as a function with two key components:
- Input: The coordinates of the pixel being processed.
- Output: The colour determined for that specific pixel.
This process happens simultaneously for all pixels in the screen buffer. The diagram below illustrates this process for a few pixels in a screen buffer:
Programming language: AGSL
To write shaders, we must use a domain-specific programming language known as a shading language. Shading languages differ from traditional programming languages in a few key ways:
- Parallel execution: As previously mentioned, shading languages are designed to run on the GPU, enabling the execution of many threads in parallel.
- Data Types and Precision: Shading languages include specialised data types, such as vectors and matrices, and support different precision qualifiers to optimise performance and memory usage.
- Input and Output: Shaders typically receive inputs from the CPU and pass outputs to the next stage in the pipeline or the framebuffer, relying on the graphics API for communication.
- No Standard Library for General-Purpose Programming: Shading languages lack standard libraries for general-purpose tasks like file I/O, networking, or threading. Instead, they focus on mathematical and graphical operations.
The shading language we use for Android is AGSL (Android Graphics Shading Language), which is essentially the same as SKSL, the shading language defined by the Skia library. Skia is the backbone for all low-level graphics in Android and Compose Multiplatform.
AGSL differs in some aspects from GLSL, the shading language defined by the Khronos Group — an industry consortium that creates open standards for graphics, compute, and media. The Khronos Group is responsible for the development and maintenance of the OpenGL specification, which includes GLSL. Although the two languages share many similarities — such as syntax, statements, certain types, and built-in functions — the most significant difference between AGSL and GLSL lies in their coordinate systems. In GLSL, the origin is typically located in the bottom-left corner of the screen, whereas in AGSL, it is positioned in the top-left corner.
Let’s explore a simple fragment shader that fills the screen buffer with red:
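A minimal AGSL sketch of such a shader might look like this (the exact constant is illustrative — any opaque red would do):

```agsl
half4 main(float2 fragCoord) {
    half4 color = half4(1.0, 0.0, 0.0, 1.0); // opaque red: R, G, B, A
    return color;
}
```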
As you can see, AGSL is a C-like language, where every statement ends with a semicolon.

In line 1, we define the main function, our entry point into the shader. The function has a single input parameter representing pixel coordinates. We’ve named it fragCoord here, but you can use any name that suits you.

In line 2, we create a value of the half4 type, which represents four 16-bit floating-point values. This value stores the colour of the pixel we’re drawing — red, in this case.
The image below shows the result produced by this shader:
And that’s it — our first shader! This is just the beginning of our journey into the world of shaders!
Uniforms
To make things more interesting, we can provide additional data to influence the visuals we create. This data will be shared among all parallel executions of our shader that make up a single screen buffer. To define this shared data, we use the uniform keyword before the data type. The types of data we can send to shaders are limited to a few: integers, floats, and shaders. For instance, we can send the screen buffer dimensions to create a linear gradient along the x-axis, as shown in the snippet below:
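A sketch of this gradient shader, using the uniform names referenced in the walkthrough that follows:

```agsl
uniform float2 resolution; // screen buffer dimensions, set from the CPU side
uniform half4 colour;      // target colour of the gradient

half4 main(float2 fragCoord) {
    return (fragCoord.x / resolution.x) * colour;
}
```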
In line 1, we define a uniform variable named resolution of type float2, representing a vector containing two floats. In line 2, we define another uniform variable named colour, representing the RGBA values of the colour we’ll use to draw the gradient.

Line 5 is where the magic happens. Similar to the first shader we discussed, we return the colour for the current invocation. In this case, we calculate the ratio between fragCoord.x and resolution.x, mapping the current x-coordinate value to a range between 0.0 and 1.0. This value is then multiplied by the input colour to produce the new colour value, effectively drawing a gradient that extends horizontally from black to the input colour, as shown in the live sample below:
A slider has been added to allow you to modify the input colour value. Feel free to experiment with it!
Inside Compose Multiplatform ⚒️
Before diving into the APIs available for adding shaders to our Kotlin Multiplatform projects, let’s first examine the fundamental building blocks that enable Compose Multiplatform to function seamlessly. Consider the following diagram:
This diagram presents a simplified version of how Compose Multiplatform is structured, specifically tailored to the context of shaders. As shown, when targeting the Android platform, we rely on the Android SDK for shader support. For other platforms, we depend on Skia, the open-source 2D graphics library primarily developed and maintained by Google. Skia provides its capabilities by leveraging various native libraries, depending on the platform:
- iOS: Metal and CoreGraphics.
- Windows: Direct3D and GDI.
- macOS: OpenGL, Metal and CoreGraphics.
- Linux: OpenGL, X11 and Wayland.
- Web: WebAssembly and Canvas API.
This foundation allows Compose Multiplatform to render graphics efficiently across different environments, making it a powerful tool for creating visually rich applications.
Plugging Shaders into Composables
Returning to the Compose layer, we have a straightforward way to integrate shaders into composables. One particularly powerful tool is the graphicsLayer modifier, which allows us to apply various graphical transformations and effects directly to a composable component within the rendering pipeline. The version of this modifier that interests us involves providing a lambda function with the following signature:
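That overload of the modifier is declared roughly as:

```kotlin
fun Modifier.graphicsLayer(
    block: GraphicsLayerScope.() -> Unit
): Modifier
```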
Within this lambda, we can adjust any property of the GraphicsLayerScope receiver. Among the properties that allow us to modify scale, rotation, and translation, there’s one called renderEffect — and this is the key element for incorporating shaders into Compose!
Useful APIs
With an understanding of how Compose Multiplatform is structured and what we need to create shaders, we can now explore the APIs available for shader creation. As shown in the earlier diagram, creating shaders for Android requires methods from the Android SDK, while other platforms rely on Skia.
The list below summarises these Android SDK APIs for your convenience:
- RuntimeShader(): Creates a new RuntimeShader from input AGSL source code.
- RenderEffect.createShaderEffect() / RenderEffect.createRuntimeShaderEffect(): Create a new RenderEffect from input params.
- RenderEffect.asComposeRenderEffect(): Creates a Compose-compatible RenderEffect.
And this new list enumerates the Skia APIs:
- RuntimeEffect.makeForShader(): Creates a new RuntimeEffect from input AGSL source code.
- RuntimeShaderBuilder(): Creates a new RuntimeShaderBuilder from an input RuntimeEffect.
- ImageFilter.makeShader() / ImageFilter.makeRuntimeShader(): Create a new ImageFilter from input params.
- ImageFilter.asComposeRenderEffect(): Creates a Compose-compatible RenderEffect from an ImageFilter.
These API methods will enable us to create shaders using the appropriate tools for each target platform. The following diagram illustrates how to chain these calls to correctly provide the required renderEffect inside the graphicsLayer modifier:
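On Android, for example, the chain can be sketched as follows (the uniform name "content" and the agslSource variable are placeholders):

```kotlin
// AGSL source -> RuntimeShader -> android.graphics.RenderEffect -> Compose RenderEffect
val runtimeShader = RuntimeShader(agslSource)
val composeEffect = RenderEffect
    .createRuntimeShaderEffect(runtimeShader, "content")
    .asComposeRenderEffect()

// Finally, plug it into a composable through graphicsLayer:
Box(modifier = Modifier.graphicsLayer { renderEffect = composeEffect })
```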
You may have noticed that after creating the RuntimeEffect or RuntimeShader, there are always two methods available for the next step in both contexts. In each case, the second method creates a RenderEffect or ImageFilter that processes the content of the underlying composable. This allows us to use the graphic data within the shader in any creative way we can imagine — a crucial aspect of the Photo-FX project, as we’ll explore later.
And what about uniforms? How can we specify values for them? This, too, depends on the platform:
- Android SDK: RuntimeShader provides methods like setIntUniform and setFloatUniform, which take a string representing the name from the AGSL code and the corresponding value.
- Skia: Uniforms are specified through RuntimeShaderBuilder using its uniform method. The signature is similar to that of the Android SDK: a string plus a typed value.
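For illustration, setting a single float uniform on each platform might look like this (a sketch, assuming agslSource declares uniform float intensity):

```kotlin
// Android SDK: uniforms are set directly on the RuntimeShader.
val androidShader = RuntimeShader(agslSource)
androidShader.setFloatUniform("intensity", 0.5f)

// Skia: uniforms are set through the RuntimeShaderBuilder.
val builder = RuntimeShaderBuilder(RuntimeEffect.makeForShader(agslSource))
builder.uniform("intensity", 0.5f)
```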
Hands-on with Shaders in Compose ✍️
With all the pieces in place, we can now create shaders and apply them to any composable we choose. For example, the following Android-based code snippet demonstrates how to create a shader that tints composable content with a red colour:
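A sketch of that snippet, reconstructed to match the walkthrough that follows (the mix factor and the red tint value are assumptions):

```kotlin
@Language("AGSL")
private val TINT_SHADER = """
    uniform shader content;
    uniform half4 tint;
    half4 main(float2 fragCoord) {
        return mix(content.eval(fragCoord), tint, 0.5);
    }
""".trimIndent()

@Composable
fun SimpleShader() {
    Box(
        modifier = Modifier.graphicsLayer {
            clip = true
            renderEffect = RenderEffect
                .createRuntimeShaderEffect(
                    RuntimeShader(TINT_SHADER).apply {
                        // tint is a half4, so we pass four floats (opaque red).
                        setFloatUniform("tint", 1.0f, 0.0f, 0.0f, 1.0f)
                    },
                    "content", // uniformShaderName: receives the composable's content
                )
                .asComposeRenderEffect()
        },
    ) {
        Text("Hello World!")
    }
}
```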
Let’s break down this simple shader code. In line 1, the variable that holds the AGSL code is annotated with @Language, which helps the IDE correctly interpret and highlight the content of that string as AGSL.

In lines 3 and 4, we declare the uniforms needed in our shader. Finally, in line 6, the core functionality is implemented in a single line. By calling the mix method, we combine two colours: the colour evaluated from the content shader and the input tint value. That’s all there is to our shader.
Now, let’s focus on the glue code within the SimpleShader composable. We call the method createRuntimeShaderEffect, which requires an additional parameter named uniformShaderName. This corresponds to the uniform declared in the AGSL code that will receive a shader representing the content of the composable where we’re applying this effect — in our case, the Box holding the “Hello World!” text.
You’ll probably agree that this code is not the most elegant. Numerous instances are created within a few lines, all of which need to be combined correctly. Repeating this process multiple times can easily lead to errors. Moreover, this code is specific to the Android SDK, which limits its use in a Multiplatform context. So, let’s explore a better approach for working with shaders.
Designing a Multiplatform Shader API ✨
What should our ideal API look like? Here are some essential features:
- It should provide a single entry point for applying shaders to our composables.
- It should offer a robust mechanism for setting the values of shader uniforms.
- It should work seamlessly within a Multiplatform project.
Fortunately, we have specific mechanisms that address each of these requirements:
- We can implement a new Compose Modifier that serves as the entry point for shaders. This modifier can accept the AGSL code as a string and handle all the necessary API calls internally.
- Additionally, the new Modifier can include a scoped lambda that offers methods specifically for setting the values of uniforms.
- Lastly, the mechanism that enables smooth operation in a Multiplatform context: expect/actual.
Show me the code
First, let’s explore the modifiers we’re going to implement:
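In Photo-FX they are declared along these lines (parameter names are close to, but may not exactly match, the repository):

```kotlin
expect fun Modifier.shader(
    shader: String,
    uniformsBlock: (ShaderUniformProvider.() -> Unit)? = null,
): Modifier

expect fun Modifier.runtimeShader(
    shader: String,
    uniformShaderName: String,
    uniformsBlock: (ShaderUniformProvider.() -> Unit)? = null,
): Modifier
```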
Remember how we had two different methods for creating shaders in the Android SDK, as well as in Skia? That’s why we have two modifiers here as well. In both cases, the AGSL source code is passed as the first parameter. Additionally, an optional lambda allows us to set the values of the uniforms by calling methods exposed by the ShaderUniformProvider interface, which we’ll discuss shortly. The runtimeShader modifier also includes an extra string parameter for specifying the name of the shader that will receive the composable’s framebuffer.
These modifiers are declared with the expect keyword, meaning the actual implementation is platform-specific. If this is your first encounter with the expect/actual mechanism, I recommend checking out the official documentation here.
“Won’t somebody please think of the uniforms?” If you’re a fan of The Simpsons, you’ll recognize this as a twist on Helen Lovejoy’s famous plea, “Won’t somebody please think of the children?” Well, just like Helen, I’m here to advocate — this time, for uniforms! That’s why I’ve decided to provide an interface for setting their values:
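A sketch of that interface (the exact set of overloads in the Photo-FX repository may differ slightly):

```kotlin
interface ShaderUniformProvider {
    fun uniform(name: String, value: Int)
    fun uniform(name: String, value: Float)
    fun uniform(name: String, value1: Float, value2: Float)
}
```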
You might also need additional methods for passing colours or other value types, but for the purposes of Photo-FX, these were sufficient.
Skia implementation
We’ve already seen some code for creating shaders in Android, so it shouldn’t be too challenging to adapt that code to implement our new modifiers using Skia. Here’s how the Skia API is utilised:
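Here is a sketch of the Skia-side actual for runtimeShader, matching the breakdown that follows. ShaderUniformProviderImpl is a hypothetical helper that forwards uniform calls to the builder, and the exact skiko overloads may differ slightly:

```kotlin
actual fun Modifier.runtimeShader(
    shader: String,
    uniformShaderName: String,
    uniformsBlock: (ShaderUniformProvider.() -> Unit)?,
): Modifier = this then composed {
    // Remember the potentially expensive objects across recompositions.
    val runtimeShaderBuilder = remember {
        RuntimeShaderBuilder(RuntimeEffect.makeForShader(shader))
    }
    val shaderUniformProvider = remember {
        ShaderUniformProviderImpl(runtimeShaderBuilder)
    }
    graphicsLayer {
        clip = true
        renderEffect = ImageFilter
            .makeRuntimeShader(
                runtimeShaderBuilder.apply {
                    uniformsBlock?.invoke(shaderUniformProvider)
                },
                uniformShaderName,
                null, // null input: the composable's own content feeds the shader
            )
            .asComposeRenderEffect()
    }
}
```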
Let’s break down the final implementation.
The new modifier combines the incoming Modifier with a composed one. This approach allows us to efficiently remember the RuntimeShaderBuilder and ShaderUniformProvider instances, which are potentially expensive objects that we should avoid recreating during each recomposition.

Additionally, since skippability is crucial in Compose, by providing only the AGSL source code as a string and a lambda for setting uniform values, both parameters can be treated as immutable. This means the modifier itself won’t trigger a recomposition unless necessary. The only part that may be re-invoked is the graphicsLayer modifier, which is intentional — it allows us to update the uniform values used by the shader, enabling interaction from outside the modifier.

Within the graphicsLayer, we also set the clip property to true to ensure that drawing does not extend beyond the boundaries of the composable to which the shader is applied.
Creating shaders for Photo-FX
Now that we have new modifiers to work with shaders, it’s time to apply them to something tangible. Our goal with Photo-FX is to use shaders to apply some visually appealing effects to images. To keep things simple for this introductory article, we’ve decided to implement just three effects:
- Vignetting: learn more.
- Smooth pixelation: Similar to pixelation, but using a sine wave for modulation.
- Chromatic aberration: learn more.
All three effects involve basic pixel manipulation techniques, such as darkening or altering RGB channels. We’ll focus on the details behind the vignetting effect, leaving the others for those interested in exploring the Photo-FX source code.
Vignetting effect
The vignetting effect darkens the edges of an image, drawing attention toward the centre. It creates a subtle gradient from the centre outward, where the shading intensity increases towards the edges.
This effect can be easily reproduced with shaders using the following code:
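The following AGSL is a reconstruction consistent with the line-by-line breakdown below:

```agsl
uniform float2 resolution;
uniform shader content;
uniform float intensity;
uniform float decayFactor;

half4 main(float2 fragCoord) {
    float2 uv = fragCoord / resolution;
    half4 color = content.eval(fragCoord);
    uv *= 1.0 - uv.yx;
    float vig = clamp(uv.x * uv.y * intensity, 0.0, 1.0);
    vig = pow(vig, decayFactor);
    return half4(color.rgb * vig, color.a);
}
```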
In this shader, we define four uniforms:

- resolution: represents the resolution of the content being processed.
- content: this is the content or image on which the shader is applied.
- intensity: controls how strong the vignetting effect is. A higher intensity will make the vignetting effect more pronounced.
- decayFactor: controls how quickly the vignetting effect decays towards the edges of the screen. A higher value will cause a steeper decay.
Let’s move now to the body of the shader:

- Line 7: we normalise the coordinates to a range of [0, 1] based on the resolution value.
- Line 8: here we evaluate the content shader at the current fragment coordinate, retrieving the colour of the pixel from the content. The result is stored in the color variable.
- Line 9: in this line we essentially distort the uv coordinates, which will later be used to calculate the vignette effect.
- Line 10: the vignetting factor vig is calculated by multiplying the x and y components of uv, scaled by the intensity uniform. The clamp function ensures that the resulting value stays within the range [0, 1]. This factor determines how much the colour will be darkened based on the distance from the centre.
- Line 11: in this line we adjust the vignetting factor by raising it to the power of decayFactor, which controls the rate at which the vignette effect decays as you move from the centre to the edges of the screen. A higher decayFactor will make the vignette effect drop off more sharply.
- Line 12: finally, we return the modified colour. The RGB components of the original colour are multiplied by the vignetting factor (vig), which darkens the pixel based on its position relative to the centre. The alpha component (color.a) is passed through unchanged.
Next, we integrate this shader into the Photo-FX app by defining a new Modifier using the runtimeShader modifier, as shown below:
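A sketch of such a wrapper (VIGNETTING_SHADER is a hypothetical constant holding the vignette AGSL source, and the resolution uniform is assumed to be filled in by the modifier implementation itself):

```kotlin
// "content" names the uniform shader that receives the composable's pixels.
fun Modifier.vignetting(
    intensity: Float,
    decayFactor: Float,
): Modifier = runtimeShader(VIGNETTING_SHADER, "content") {
    uniform("intensity", intensity)
    uniform("decayFactor", decayFactor)
}
```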
Job done! We can now enjoy our new shader effect in Photo-FX:
Conclusion
While working with shaders can seem daunting at first, I hope that the foundations shared in this article have made the process less intimidating. Using shaders in Compose Multiplatform is particularly exciting. Embracing new technologies and paradigms is what drives many of us to continue evolving as developers, and learning how shaders work — and even writing your own — is a valuable investment in your skills.
If you’re interested in continuing your shader education, here are some recommended resources:
- Explore Shadertoy, an incredible platform where you can discover amazing creations by talented individuals and start writing your own shaders.
- Another useful resource is the Skia Shaders Playground, where you can plug in your AGSL code without any change. Remember that GLSL and AGSL have different coordinate systems.
- Check out the fantastic video tutorials from The Art of Code on YouTube: https://www.youtube.com/@TheArtofCodeIsCool
- For more advanced topics, I highly recommend the content from Iñigo Quilez, the creator of Shadertoy. Visit his website: Iñigo Quilez’s Articles and his YouTube channel: Iñigo Quilez on YouTube.
Additionally, here are the links to the Photo-FX repository on GitHub and the embedded diagrams I’ve used to make the explanation clear (check out my previous article to learn how to embed Compose into Medium articles):
- Photo-FX — web version: https://manuel-martos.github.io/Photo-FX/
- Photo-FX: https://github.com/manuel-martos/Photo-FX
- Photo-FX-Materials: https://github.com/manuel-martos/Photo-FX-Materials
Thanks for reading to the end!