Unity WebGL — Making an Efficient Typewriter Effect

Christophe SAUVEUR
5 min read · Sep 22, 2018

For the past two years, we’ve been working at Cheese Burgames on several Unity WebGL e-learning games. Those games, distributed through an online learning platform used by US schoolchildren, were developed with a very specific target device constraint: iPads and Chromebooks.

Although those devices are perfectly capable of running heavy 3D games as native apps, it is another story when WebGL comes in (and yet another chapter with Unity WebGL). The performance gap between your desktop workstation and the deployment device is much larger.

So, we had to find… creative solutions.

First stop: Why Unity WebGL?

Unity is the main tool at Cheese Burgames since we founded the company. When the offer to work for this new online learning platform was brought to our attention, the challenge seemed thrilling and we immediately took the opportunity.

WebGL logo

At that time, the SDK for the platform was provided only as a Unity package. Unity WebGL was not flagged as EXPERIMENTAL anymore, but WebGL evolves fast. Using Unity gave us stable middleware without having to worry (too much) about compatibility or browser support. Unity’s WebGL implementation has its own constraints, but it was a pretty safe bet.

Test Often and Early

We knew we wanted to display dialogs progressively, like a typewriter would print characters sequentially on paper. So, pretty early in the process, we built a prototype of the first gameplay phase, including our dialog system.

In order to achieve this kind of text animation, we used DOTween and its DOText() shortcut.

DOTween’s DOText() shortcut sample code
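The snippet above was embedded as a gist in the original post. Here is a minimal sketch of that first approach, with illustrative names (dialogText, charactersPerSecond) chosen for this example: DOText() simply tweens the string assigned to the Text component over time.

```csharp
using DG.Tweening;
using UnityEngine;
using UnityEngine.UI;

// Sketch of the initial typewriter attempt: DOText() progressively assigns
// longer and longer substrings to the Text component.
public class TypewriterWithDOText : MonoBehaviour
{
    [SerializeField] private Text dialogText;
    [SerializeField] private float charactersPerSecond = 30f;

    public void Play(string line)
    {
        dialogText.text = string.Empty;
        float duration = line.Length / charactersPerSecond;
        dialogText.DOText(line, duration).SetEase(Ease.Linear);
    }
}
```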

And performance was terrible: about 10 FPS on average. After some profiling, we discovered that the problem came from the dialog text components. Rebuilding the mesh every time new characters were added to the Text component during the animation decreased the frame rate dramatically.

The naive solution

Since adding new characters rebuilt the entire text mesh, we tried using the rich text color feature to make the not-yet-revealed characters transparent instead. Here’s a sample using DOTween’s DOVirtual.Float() function.

DOTween’s DOVirtual.Float() sample code
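As above, the original code was an embedded gist. The sketch below illustrates the idea under the same illustrative assumptions: a float is tweened from 0 to 1 and, on every update, the text is rebuilt with a fully transparent rich text color tag wrapped around the characters that are not yet revealed.

```csharp
using DG.Tweening;
using UnityEngine;
using UnityEngine.UI;

// Sketch of the "naive" attempt: hide the remaining characters with a
// transparent <color> tag instead of appending them one by one. Note that
// the text property is still reassigned on every update.
public class TypewriterWithRichText : MonoBehaviour
{
    [SerializeField] private Text dialogText;
    [SerializeField] private float charactersPerSecond = 30f;

    public void Play(string line)
    {
        float duration = line.Length / charactersPerSecond;
        DOVirtual.Float(0f, 1f, duration, progress =>
        {
            int visibleCount = Mathf.FloorToInt(line.Length * progress);
            dialogText.text = line.Substring(0, visibleCount)
                + "<color=#00000000>" + line.Substring(visibleCount) + "</color>";
        }).SetEase(Ease.Linear);
    }
}
```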

However, as you may have already guessed, it didn’t work! The fact is, the character mesh is regenerated each time the text property is assigned a new value.

Our Efficient Way

We had to dig inside Unity UI’s Text component mesh generation to understand how it works. Like many, if not all, game engines, Unity generates a mesh composed of one quad per character of the text you’re displaying. It then uses a texture generated from your font to draw the characters onto these quads.

Character quads

So we are dealing here with the usual triad: mesh + material + shader. From there, the question is: what if we could alter the mesh generation process to attach additional information to each character?

Setting this up

Our goal here is to give each displayed character a kind of index. With that information, a custom shader can avoid drawing any part of the mesh past a specific index.

The best and easiest way to pass this new information is through the second UV channel (UV1). By default, Canvas components expose only the first UV channel to shaders, so we have to set the Additional Shader Channels parameter of the Canvas component to include TexCoord1.

Canvas > Additional Shader Channels > TexCoord1
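The Inspector checkbox is all you need, but for reference, the same setting can also be enforced from code. A small sketch, assuming a helper component placed on the Canvas GameObject:

```csharp
using UnityEngine;

// Enables the TexCoord1 additional shader channel on the Canvas at runtime,
// so UV1 data written by mesh modifiers reaches the shader.
[RequireComponent(typeof(Canvas))]
public class EnableTexCoord1 : MonoBehaviour
{
    private void Awake()
    {
        Canvas canvas = GetComponent<Canvas>();
        canvas.additionalShaderChannels |= AdditionalCanvasShaderChannels.TexCoord1;
    }
}
```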

Altering the Mesh

We’re now ready to start altering the mesh by adding a second UV channel. Thanks to the IMeshModifier interface, we can modify meshes of UI components such as Text.

As long as your Text component does not have any cosmetic components such as Outline attached, the modification is fairly simple: you can safely split the vertex array into groups of four vertices, one group per character of your text.

Here is our code for this modification:

ProgressiveText with IMeshModifier
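The component above was embedded as a gist in the original post. A minimal sketch of the idea looks like this, assuming the component sits on the same GameObject as the Text component and stores each character’s normalized index in UV1.x:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the mesh modification: each character owns a quad of 4 vertices,
// so we write the character's normalized index (0..1) into UV1 of those
// vertices. A custom shader can then compare this value to a threshold.
[RequireComponent(typeof(Text))]
public class ProgressiveText : MonoBehaviour, IMeshModifier
{
    // Legacy overload of the interface, not used by modern Unity UI.
    public void ModifyMesh(Mesh mesh) { }

    public void ModifyMesh(VertexHelper vh)
    {
        int vertexCount = vh.currentVertCount;
        int characterCount = vertexCount / 4; // One quad (4 vertices) per character
        if (characterCount == 0)
            return;

        UIVertex vertex = new UIVertex();
        for (int i = 0; i < vertexCount; i++)
        {
            vh.PopulateUIVertex(ref vertex, i);
            // Normalized index of the character this vertex belongs to
            float normalizedIndex = (i / 4) / (float)characterCount;
            vertex.uv1 = new Vector2(normalizedIndex, 0f);
            vh.SetUIVertex(vertex, i);
        }
    }
}
```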

Assigning a different shade of red to each character index with a simple test shader makes the process easier to visualize.

Test representation of indices in red

The Shader Magic

Now that everything is set up, we can put the finishing touch on our system.

Our shader must be supported by the Text component. In order to achieve perfect compatibility, what better base than the Text component’s very own shader?

Nothing! So, we went looking for the UI/Default shader’s source code. All built-in shaders are available as a separate download for each version of Unity in the download archive.

The only thing we had to do was add a new property to the shader representing the display threshold. Every character with an index below this threshold would be displayed; every character with an index above it would not. Since we use a UV channel as the source of this information, the threshold is a value between 0 and 1. We called it Character Phase.

Here is the resulting code:

ProgressiveText’s shader source code
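The shader was also embedded as a gist. The sketch below is a deliberately simplified variation on UI/Default (stencil, clipping and masking support omitted) that shows where the Character Phase test fits; treat it as an illustration rather than the exact shader we shipped:

```shaderlab
// Simplified sketch based on Unity's built-in UI/Default shader.
// _CharacterPhase hides every character whose index (stored in UV1.x)
// is above the threshold.
Shader "UI/ProgressiveText"
{
    Properties
    {
        [PerRendererData] _MainTex ("Font Texture", 2D) = "white" {}
        _Color ("Tint", Color) = (1,1,1,1)
        _CharacterPhase ("Character Phase", Range(0, 1)) = 1
    }

    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" "IgnoreProjector"="True" }

        Cull Off
        Lighting Off
        ZWrite Off
        ZTest [unity_GUIZTestMode]
        Blend SrcAlpha OneMinusSrcAlpha

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata_t
            {
                float4 vertex : POSITION;
                float4 color : COLOR;
                float2 texcoord : TEXCOORD0;
                float2 charIndex : TEXCOORD1; // Written by the ProgressiveText component
            };

            struct v2f
            {
                float4 vertex : SV_POSITION;
                fixed4 color : COLOR;
                float2 texcoord : TEXCOORD0;
                float charIndex : TEXCOORD1;
            };

            sampler2D _MainTex;
            fixed4 _Color;
            float _CharacterPhase;

            v2f vert(appdata_t v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.texcoord = v.texcoord;
                o.color = v.color * _Color;
                o.charIndex = v.charIndex.x;
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                fixed4 col = i.color;
                col.a *= tex2D(_MainTex, i.texcoord).a;
                // step() returns 1 when the character index is below the threshold
                col.a *= step(i.charIndex, _CharacterPhase);
                return col;
            }
            ENDCG
        }
    }
}
```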

The final step is to adapt our custom component script to support this new property, through the use of the IMaterialModifier interface.
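As a sketch of that final step (with illustrative names again), the component can implement IMaterialModifier, clone the base material so the shared asset is left untouched, and forward the Character Phase value to the shader:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the material-related half of the ProgressiveText component.
// GetModifiedMaterial() is called by the Graphic when it resolves the
// material actually used for rendering.
public class ProgressiveText : MonoBehaviour, IMeshModifier, IMaterialModifier
{
    private static readonly int CharacterPhaseId = Shader.PropertyToID("_CharacterPhase");

    [SerializeField, Range(0f, 1f)]
    private float characterPhase = 1f;

    private Material materialInstance;

    public float CharacterPhase
    {
        get { return characterPhase; }
        set
        {
            characterPhase = Mathf.Clamp01(value);
            if (materialInstance != null)
                materialInstance.SetFloat(CharacterPhaseId, characterPhase);
        }
    }

    public Material GetModifiedMaterial(Material baseMaterial)
    {
        if (materialInstance == null)
            materialInstance = new Material(baseMaterial);
        materialInstance.SetFloat(CharacterPhaseId, characterPhase);
        return materialInstance;
    }

    // The ModifyMesh() overloads from the previous sketch go here unchanged.
    public void ModifyMesh(Mesh mesh) { }
    public void ModifyMesh(VertexHelper vh) { /* see the IMeshModifier sketch above */ }
}
```

Animating the typewriter effect then boils down to tweening CharacterPhase from 0 to 1, for instance with DOTween’s DOVirtual.Float(), without ever touching the text property again.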

And we’re done!

In-game capture of the result

Embracing Constraints

This solution is by no means new. However, if the game had been deployed on a more powerful platform (such as desktop or native mobile), we would never have had to look for alternative solutions, solutions that proved highly efficient in terms of memory and CPU consumption, and consequently more energy efficient as well.

So embrace constraints! They are a very good way to explore new ground, and they ultimately make you a better developer.


Christophe SAUVEUR

French Video Game Lead Developer @ Alt Shift. I experiment a lot. I share what I discover. Personal website: https://chsxf.dev