Texture Atlasing: An Inside Look At Optimizing 3D Worlds!
Combine those textures and get a performance boost
When developing mobile applications in Unity, ensuring everything is optimized is always crucial. Maximizing frame rates leads us to focus on optimizing scripts, baking lights, modifying objects, and more.
When we bring our mobile application into virtual reality, optimization becomes even more crucial. While there are many areas we could focus on, we’re going to focus on just one: the texture atlas.
Why A Texture Atlas?
The texture atlas is a way for us to logically group all of our game’s images or textures into one file (also called a sprite sheet). When objects are created in modeling programs like Blender, the coordinates of each image can be mapped onto the objects we want to use. This makes rendering a lot more efficient in Unity, where we can have many objects sharing the same material. When we follow this up with making our objects static, Unity can use static batching, where it combines “(not moving) GameObjects into big Meshes, and renders them in a faster way.”
Making our Map
In order to create and use a texture atlas, we need to be able to create our own objects for Unity as well as place our images into one file. For placing the images into one graphic, we can use Gimp, a free image-editing program. For mapping our atlas onto an object, we can use Blender, a 3D modeling program.
Adding the Images
For the sake of this demo, we are just looking at a few objects below. Thinking of a few materials we want to use, we can grab them online and save them to our computer.
Once we have the images saved, we can create a 1024x1024 image in Gimp and drag and drop each image into our texture atlas. The atlas can be other sizes too, but it’s best to stick to power-of-two dimensions (512, 1024, 2048) that aren’t massive, since GPUs handle power-of-two textures most efficiently.
When gathering images for a texture atlas, it makes sense to logically group our images. If we have a building, we could put all those textures (bricks, walls, floors, etc.) in one texture atlas. For our characters, we could do the same. This keeps things organized, but we’re not being picky in this case. Let’s just throw them all in here.
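To make the layout concrete, here is a minimal sketch of the bookkeeping an atlas implies: each source image ends up occupying a rectangle of the atlas, which we can describe in normalized coordinates. The 2x2 grid layout and the texture names below are hypothetical, just to illustrate the idea.

```python
# Compute the normalized (0..1) rectangle each texture occupies when packed
# into a simple 2x2 grid inside a 1024x1024 atlas. Layout and names are
# illustrative, not tied to any particular tool.

ATLAS_SIZE = 1024
TILE = 512  # each source image scaled to 512x512 before placement

def tile_uv_rect(col, row):
    """Return (u_min, v_min, u_max, v_max) for a grid cell, normalized to 0..1."""
    u_min = col * TILE / ATLAS_SIZE
    v_min = row * TILE / ATLAS_SIZE
    return (u_min, v_min, u_min + TILE / ATLAS_SIZE, v_min + TILE / ATLAS_SIZE)

layout = {
    "light_wood": tile_uv_rect(0, 0),
    "brick":      tile_uv_rect(1, 0),
    "stone":      tile_uv_rect(0, 1),
    "metal":      tile_uv_rect(1, 1),
}

print(layout["light_wood"])  # (0.0, 0.0, 0.5, 0.5)
```

Keeping a mental (or written) map like this makes the later unwrapping step much easier, since we know exactly which region of the atlas each material lives in.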
One thing to note is these are simple images. There are programs out there that can generate elaborate textures for 3D models. However, in our case we can still get a nice texture on our shapes without it.
Creating and Mapping Our Objects
With a texture atlas completed, the next step is to open up Blender. We can add primitive shapes to work with by pressing
Shift + A and selecting Mesh > Cube.
The next step is to split the window so we can see our object alongside the UV/Image Editor. We can click and drag the top-right corner of the window to the left to create our two views.
Then at the bottom, select the Editor Type icon to switch to
UV/Image Editor view.
With the views now set up, we can map things out. When Unity assigns materials to game objects, it uses UV coordinates as reference in order to properly map out each texture. UV is much like the X/Y coordinate system, normalized so the texture spans 0 to 1 on each axis. If undefined, our entire texture atlas would be mapped to the object. As we’ll soon find out, by defining the UV coordinates of the model against the atlas texture, Unity will know what part of the object matches up to the texture atlas.
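As a sketch of what those normalized coordinates mean in practice: a UV pair addresses a texel in the image regardless of its pixel dimensions. The function below assumes a 1024x1024 atlas, but the same mapping holds for any size.

```python
# A UV coordinate is resolution-independent: (0, 0) is one corner of the
# texture, (1, 1) the opposite corner. Sampling maps it back to a texel.

def uv_to_pixel(u, v, width=1024, height=1024):
    """Map a normalized UV coordinate onto a texel in a width x height texture."""
    x = min(int(u * width), width - 1)   # clamp so u = 1.0 stays in range
    y = min(int(v * height), height - 1)
    return x, y

print(uv_to_pixel(0.5, 0.25))  # (512, 256)
```

This is why we can resize or swap the atlas later without redoing the unwrap, as long as each texture stays in the same relative region.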
Selecting the left window, we can hit
tab and switch to
Edit Mode. This lets us select all or individual parts of the object. By hitting
a we can select everything.
Unwrapping Our Object
With all of our object selected, we then have to unwrap it. This takes our 3D shape and
unwraps it onto a 2D plane so we can define what part of the texture to map it to.
Back on the 3D view of our Cube we can hit
u for unwrap and select
Unwrap. This will unwrap all of the faces on top of each other in our UV/Image Editor. If we were to select
Smart UV Project we could get all the faces mapped out next to each other. This can be helpful when unwrapping complex objects.
Next, open our texture atlas and load it up.
This will load up the image and let us see where our unwrapped image falls onto the texture. Obviously, we don’t want the whole image so we need to hit
a to select everything. Then we can hit
s to scale the selection to our chosen texture.
Once the unwrap is scaled down to our liking, we hit
g to grab it and move it to whatever texture we’re targeting. In this case, the light wood grain.
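Under the hood, the scale and grab steps are just a transform applied to every UV point in the selection. The sketch below scales about the UV origin for simplicity (Blender actually scales about the pivot point, such as the median of the selection), with illustrative values.

```python
# What s (scale) then g (grab/move) do to the unwrapped UVs, in spirit:
# every point is scaled down, then translated so the face lands over the
# chosen tile of the atlas. Values here are made up for illustration.

def scale_and_move(uvs, scale, offset):
    """Scale UVs about the origin, then translate by offset (du, dv)."""
    du, dv = offset
    return [(u * scale + du, v * scale + dv) for (u, v) in uvs]

# The default Unwrap of a cube face fills the whole 0..1 square...
face = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
# ...shrink it to a quarter of the atlas and move it over a tile at u = 0.5.
print(scale_and_move(face, 0.5, (0.5, 0.0)))
# [(0.5, 0.0), (1.0, 0.0), (1.0, 0.5), (0.5, 0.5)]
```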
Once that’s done, we can save the file and we now have an object with mapped UV coordinates, referencing this texture atlas.
Mapping Different Faces
Before jumping into Unity, it should be noted that we can select each face individually. In the case of our cube, each face could be mapped to a different texture in our atlas and it would work well. With the cube already mapped in this case, we could select one face of the cube in the 3D viewer. We need to make sure we are in Edit Mode and that face select is enabled.
Its corresponding unwrap is shown below. Going to UV/Image Editor and hitting
a to select all the points, we can then hit
g to move our unwrapped cube face to whatever texture we want to assign.
We can do this for every face or just a few to achieve more elaborate texturing. This can be useful when our models become more complex.
Before jumping into Unity, we create a couple of objects (Sphere, Cylinder, etc.) and get them mapped onto our texture atlas. Then we’re ready.
Bringing Our Objects Into Unity
Once we have objects created in Blender that have been mapped to our texture atlas, we can then drag those objects into Unity, into our project window. If we have a prefabs folder, we can throw them in there.
Usually, our objects and the material for them will be imported into Unity inside of a
Materials folder. Sometimes, though, the imported material won’t reference our atlas, and it can look like we just wasted an hour unwrapping objects. If that happens, there’s a fix. First, make sure our texture atlas is imported into Unity; we can put it in with the rest of our textures.
Then right-click in the project window and select Create>Material. With the material selected, look at the Inspector and make sure we have selected the Mobile/Unlit (Supports Lightmap) shader. Since we’re focusing on mobile, we are going to be baking all of our lights into our textures, and assigning a Mobile/Unlit shader is a great way to accomplish this.
Once we’ve assigned Mobile/Unlit, we can select the texture that will be associated with this material. This is where we click on
Select and find our texture atlas.
Now that we have a material using our texture atlas, any objects that we have mapped out in Blender nicely sync up with the texture we have designated.
Make it Static and Bake the Scene
With all of our objects set up with the same material, our next step is to take advantage of Unity’s static batching. This is quickly done by marking all of our objects as static. Assuming these aren’t animated in our app, this is a must-do.
With some spotlights added for a nice effect, we make sure these are marked as
Area (baked only). With this, our scene gets some light, and because our objects are now static, the lighting can be baked onto their textures.
To make these changes visible, we can go to
Window>Lighting>Settings and generate the lighting for our scene.
So Where’s the Static Batch?
As we mentioned in the beginning, static batching is a way to optimize our scene rendering. Unity takes static objects that share the same material and renders them as one big mesh to speed things up. Fewer draw calls in Unity mean more frames per second, and when we have 1000+ objects, we can see why this is so important.
Since we created a texture atlas for our objects, Unity can draw all of our mapped game objects in a single draw call. One way to see this in action is by using the Frame Debugger.
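The intuition can be sketched in a few lines: static objects group by material, and each group costs roughly one draw call, while dynamic objects here are counted individually. This is a simplification of Unity’s actual batching rules (which also consider vertex limits, lightmaps, and shader passes), and the object names are made up.

```python
# A simplified model of why a shared material matters for static batching:
# static objects sharing a material collapse into one batch; everything else
# costs its own draw call. Scene contents below are hypothetical.

from collections import defaultdict

def count_batches(objects):
    """objects: list of (name, material, is_static) tuples."""
    static_groups = defaultdict(list)
    dynamic_calls = 0
    for name, material, is_static in objects:
        if is_static:
            static_groups[material].append(name)
        else:
            dynamic_calls += 1  # drawn individually in this simple model
    return len(static_groups) + dynamic_calls

scene = [
    ("Cube",     "AtlasMat", True),
    ("Sphere",   "AtlasMat", True),
    ("Cylinder", "AtlasMat", True),
    ("Player",   "AtlasMat", False),
]
print(count_batches(scene))  # 2: one static batch plus the dynamic player
```

Swap in a unique material per object and the count jumps to one call per object, which is exactly the inefficient case the Frame Debugger makes visible.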
Unity’s Frame Debugger
The Frame Debugger lets us pause the game during play mode and break down how each frame is rendered. By stepping through each item in the hierarchy on the right, we can see the order of each render. Notice the
Static Batch line is highlighted and below in the game view, we see Unity is rendering all of these objects at once with one draw call. Nice, right?
Had we assigned a different material to each object, we would have increased our draw calls and forced Unity to render each object individually as seen below. Not very efficient.
Atlases, Unwraps and Batches… oh my
Generating a simple texture atlas can allow us to create a shared material in Unity across many game objects. If we’re able to create or modify our models in a 3D modeling tool like Blender, we can take more control over the mapping of these textures and make our app run more efficiently.
Even though mobile devices are becoming more powerful, we continue to push the limits of the hardware. Among the many ways we need to optimize our applications to provide the best user experience, this is just one important step to get us there. Remember the texture atlas.