Real-time Embroidery Rendering for Mobile Games

Esben Sloth
LykkeStudios
Jun 6, 2023 · 9 min read

The Fabric of Interactive Graphics

Our game, stitch., seamlessly blends artistry and technology, offering a world of digital embroidery.

As developers, we constantly find ourselves standing at the crossroads of creativity and technology, tasked with the challenge of turning visions into reality. With the evolution of digital art, one particular challenge has woven itself into our journey — the intricate art of embroidery.

A stunning embroidery artwork featuring an adorable penguin, showcasing the intricate stitchwork and attention to detail that can be achieved with the art of embroidery.

Embroidery, a timeless and rich art form, has always been an eye-catcher. With its detailed stitches and vibrant threads, it embodies a level of sophistication that transforms fabric into a canvas teeming with stories. The combination of these vivid textures and intricate details is no small feat to recreate digitally, especially when targeting the resource-limited environment of mobile devices.

The penguin artwork rendered within the stitch. game, showcasing the dynamic interplay of colors, shapes and textures that bring the embroidered character to life on the screen.

In our recent iOS game, stitch., we pushed the boundaries of interactive digital art by marrying the artistry of embroidery with the mind-bending puzzles of shikaku.

In this article, we unravel the threads of our journey. We discuss the novel procedural rendering techniques that allowed us to bring embroidery to life, all within the framework of Unity’s versatile 3D engine.

We hope this deep dive not only sheds light on the intricate process behind our real-time embroidery rendering system but also inspires fellow developers to push the boundaries of what’s possible in mobile gaming. After all, the canvas of game development is as broad and diverse as the fabric we choose to weave our stories on.

The Grid: Enabling Artistic Freedom

Our gameplay draws inspiration from the newspaper Shikaku puzzle, which traditionally revolves around a grid structure. A rigid grid, however, is at odds with artwork that freely expresses diverse styles of embroidery. To resolve this, we subdivide the grid and give its points freedom of position, empowering artists to explore a myriad of shapes and forms.

A wireframe of the grid created in Blender, forming the foundation for the embroidery.

For the creation of our grid artwork, we employ the powerful tools provided by Blender. The process involves generating a UV-mapped mesh and a corresponding pixel bitmap. During import into Unity, we preprocess the mesh, ensuring its UV coordinates align with a regular grid.

The rendering of stitch. is embodied within an embroidery hoop.

While the edges of the artwork retain their freedom of movement, it is imperative to align the overall structure with a regular grid. Once the grid is preprocessed, the original meshes are no longer necessary and are excluded from our builds.

Each cell’s region index attribute corresponds to an index in the RegionBuffer, as indicated by the labeled cells.

We save point positions and cell region indices; edge connectivity is implied by the grid layout. We also keep an array of regions. To send this data to the graphics processor, we utilize compute buffers, and by drawing with Unity’s Graphics.DrawProcedural we eliminate the need for meshes or compute shaders altogether.
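To make this concrete, here is a minimal sketch (in Python, purely illustrative) of how the vertex count for the Graphics.DrawProcedural call follows from the grid: each cell is split into subdivision-squared quad primitives, and each quad is drawn as two triangles. The function name and formula are our reading of the grid structure described in this article, not code from the game.

```python
# Hypothetical helper illustrating the vertex count handed to
# Graphics.DrawProcedural; the formula reflects the grid structure
# described in this article (cells -> subdivided quads -> triangles).

def draw_procedural_vertex_count(cell_count, subdivisions):
    """Each cell splits into subdivisions^2 quads; each quad is 6 vertices."""
    return cell_count * subdivisions * subdivisions * 6

# Example: a 10x10-cell grid subdivided 2x2 renders in one draw call:
print(draw_procedural_vertex_count(100, 2))  # 2400
```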

Stitching the Grid: Understanding Vertex ID Handling

An important part of the shader code involves using the SV_VertexID semantic to determine the corresponding cell and point in our grid system. By leveraging this information, we establish connections between primitives, cells, and points, leading to improved rendering efficiency.

The shader retrieves indexed data from buffers in multiple stages, facilitating efficient data handling and ensuring a unified rendering approach for player input and hoop artwork.
// Compute buffers contain all of our grid's data
StructuredBuffer<Point> PointBuffer;
int PointBufferLength;
StructuredBuffer<Cell> CellBuffer;
int CellBufferLength;
StructuredBuffer<Region> RegionBuffer;
int RegionBufferLength;

// Defines the dimensions of a primitive object in the grid.
uint2 primitiveDimensions;
// The number of subdivisions in the grid.
uint subdivisions;

// This is the vertex shader which calculates the vertex data for each vertex of a primitive.
// Each primitive corresponds to a cell in the grid.
Varyings vert(uint vertexID : SV_VertexID)
{
    // Calculate the index of the primitive this vertex belongs to.
    uint primitiveIndex = GetPrimitiveIndex(vertexID);
    // Calculate the UV coordinates within the primitive for this vertex.
    uint2 primitiveUV = GetPrimitiveUV(vertexID);

    // Calculate the index of the point in the grid that this vertex corresponds to.
    uint pointIndex = GetPointIndex(primitiveIndex, primitiveDimensions, primitiveUV);
    // Calculate the row index of the primitive in the grid.
    uint primitiveRow = IndexToRow(primitiveIndex, primitiveDimensions);
    // Calculate the index of the cell in the grid that this primitive belongs to.
    uint cellIndex = GetCellIndex(primitiveIndex, primitiveRow, primitiveDimensions, subdivisions);

    // Load the point, cell, and region data corresponding to this vertex.
    // The min and max operations clamp the indices to the valid range of each buffer.
    Point point = PointBuffer[max(min(pointIndex, PointBufferLength - 1), 0)];
    Cell cell = CellBuffer[max(min(cellIndex, CellBufferLength - 1), 0)];
    Region region = RegionBuffer[max(min(cell.regionIndex, RegionBufferLength - 1), 0)];
    // ... use the loaded data to build the Varyings output
}

The smallest unit of our subdivided grid structure is a quadrilateral geometric primitive. Although it doesn’t have any associated data, we calculate its attributes during the process of reconstructing the grid.

Our grid consists of three foundational elements: points, subdivision primitives, and cells. Points serve as the vertices or “corners” of the grid, while cells represent the individual units or “pixels” within it. Through the subdivision process, each cell is divided into smaller primitives, allowing for precise control over the grid’s geometry. In this particular illustration, a cell is comprised of four primitives.
// Returns the index of the primitive that a vertex belongs to.
uint GetPrimitiveIndex(uint vertexIndex)
{
    return vertexIndex / 6u;
}

// Returns the UV coordinates of a vertex within its primitive.
uint2 GetPrimitiveUV(uint vertexIndex)
{
    uint primitiveVertexIndex = vertexIndex % 6u;
    // Bit-shift and bit-mask operations extract the UV coordinates.
    // 0x2Cu and 0x32u encode the sequences of U and V coordinates
    // for the six vertices of a primitive.
    uint2 primitiveUV = uint2(
        0x2Cu >> primitiveVertexIndex & 0x1u, // 0b101100
        0x32u >> primitiveVertexIndex & 0x1u  // 0b110010
    );
    return primitiveUV;
}

// Returns the index of the point in the grid that corresponds to a vertex.
uint GetPointIndex(uint primitiveIndex, uint2 dimensions, uint2 primitiveUV)
{
    // Derived from the primitive's index and UV coordinates and the grid dimensions.
    return primitiveIndex + primitiveIndex / dimensions.x + primitiveUV.x + primitiveUV.y * (dimensions.x + 1u);
}

// Returns the row index of a primitive in the grid.
uint IndexToRow(uint index, uint2 dimensions)
{
    return index / dimensions.x;
}

// Returns the index of the cell in the grid that a primitive belongs to.
uint GetCellIndex(uint primitiveIndex, uint primitiveRow, uint2 dimensions, uint subdivisions)
{
    // Derived from the primitive's index and row, the grid dimensions, and the subdivision count.
    uint cDiv = primitiveIndex / subdivisions;
    uint cRow = primitiveIndex / (dimensions.x * subdivisions);
    uint cellIndex = cDiv - (primitiveRow - cRow) * (dimensions.x / subdivisions);
    return cellIndex;
}
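As a sanity check on the magic constants above, this small Python snippet (illustrative only) decodes the two bit patterns and lists the six corner coordinates they produce:

```python
# Decode the 0x2C / 0x32 bit patterns used by GetPrimitiveUV:
# bit i of 0x2C (0b101100) is the U coordinate of vertex i,
# bit i of 0x32 (0b110010) is its V coordinate.

def primitive_uv(vertex_index):
    i = vertex_index % 6
    return ((0x2C >> i) & 1, (0x32 >> i) & 1)

corners = [primitive_uv(i) for i in range(6)]
print(corners)
# [(0, 0), (0, 1), (1, 0), (1, 0), (0, 1), (1, 1)]
# Two triangles, (0,0)-(0,1)-(1,0) and (1,0)-(0,1)-(1,1),
# covering the unit quad with a consistent winding order.
```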

We utilize the vertex ID to calculate the corresponding primitive, which in turn helps us determine the associated point and cell index. By accessing the cell buffer data, we retrieve the region index for further processing.

Here, the process of forming triangles from vertices and the concept of vertex winding is depicted. Each triangle represents a basic unit of our 3D geometry, formed by three vertices. The winding order of these vertices — clockwise or counterclockwise — is crucial in determining the triangle’s orientation and visibility.

Recognizing the Fabric: Edge Detection and Region Separation

We employ edge detection on our grid to effectively separate the embroidered regions. This technique plays a vital role in analyzing the grid’s structure and accurately distinguishing the boundaries between different embroidered regions.

This illustration visualizes the Von Neumann and Moore neighborhoods. A Von Neumann neighborhood includes cells in four directions: up, down, left, and right of a central cell, while a Moore neighborhood expands this to include diagonal cells as well.

By understanding the neighborhoods within our grid, we can explore the concept of edge detection. This process enables us to identify the boundaries that separate regions in the grid, revealing distinct edges between cells. It is akin to tracing the contours of a fabric, gradually revealing the distinct regions within the overall structure of the grid.

We identify boundaries between regions during the vertex stage. We store each of the eight possible detected edges as a boolean flag in a bitmask, which is passed to the fragment shader without interpolation.

We employ edge detection in both the vertex shader for rendering region separation and during preprocessing to assign the region index for each cell.
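The same idea can be sketched outside the shader. The Python snippet below (an illustration, not the shipped code; the row-major layout and bit order are assumptions) builds the eight-bit edge mask by comparing a cell's region index with its Moore neighbors:

```python
# Hedged sketch of the edge-detection bitmask: bit i is set when the
# i-th Moore neighbor lies in a different region. Cells outside the
# grid count as a different region, so borders always produce edges.

MOORE_OFFSETS = [(-1, -1), (0, -1), (1, -1), (-1, 0),
                 (1, 0), (-1, 1), (0, 1), (1, 1)]

def edge_mask(regions, width, height, x, y):
    mask = 0
    center = regions[y * width + x]
    for i, (dx, dy) in enumerate(MOORE_OFFSETS):
        nx, ny = x + dx, y + dy
        outside = not (0 <= nx < width and 0 <= ny < height)
        if outside or regions[ny * width + nx] != center:
            mask |= 1 << i
    return mask

# A 3x3 grid with a single-cell region in the middle:
regions = [0, 0, 0,
           0, 1, 0,
           0, 0, 0]
print(bin(edge_mask(regions, 3, 3, 1, 1)))  # 0b11111111 -- edges on all sides
```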

This multi-panel sequence showcases our implementation of the flood-fill algorithm, which traverses the grid in scanlines, queuing unvisited neighboring cells of identical color.

To assign region indices to cells, we utilize a flood-fill algorithm that detects identical colors. Similar to how an artist fills colors within the outlines of a sketch, our algorithm fills the detected regions, effectively separating them from one another.
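In spirit, the assignment looks like the Python sketch below. It uses a compact stack-based fill rather than the scanline variant shown in the illustration, but it hands out region indices the same way:

```python
# Stack-based flood fill over a row-major color grid: every connected
# run of identically colored cells receives one region index.

def assign_regions(colors, width, height):
    regions = [-1] * (width * height)
    next_region = 0
    for start in range(width * height):
        if regions[start] != -1:
            continue  # already assigned by an earlier fill
        color = colors[start]
        stack = [start]
        regions[start] = next_region
        while stack:
            idx = stack.pop()
            x, y = idx % width, idx // width
            # Von Neumann neighbors: left, right, up, down
            for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                if 0 <= nx < width and 0 <= ny < height:
                    n = ny * width + nx
                    if regions[n] == -1 and colors[n] == color:
                        regions[n] = next_region
                        stack.append(n)
        next_region += 1
    return regions

colors = ["red", "red", "blue",
          "red", "blue", "blue"]
print(assign_regions(colors, 3, 2))  # [0, 0, 1, 0, 1, 1]
```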

Stitching the Surface: Utilizing Textures and Depth Buffer

To achieve the rendering of diverse embroidery styles efficiently, we employ texture arrays. These arrays provide us with convenient global access to all the textures within our shaders, allowing seamless integration of different embroidery styles in a single draw call.

Depth maps representing the height of stitched patterns in embroidery. The white areas indicate peaks, while the black areas indicate valleys, contributing to the realistic three-dimensional appearance of the embroidery.

To capture various stitch styles, we crafted multiple depth maps using tools like Blender and Substance Designer. These depth maps allow us to render intricate embroidery.

Texture array with different embroidery styles, demonstrating the wide variety of textures available, which are indexed and retrieved through our texture array.

Using our texture array, we create a wide range of options for artists by indexing and retrieving textures. In our fragment shader, we utilize the shared texture array to gain global access to these textures. They serve as the foundation for crafting intricate stitch surfaces on top of the generated geometry.

Our use of depth buffers in rendering stitch intersections, depicting the role of the SV_Depth output within the fragment shading process and adding depth and dimension to our game’s intricate textures.

Depth buffers are essential in our rendering pipeline, as they ensure accurate object occlusion. By writing to the SV_Depth semantic in our shader, we harness the capabilities of depth buffers. These buffers allow us to combine depth fields from our texture array with surface data like normals, enabling us to seamlessly weave these crucial elements together.

We composite a sampled texture surface with a procedural beveled surface to produce a surface with normals and depth.

By combining sampled depth maps from textures with a beveled surface generated through edge detection analysis of the mesh, we achieve a visually captivating result in our embroidery rendering. The sampled depth maps add realistic depth and shading, while the beveled surface ensures precise separation between embroidery regions. This integration enables us to create artwork with a sense of three-dimensionality and depth, enhancing the immersive and visually stunning scenes that enrich the interactive experience for players.
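One common way to express such a composite, shown here as a hedged Python sketch rather than our shader code, is to sum the two height sources and recover a normal from the combined field with central differences (the helper names and the additive blend are assumptions):

```python
import math

# Illustrative heightfield composite: a sampled stitch depth plus a
# procedural bevel, with a surface normal derived from the result.

def composite_height(stitch_depth, bevel, x, y):
    return stitch_depth(x, y) + bevel(x, y)

def normal_from_height(height, x, y, eps=0.01, strength=1.0):
    """Approximate the normal of a heightfield via central differences."""
    dx = (height(x + eps, y) - height(x - eps, y)) / (2 * eps)
    dy = (height(x, y + eps) - height(x, y - eps)) / (2 * eps)
    nx, ny, nz = -dx * strength, -dy * strength, 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

flat = lambda x, y: 0.5
normal_from_height(flat, 0.0, 0.0)  # a flat field yields the up vector (0, 0, 1)
```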

The Rendering Process: Lighting and Probes

In any graphics rendering pipeline, lighting plays a crucial role in creating a sense of realism. When it comes to real-time embroidery rendering in games, this becomes even more significant, considering the intricacies of the embroidery fabric’s surface and how light interacts with it. In this chapter, we are going to discuss the core shader library that we have developed specifically for this purpose.

The rendered embroidery artwork, showcasing the seamless integration of textures, shading, and depth.

The Embroidery_Lighting shader is a key component of our game’s visual realism, enhancing the embroidery’s surface with depth and vibrancy. It utilizes various utility libraries to handle complex calculations and lighting dynamics. The shader considers properties such as color, surface depth, curvature, roughness, and material finish to calculate the final color of each embroidery pixel. This calculation takes into account the main light source and environmental light, resulting in a visually captivating embroidery surface that realistically responds to lighting. Ultimately, this enhances the overall gaming experience.
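At its core, that combination resembles the heavily simplified Python sketch below (the function name and exact blend are assumptions; the real Embroidery_Lighting shader accounts for far more, including curvature and material finish):

```python
# Minimal diffuse-plus-ambient combine, standing in for the full
# lighting calculation described above.

def shade(base_color, normal, light_dir, light_color, ambient):
    # Lambertian term from the main light
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(c * (n_dot_l * lc + a)
                 for c, lc, a in zip(base_color, light_color, ambient))

# Main light shining straight along the surface normal:
print(shade(base_color=(1.0, 0.5, 0.25),
            normal=(0.0, 0.0, 1.0),
            light_dir=(0.0, 0.0, 1.0),
            light_color=(1.0, 1.0, 1.0),
            ambient=(0.25, 0.25, 0.25)))  # (1.25, 0.625, 0.3125)
```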

Lighting and Reflection Probes: Demonstrating our approach to lighting, this 3D diagram showcases our use of Image-based lighting (IBL), delivering realistic lighting effects in the game world.

Illumination, we discover, is not just about casting light — it is about painting a realistic picture of how light interacts with our embroidered landscapes. For this reason, we chose to base the majority of our shading on image-based lighting using a reflection probe. We analyze our mesh to create a beveled surface around moves and regions, then we combine it with sampled surface data stored in our texture array.

Sampling image-based lighting enables us to achieve realistic reflections and lighting effects, enhancing the overall visual fidelity and adding a touch of realism to the embroidered artwork.

This lighting shader gives life to the embroidery in our game. It reflects the light off the tiny stitches, the curvature of the threads, the depth of the patterns, and the metallic or matte finish of the surfaces. By capturing the intricate interactions between light and embroidery, this shader library makes a significant contribution to creating a realistic and visually stunning gaming experience.

Artistic Tapestry: A Journey of Embroidery Rendering

A final screenshot captured within stitch. brought to life in real-time rendering.

In conclusion, our foray into the intricate art of embroidery rendering within the iOS game stitch. has been a captivating exploration at the crossroads of creativity and technology. By combining the power of Unity, shader coding, and procedural rendering techniques, we have unlocked a world of possibilities in translating visions into digital reality.

With each stitch meticulously rendered, we invite you to embark on your own embroidery adventure, where the beauty of the virtual world intertwines with the intricacy of the art form. Let your creativity unfold and stitch together a masterpiece within the realms of stitch.
