Ray Marching Metaball in Unity3D

Sehyun Av Kim
May 7, 2018

Introduction

I’ve been following ray marching sketches on Shadertoy lately, and I’m amazed by the smoothness of the rendering compared with using polygons. Among the many sketches, this metaball example by Inigo Quilez caught my attention, simply because of the aesthetics of the metaballs’ organic deformation, amplified by the smooth, high-quality surface shading of ray marching.

Since Shadertoy is based only on a fragment shader, the blobs’ positions in this example are recalculated for every pixel. I wondered how much performance could be gained if all calculations except the shading were taken out of the fragment shader and done beforehand.

But above all, I wanted to give ray marching a shot by implementing it myself to understand how it works, since the technique can be applied to other useful techniques such as volume rendering, another big interest of mine these days. I figured that getting used to ray marching would be a good start.

Inigo Quilez’s Metaball on Shadertoy

1. Ray marching and Distance Field

Ray marching is a 3D rendering technique. Unlike ray tracing, which uses explicit geometry and analytic functions to find intersections, ray marching casts a ray from the camera through each pixel and marches along its direction until it gets close enough to an intersection.

A distance field represents the closest distance between the marching ray’s current point and the surface of the scene, which is constructed from mathematical formulas called distance functions. This value is used as the size of the marching step, which drastically reduces the number of marching steps and also lowers overshooting compared with marching at a constant step size.

left : marching with constant size step / right : marching with distance field

Ray marching, more specifically ray marching with distance fields, has been popularized and advanced by Inigo Quilez, who is the author of this metaball example on Shadertoy and also a co-creator of Shadertoy. It is capable of creating high-quality visuals with significantly smaller executables than other techniques; for example, the image below, a screenshot of one of his real-time demos, comes from a 4-kilobyte executable. To see more of his works, visit his demo archive.

Slisesix by Inigo Quilez from Rendering Worlds With Two Triangles in 2008

2. Metaball

If I borrow words from Wikipedia,

Metaballs are, in computer graphics, organic-looking n-dimensional objects. The technique for rendering metaballs was invented by Jim Blinn in the early 1980s.

Or, as others put it,

A metaball is an isosurface in 3D space (or an isoline in 2D space).

Sometimes words alone aren’t quite enough, are they? But I believe you have seen metaballs at least once in animations, games, or motion graphics, because they have been around for a long time and are a fairly popular effect in computer graphics.

Maybe the closest example you can find in real life is mercury.

image from Wikipedia

They act like mercury when 2 or more metaballs get close enough, as you can see in the video below. This is also the demo I’ll break down in this post.


Process

1. Breaking down the example

a. Setup rays

black dot : ray origin / red dots : pixels on image plane / red line : ray / blue dot : center of image plane

Since ray marching needs to cast rays from the camera through each pixel, we need to know the Ray Origin, which is where your camera is, and the Ray Direction, which is the direction from the ray origin toward each pixel on the image plane. We’ll get into the details by examining Inigo’s example.

Here is the part where the ray direction is set up in Inigo’s example:
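The original GLSL snippet isn’t reproduced here, so below is a minimal HLSL-style sketch of its first part (the ray origin and the pixel coordinates). The names and constants (time, resolution, the orbit radius) are illustrative stand-ins, not values from the original:

```
// camera (ray origin): circles around the Y axis about the world root while bobbing up and down
float3 ray_origin    = float3(2.0 * cos(0.5 * time), 0.5 * sin(time), 2.0 * sin(0.5 * time));
float3 lookat_target = float3(0.0, 0.0, 0.0);

// pixel coordinates remapped from [0, resolution] to roughly [-1, 1],
// i.e. an image plane about 2 units tall with its origin at the center
float2 pixel = (2.0 * fragCoord.xy - resolution.xy) / resolution.y;
```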

Now we’ll break it down from part to part :

Note that it’s not actually a camera but just a point

Set the ray_origin (the camera) in world space. It circles around the Y axis, centered at the world root (0, 0, 0), while moving up and down.
The lookat_target will be used later.

Then we construct the image plane with the pixel coordinates fragCoord.xy, which range from the bottom left (0, 0) to the top right (window.width, window.height).

Since each pixel’s position needs to be in world-space coordinates, we scale them down to a 2-unit width and height and offset the origin to the center.

Now we get a direction vector from the camera toward the lookat_target, which we defined above, by subtracting ray_origin from lookat_target and normalizing the result. We can treat this direction vector as the forward vector of our camera. With this forward vector, we can get the left and up vectors via cross products with perpendicular vectors. With these 3 axes, we can construct the lookat rotation matrix for the camera. We use it to align the image plane so that it faces the ray origin, by transforming each pixel with this matrix. The code follows:
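A sketch of that lookat construction (HLSL-style; the world-up vector (0, 1, 0) used for the cross products is an assumption):

```
float3 cam_forward = normalize(lookat_target - ray_origin);                  // forward axis
float3 cam_left    = normalize(cross(float3(0.0, 1.0, 0.0), cam_forward));   // world-up x forward
float3 cam_up      = normalize(cross(cam_forward, cam_left));                // completes the basis
float3x3 lookat    = float3x3(cam_left, cam_up, cam_forward);                // rows are the camera axes
```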

axis on the center : visualizes the orientation of lookat matrix

Since the image plane needs to live in the camera’s space in order to construct the field of view, the following code does that:
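A sketch of that step, with the numbered comments matching the breakdown below:

```
float3 image_plane = float3(pixel.x, pixel.y, 2.0);        // #1 : place the plane 2 units along Z
image_plane        = mul(image_plane, lookat);             // #2 : rotate it with the lookat matrix
image_plane        = ray_origin + image_plane;             // #3 : move it along with the ray origin
float3 ray_dir     = normalize(image_plane - ray_origin);  // #4 : the ray direction for this pixel
```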

This part needs a bit more breakdown. I numbered the steps in the code above and will explain them one by one.

  • #1 : By giving the image plane a Z value, the plane is now placed in world space, 2 units away from the root (0, 0, 0) along the Z axis, at (0, 0, 2).
  • #2 : By multiplying each pixel’s position on the image plane by the lookat rotation matrix, the plane now always faces the ray origin.
  • #3 : By adding the position of the ray origin, the plane now moves along with the ray origin while facing it from 2 units away. We can treat this as the world position of the pixels on the image plane.
  • #4 : Since image_plane = ray_origin + image_plane, we can confirm that normalize(pixels_in_world_space - ray_origin) is identical to the original code normalize( pixel.x*cam_left + pixel.y*cam_up + 2.0*cam_forward ).

b. March rays
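The marching function itself isn’t reproduced here; the following is a minimal sphere-tracing sketch of the same idea. The iteration count and maximum distance are illustrative; precis corresponds to the precision = 0.01 value quoted later in the post, and distance_field is defined in the next section:

```
float ray_march(float3 ray_origin, float3 ray_dir)
{
    const float precis   = 0.01;   // "close enough" threshold (precision in the post)
    const float max_dist = 20.0;   // give up once the ray has travelled this far
    float t = 0.0;                 // total distance travelled along the ray

    for (int i = 0; i < 64; i++)
    {
        float d = distance_field(ray_origin + t * ray_dir);  // closest distance to the scene
        if (d < precis || t > max_dist) break;                // hit the surface, or left the scene
        t += d;                                               // safe step: march exactly that far
    }
    return t;
}
```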

This function marches the ray for a number of iterations to find a point close enough to the surface of the scene. I think the code explains itself, so I just left comments in the code above to briefly explain the details.

c. Distance Field

As I briefly mentioned above, the distance field gives you the closest distance between the ray’s current position and the scene. In this example we only render the metaballs, so the distance field function is equivalent to the metaballs’ distance function, as you see in the code below. So let’s move on to the distance function of the metaballs.
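A sketch of that pass-through (metaball is my name for the distance function broken down in the next part):

```
float distance_field(float3 pos)
{
    return metaball(pos);   // the whole scene is just the metaball distance function
}
```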

*If you are curious how to construct a distance field with various distance functions, I recommend reading Modeling with Distance Functions by Inigo Quilez.

d. Render Metaball

The following is the distance function of the metaballs in this example:
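The original GLSL function isn’t reproduced here; the following HLSL-style sketch is reconstructed from the breakdown below rather than copied from the source, so names like blobs and NUM_BLOBS, and the exact constants, are assumptions:

```
// blobs[i].xyz = blob center, blobs[i].w = blob radius (assumed uniform array)
float metaball(float3 pos)
{
    float p    = 0.0;    // accumulated falloff ("density")
    float dmin = 1e20;   // closest distance to any bounding sphere
    float h    = 1.0;    // weight that scales the falloff back to world units
    float m    = 0.0;    // counts the bounding spheres the point is inside of

    for (int i = 0; i < NUM_BLOBS; i++)
    {
        float db = length(blobs[i].xyz - pos);
        if (db < blobs[i].w)                                  // inside this blob's bounding sphere
        {
            float x = db / blobs[i].w;                        // normalized distance, 0..1
            p += 1.0 - x*x*x*(x*(x*6.0 - 15.0) + 10.0);       // quintic falloff (see d-2)
            m += 1.0;
            h  = max(h, 0.5333 * blobs[i].w);                 // see the 8/15 note in d-2
        }
        else
        {
            dmin = min(dmin, db - blobs[i].w);                // distance to the bounding sphere
        }
    }

    float d = dmin + 0.1;          // bounding sphere distance plus a small push (see d-1)
    if (m > 0.5)                   // inside at least one bounding sphere (see d-2)
    {
        float th = 0.2;            // density threshold
        d = h * (th - p);
    }
    return d;
}
```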

*I should mention that the metaball breakdown is based on my quick research while writing this post, so I had to rely on assumptions in many parts. Please keep that in mind, and do not hesitate to point out anything I got wrong.

So… let me take a deep breath... and here we go.

We can divide this metaball function into 2 big parts:

  • Bounding sphere
  • Metaball field

d-1. Bounding sphere

For performance reasons, we skip the metaball field calculation until the ray hits one of the bounding spheres around the blobs. The radius of each bounding sphere is defined by the blob’s radius blobs[i].w.

Let me give more detail on the reason for adding 0.1 to the closest distance. For simplicity and easier understanding, let’s assume we only have 2 blobs in 2D space. See the image below:

Let’s say the ray is now almost touching the surface of blob P1’s bounding sphere. At the next iteration it will most likely get even closer, and the distance may become smaller than the precision = 0.01 value in the ray marching function. That means the ray marching function would stop and return the current travel distance, which is the closest distance to the bounding sphere (not to a metaball). But we only need the bounding spheres as an entry point for the ray into the metaball field (let’s just say the inside of the blob), so we add just enough to fool the marching function and let the ray jump inside the bounding sphere. If we zoom into the image:

The darker grey curve is the edge of the bounding sphere, the red dot is the current ray position, the blue line is the closest distance between the ray and the surface, the green line is that same closest distance applied along the ray direction, and the red line is the little push d = dmin + 0.1 that lets the next step go inside the bounding sphere and prevents the ray marching loop from terminating. Again, without the little push, the ray marching function would just return the closest distance to the bounding sphere. Note that as soon as we jump inside the bounding sphere, we don’t need this small push anymore. I’ll point it out again in the next part.

So, we’re in the blob. Let’s metaball them.

d-2. Metaball field

Note that the bounding-sphere lines (the dmin part) are not used from this point on, because we’re now inside a bounding sphere. Like I mentioned, the little push is not used anymore either, as it is overwritten by this line: d = h * (th - p).

Before I directly explain the code, let’s start with the case when the ray is inside only one blob :

For simplicity, I set the ray’s direction aligned with the blob’s center, so we don’t need to re-project the distance onto the ray’s direction.

Since we’re inside the blob, we know db is less than the blob’s radius, so we can normalize it with x = db/blobs[i].w. By doing this, we can describe the current distance between the ray and the blob’s center on a relative 0–1 scale. So if a ray is halfway between the center and the surface of a blob, we can say it is 0.5 units away from that blob’s center in the blob’s own coordinates:

X = DISTANCE_BETWEEN_RAY_AND_BLOB_CENTER / BLOB_RADIUS

In graph form :

X = DISTANCE_BETWEEN_RAY_AND_BLOB_CENTER / BLOB_RADIUS

By using this normalized distance, we calculate the metaball’s falloff with this inverted polynomial equation:

Note that we invert it because we need to march the ray into the blob, not the opposite way, so the distance should be measured from the surface:

P = 1.0 - X*X*X*(X*(X*6.0 - 15.0) + 10.0)

Now the distance is measured from the surface toward the center. Even though it’s on the same scale, its gradient is smoothed by the equation, as you can see in the graph:

X = DISTANCE_BETWEEN_RAY_AND_BLOB_CENTER / BLOB_RADIUS

So now we have a smooth curve for the metaball’s falloff. Next, we decide the metaballs’ volume with a threshold and a weight, using the following code:
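These are the relevant lines from the sketch above; th = 0.2 is the threshold value quoted in the post:

```
float th = 0.2;       // density threshold that defines the metaball volume
d = h * (th - p);     // goes negative once the summed falloff p exceeds the threshold
```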

The h is the weight for the metaball’s falloff; it also scales the normalized distance back to world coordinates. It is set in the following code:
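In the sketch above this happens inside the blob loop; the 0.5333 constant is my assumption, matching the 8/15 value in the comment quoted below:

```
h = max(h, 0.5333 * blobs[i].w);   // weight ≈ (8/15) * blob radius
```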

To be honest, I still don’t fully understand why it has to be multiplied by this number. Without the multiplication, I saw some artifacts on the metaball’s surface. My best guess at the moment is that it calibrates the metaball’s falloff from the polynomial equation, based on his comment in the original code. I’ll follow up on this as soon as I have a better understanding. I leave his original comment here for reference.

The quintic polynomial p(x) = 6x^5 - 15x^4 + 10x^3 has zero first and second derivatives in its corners. The maximum slope, where p''(x) = 0, happens in the middle x = 1/2, and its value is p'(1/2) = 15/8. Therefore the minimum distance to a metaball (in metaball canonical coordinates) is at least 8/15 = 0.533333
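As a quick check of the numbers in that comment (my own verification, not part of the original post):

p(x) = 6x^5 - 15x^4 + 10x^3
p'(x) = 30x^4 - 60x^3 + 30x^2 = 30x^2(1 - x)^2
p''(x) = 60x(2x - 1)(x - 1) = 0 at x = 1/2, and p'(1/2) = 30 * (1/4) * (1/4) = 15/8

Its reciprocal, 8/15 ≈ 0.5333, is the number used to scale h above: for a single blob, the falloff changes by at most 15/8 per unit of normalized distance, so the remaining distance to the isosurface is at least (th - p) * 8/15 in the blob’s own coordinates.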

Back to deciding the metaball’s volume. As we can see in the code, if the falloff is bigger than the threshold th = 0.2, the distance becomes negative. That means the ray is inside the metaball volume, and the ray marching function will return the current travel distance, since the distance is now less than precision = 0.01.

If we apply this to the case where the ray is in the intersection of 2 blobs’ bounding spheres, the total p value is the sum of p1 and p2:

In 3D space, the distance map of the metaballs from this process looks like this:

e. Surface geometry and normal

Since we know the distance from the ray origin to the scene surface, we can construct the virtual surface geometry from that information:

surface_geometry = ray_origin + distance_field * ray_direction

Now we need the surface normal in order to apply lighting techniques. With polygons, we can compute the surface normal by interpolating the vertex normals of a triangle, but here we don’t have triangles; we only know the position (surface_geometry) of the surface hit by the current ray. So we estimate the surface normal by taking the gradient of the distance field around that point: for each axis, we sample the field at two points, one slightly behind and one slightly in front of the surface point, with a very small step size. The resulting gradient is (almost) perpendicular to the surface, so we can use it as the surface normal. In code:
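A hedged sketch of that numerical (central-difference) normal, reusing the distance_field function from above:

```
float3 calc_normal(float3 pos)
{
    const float2 e = float2(0.001, 0.0);   // very small step size
    return normalize(float3(
        distance_field(pos + e.xyy) - distance_field(pos - e.xyy),    // gradient along X
        distance_field(pos + e.yxy) - distance_field(pos - e.yxy),    // gradient along Y
        distance_field(pos + e.yyx) - distance_field(pos - e.yyx)));  // gradient along Z
}
```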

However, in this example we have a for loop in the metaball’s distance function that iterates over the blobs array, so this numerical approach is a bit inefficient: we would need to run that loop 6 more times. Instead, Inigo uses an analytic approach based on the polynomial:
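The original analytic code isn’t reproduced here; as a hedged sketch of the general idea (my reconstruction, not Inigo’s code), the normal is the gradient of the summed falloff, i.e. each blob’s radial direction weighted by the derivative of the quintic falloff:

```
float3 metaball_normal(float3 pos)
{
    float3 n = float3(0.0, 0.0001, 0.0);   // tiny bias so normalize() never sees a zero vector
    for (int i = 0; i < NUM_BLOBS; i++)
    {
        float x  = saturate(length(blobs[i].xyz - pos) / blobs[i].w);   // normalized distance
        float dp = 30.0*x*x*x*x - 60.0*x*x*x + 30.0*x*x;                // derivative of the falloff
        n += normalize(pos - blobs[i].xyz) * dp / blobs[i].w;           // radial direction, weighted
    }
    return normalize(n);
}
```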

My best guess at the moment is that each blob’s spherical surface normal can be calculated with normalize(pos - blobs[i].xyz), and that the polynomial is mostly used for the falloff weighting. I’ll follow up on this as soon as I have a better understanding after more research.

2. Implementing in Unity3D

2–1. Implementing in Unity3D — Compute blobs’ attributes

Based on the example, in order to ray-march the metaballs in the fragment shader, we need the values of each blob’s attributes, such as position and radius, before the fragment shader starts rendering.
So I decided to use a compute shader to calculate them, write them into textures, and then pass those textures to the fragment shader.

a. Render Texture setup
I use pairs of render textures to write/read the blobs’ attributes. For simplicity, the dimensions of the textures decide the number of blobs (texture.width * texture.height). The following is my render texture setup.

Note that:

  • Why pairs?
    - I designed them like a force-based particle system, so all attributes are updated based on their previous values.
  • 32-bit float ARGB texture format
    - I need 8 channels (2 x ARGB) in order to write each blob’s position (float3), lifespan (float), velocity (float3), and scale (float) in higher precision.
  • Point filter
    - No filter interpolation, so I read back exactly the values I wrote.
  • EnableRandomWrite
    - Allows the compute shader to write into arbitrary locations of the textures (see the Unity3D compute shader documentation).

b. Compute Shader setup
The compute shader has 2 kernels: one for setup and the other for update. As mentioned above, the compute shader calculates the blobs’ attributes and writes them into the render textures.

The basic rules for the blobs’ attributes are the following:

  • These functions are used for re-initialization when a blob dies. Note that the seed (_pos) is updated after reset_pos_and_life(_pos) is called.
  • After the setup, the blobs’ attributes are updated in the update kernel (see the sketch after this list). The rules are basically:
    . Fetch the previous attribute values from the previous textures mentioned above in the render texture setup.
    . Velocity and life ease out (fast in, slow out) every frame by being scaled down by a constant factor.
    . Scale is updated with a jello-ish bouncing motion, and once the life drops below 1 it starts shrinking until the blob disappears.
    . Position is updated by adding the velocity to it after the velocity has been updated.
    . Blobs stay inside a cube of a certain size in order to emphasize the metaball effect by keeping them touching each other.
    . When blobs die, they are revived by calling the reset function mentioned above.
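A minimal sketch of the update kernel under my assumptions; the kernel, texture, and constant names (as well as the exact damping and bouncing values) are illustrative, not the project’s actual ones:

```
#pragma kernel Update

Texture2D<float4>   _PrevPosLifeTex;    // xyz = position, w = life (previous frame)
Texture2D<float4>   _PrevVelScaleTex;   // xyz = velocity, w = scale (previous frame)
RWTexture2D<float4> _PosLifeTex;        // written this frame
RWTexture2D<float4> _VelScaleTex;
float _Time;

[numthreads(8, 8, 1)]
void Update(uint3 id : SV_DispatchThreadID)
{
    float4 pos_life  = _PrevPosLifeTex[id.xy];    // fetch previous values
    float4 vel_scale = _PrevVelScaleTex[id.xy];

    vel_scale.xyz *= 0.97;                             // velocity eases out every frame
    pos_life.w    *= 0.99;                             // life eases out every frame
    pos_life.xyz  += vel_scale.xyz;                    // position integrates the velocity
    pos_life.xyz   = clamp(pos_life.xyz, -1.0, 1.0);   // one simple way to keep blobs inside a cube
    vel_scale.w    = (0.5 + 0.2 * sin(10.0 * _Time)) * saturate(pos_life.w); // jello-ish bounce, shrinking once life < 1

    // dead blobs would be re-initialized here (reset_pos_and_life and the seed update from the post)

    _PosLifeTex[id.xy]  = pos_life;
    _VelScaleTex[id.xy] = vel_scale;
}
```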

c. Debug Blobs’ attributes with instance mesh
So now I have the values of the blobs’ attributes. In order to check that they work as intended before I start ray marching with them, I render them with instanced meshes.

okay

I use DrawMeshInstancedIndirect to render the instanced meshes with a custom shader and the render textures I just updated. Basically, I followed Unity’s documentation to set up the .cs script and the shader, but the documentation uses a ComputeBuffer instead of the RenderTexture I’m using, so looking up the values with the instance ID is slightly different. See the comparison below:
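A hedged sketch of that difference on the shader side; _PosLifeTex and _NumPerRow are illustrative names, and instanceID would come from SV_InstanceID in the instancing shader:

```
uint              _NumPerRow;    // texture width, set from the .cs script
Texture2D<float4> _PosLifeTex;   // blob attributes written by the compute shader

// With a ComputeBuffer (as in Unity's DrawMeshInstancedIndirect example) the lookup
// would simply be positionBuffer[instanceID]; with a RenderTexture the flat instance
// ID has to be converted into 2D texel coordinates first.
float4 fetch_blob(uint instanceID)
{
    uint2 texel = uint2(instanceID % _NumPerRow, instanceID / _NumPerRow);
    return _PosLifeTex[texel];
}
```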

2–2. Implementing in Unity3D — Render metaball in fragment shader

a. Setup .cs script for ray marching

Unity3D provides predefined functions that let you hook into specific stages of the rendering pipeline. Since ray marching is fragment-shader based, I chose to implement it in OnRenderImage, which is the last stage of screen rendering. The structure of the function is the following:

The function gives you a render texture (_src): a frame buffer that contains the rendered image of the scene as a texture. When you run Graphics.Blit, this is passed to the material as the built-in uniform sampler2D _MainTex; the ray marching shader attached to the material then processes it and outputs the result into the render texture the function gives you (_dst), which is the final frame buffer rendered on the screen.

*Note that this .cs script needs to be attached to the camera you render with.

b. Setup shader for ray marching
Moving on to the ray marching. I use a custom Image Effect Shader, which gives you a template so you can go straight to writing the fragment shader code for the Blit.

Basically, I just translated the code from Shadertoy (GLSL) to Unity3D (HLSL), so the basic structure is the same, just in slightly different syntax. So I won’t explain the entire code, only a couple of things that are implemented differently.

b-1. Setup rays
Remember how complex it was to set up the rays in the example? Fortunately, it’s a lot simpler in Unity3D because the engine provides tons of useful predefined functions and variables. The following is how I set up the rays:

It’s slightly different from how we did it in the example. There, we manually defined the camera’s behavior, such as the field of view and the transform matrix. In Unity3D, we can use the properties of the camera object provided by the engine, so we just need to undo the matrix transformations in the opposite order.

As you can see, I use some predefined variables provided by the engine: UNITY_NEAR_CLIP_VALUE, unity_CameraInvProjection, and _WorldSpaceCameraPos. You can find more information in the official documentation for built-in shader variables.

instance mesh cubes and ray marching spheres

To check whether I set up the rays correctly, I render the ray-marched spheres on top of the instanced cubes. We can see that they are aligned.

Since the ray marching happens after the mesh rendering, it overwrites the pixels of the frame buffer, as you see above. However, it’s possible to composite ray-marched objects with real meshes by using the depth buffer. I haven’t tried it myself in this project, but you can see how it’s done in the blog post Raymarching Distance Fields: Concepts and Implementation in Unity by Adrian Biagioli.

b-2. Ray marching metaball

The distance function is the same as in the example; the only difference is that I fetch the blobs’ attributes from the render textures:
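A hedged sketch of that fetch inside the blob loop, using the same illustrative texture names as above:

```
// i = blob index inside the distance function's loop
uint2  texel       = uint2(i % _NumPerRow, i / _NumPerRow);
float3 blob_pos    = _PosLifeTex[texel].xyz;    // position written by the compute shader
float  blob_radius = _VelScaleTex[texel].w;     // scale stored in the second texture
float  db          = length(blob_pos - pos);    // then used exactly like blobs[i] in the original
```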

b-3. Shading and Lighting

I borrow a lighting equation from another of Inigo’s Shadertoy sketches, Angels. The blobs’ color and illumination are defined by their direction and speed. I decided to cover shading and lighting in ray marching in a separate post, because this one has already gotten too long. In case you are curious, the project source is attached to this post.


Resources

Project Source

Github

References

Ray marching metaball and surface shading
- Raymarching metaball — Inigo Quilez’s raymarching metaball
- Raymarching surface rendering — Inigo Quilez’s Angels

Ray marching
- Raymarching Distance Field by Inigo Quilez
- nvscene 2008 — Rendering Worlds With Two Triangles by Inigo Quilez
- Ray Marching Distance Field by Simen Haugo
- Distance Transform — Wikipedia
- Volumetric Rendering: Raymarching by Alan Zucconi
- Raymarching Distance Fields: Concepts and Implementation in Unity by Adrian Biagioli
- Ray Marching and Signed Distance Functions by Jamie Wong

Metaball
- Metaballs (also known as: Blobs) by Ryan Geiss
- Metaball — Wikipedia

Compute Shader
- Unity Compute Shader — Part 1 by Claire Blackshaw
- Unity Compute Shader — Part 2 by Claire Blackshaw

Matrix Transformation / Camera Frustum / Ray Casting
- Ray-Tracing: Generating Camera Rays by Scratchapixel 2.0
- Computing the Pixel Coordinates of a 3D Point by Scratchapixel 2.0
- The Perspective and Orthographic Projection Matrix by Scratchapixel 2.0
- Viewing Frustum — Wikipedia
- OpenGL Transformation by Song Ho Ahn
- OpenGL Projection Matrix by Song Ho Ahn
- Homogeneous Coordinates by Song Ho Ahn
- OpenGL Lookat to Axis by Song Ho Ahn

Development Environment

Windows 10 / GeForce GTX 970
Unity3D 2017.4.0f1

Written by Sehyun Av Kim
computer graphics artist / creative dev - avseoul.net
