WebGL — 2.0 Ray Tracing

Sushindhran Harikrishnan
Neosavvy Labs
Apr 6, 2017
Sphere — Ray Tracing

We’re going to build on the simple vertex and fragment shaders from the last WebGL post and do something more exciting this time with ray tracing. What is ray tracing? In simple terms, a ray tracing program reproduces the path of light rays in the reverse direction from the camera back to the origin of the rays. This article gives a good overview of how ray tracing works.

So let’s draw a sphere on our screen today using ray tracing. We will define three structures — one each for the sphere, the ray and the light source.

struct Sphere {
    vec3 center;
    float radius;
    vec3 color;
};

struct Ray {
    vec3 origin;
    vec3 direction;
};

struct Light {
    vec3 position;
    float ambience;
    vec3 specular;
    vec3 diffuse;
};

Sphere spheres[1];
Ray rays[1];
Light light[1];

1. Initialize Scene

Now let’s initialize our scene by defining our sphere, the ray and the light source. We’re creating a red sphere with a radius of 0.7, with its center at the origin of the x, y and z axes.

spheres[0].center = vec3(0.0, 0.0, 0.0);
spheres[0].radius = 0.7;
spheres[0].color = vec3(1.0, 0.0, 0.0);

Let’s define our light ray with its point of origin at (0, 0, 2), where 2 will be our focal length. The direction of the ray is the normalized vector (unit vector) (x - 0.5, 0.5 - y, -focalLength), which maps x and y onto a screen that runs from -0.5 to 0.5 in each direction. Note that the z value is -focalLength for the direction of the ray that is being traced.

rays[0].origin = vec3(0.0, 0.0, focalLength);
rays[0].direction = normalize(vec3(x-0.5, 0.5-y, -focalLength));
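These snippets assume that x and y are the fragment’s normalized screen coordinates, running from 0 to 1 across the canvas. If your shader does not already provide them, one way to derive them is from gl_FragCoord, assuming a uResolution uniform that holds the canvas size in pixels (that name is my own, not necessarily what the sample code uses):

uniform vec2 uResolution; // canvas width and height in pixels (assumed uniform)

// Inside main(): normalized pixel coordinates in the 0.0 to 1.0 range.
float x = gl_FragCoord.x / uResolution.x;
float y = gl_FragCoord.y / uResolution.y;

Depending on which corner you treat as the top of the screen you may need to flip y; the 0.5 - y term in the ray direction already mirrors the vertical axis relative to x - 0.5.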

For our light source, we are only going to define two properties — position and ambience. We are getting the x and y values for the light position from the cursor. I will explain how we’re setting that up with our shaders in a bit.

light[0].position = vec3(uCursor.x, -uCursor.y, 0.9);
light[0].ambience = 0.01;

2. Intersection points for the Ray and the Sphere

The formula for the intersection points of the sphere and the ray is derived by substituting the x, y and z components of the ray into the equation of the sphere. For more details, look at this article here. I will explain the derivation in detail in a separate post, but essentially we have to solve a quadratic equation with the coefficients

A = rayDirection • rayDirection

B = 2 * rayDirection • (rayOrigin - sphereCenter)

C = (rayOrigin - sphereCenter) • (rayOrigin - sphereCenter) - radius²

where • represents the dot product, and solving the quadratic equation for t gives:

t = (-B ± √(B² - 4AC)) / (2A)
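For a quick sketch of where these coefficients come from: a point along the ray is rayOrigin + t * rayDirection, and a point P lies on the sphere when (P - sphereCenter) • (P - sphereCenter) = radius². Substituting the ray for P and grouping by powers of t gives

(rayDirection • rayDirection) t² + (2 * rayDirection • (rayOrigin - sphereCenter)) t + ((rayOrigin - sphereCenter) • (rayOrigin - sphereCenter) - radius²) = 0

which is exactly A t² + B t + C = 0 with the A, B and C above.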

A will be 1.0 since we normalized the ray direction, making it a unit vector. So we arrive at:

t = (-B ± √(B² - 4C)) / 2

Here is how that looks in code.

vec3 sphereCenter = sphere.center;
vec3 colorOfSphere = sphere.color;
float radius = sphere.radius;
vec3 cameraSource = ray.origin;
vec3 cameraDirection = ray.direction;
vec3 lightSource = light.position;
float ambience = light.ambience;
vec3 color = vec3(0.0, 0.0, 0.0);

vec3 distanceFromCenter = (cameraSource - sphereCenter);
float B = 2.0 * dot(cameraDirection, distanceFromCenter);
float C = dot(distanceFromCenter, distanceFromCenter) - pow(radius, 2.0);
float delta = pow(B, 2.0) - 4.0 * C;

The visible intersection point is the nearer of the two, i.e. the smaller of the two roots of the quadratic equation. So we have two scenarios: real and unequal roots, where the value of t will be the smaller of the two roots, and real and equal roots, where the ray just grazes the sphere. (If delta is negative there are no real roots, which means the ray misses the sphere and the pixel keeps the background color we initialized.)

float t = 0.0;
if (delta > 0.0) {
    float sqRoot = sqrt(delta);
    float t1 = (-B + sqRoot) / 2.0;
    float t2 = (-B - sqRoot) / 2.0;
    t = min(t1, t2);
}
if (delta == 0.0) {
    t = -B / 2.0;
}

3. Lambert Shading

We are going to treat our sphere as a Lambertian surface, which means that all incoming light is scattered/diffused equally in all directions. The Lambertian reflectance at any surface point on the sphere is determined by the dot product of the surface normal at that point and the direction of the light at that point.

The surface point is where the ray hits the sphere: we start at the camera source/ray origin and move a distance t along the ray direction. So the surface point can be calculated from the ray equation as follows.

vec3 surfacePoint = cameraSource + (t * cameraDirection);

The surface normal at a point on the sphere is just the unit vector from the center of the sphere to that point.

vec3 surfaceNormal = normalize(surfacePoint - sphereCenter);

So now we have everything we need to calculate the Lambertian reflectance, using the unit vector from the surface point towards the light source as the light direction.

vec3 lightDirection = normalize(lightSource - surfacePoint);
float lambert = max(0.0, dot(surfaceNormal, lightDirection));

There usually is some ambient light in a scene as well. We can account for it like this, where ambience is usually a very low value.

(ambience + ((1.0 - ambience) * lambert));

We can now multiply this value with the color of our sphere to get our final color to set as the value for gl_FragColor.

color = colorOfSphere * (ambience + ((1.0 - ambience) * lambert));
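Putting the pieces together, here is a minimal sketch of the whole path as a single helper function. The function name traceSphere and the exact organization are mine for illustration; the shader in the repository may structure things differently.

// Sketch: intersect one ray with one sphere and return a Lambert-shaded color.
// Returns black (the background color) if the ray misses the sphere.
vec3 traceSphere(Ray ray, Sphere sphere, Light light) {
    vec3 color = vec3(0.0, 0.0, 0.0);
    vec3 distanceFromCenter = ray.origin - sphere.center;
    float B = 2.0 * dot(ray.direction, distanceFromCenter);
    float C = dot(distanceFromCenter, distanceFromCenter) - sphere.radius * sphere.radius;
    float delta = B * B - 4.0 * C; // A is 1.0 because ray.direction is normalized
    if (delta >= 0.0) {
        float t = min((-B + sqrt(delta)) / 2.0, (-B - sqrt(delta)) / 2.0); // nearer hit
        vec3 surfacePoint = ray.origin + (t * ray.direction);
        vec3 surfaceNormal = normalize(surfacePoint - sphere.center);
        vec3 lightDirection = normalize(light.position - surfacePoint);
        float lambert = max(0.0, dot(surfaceNormal, lightDirection));
        color = sphere.color * (light.ambience + (1.0 - light.ambience) * lambert);
    }
    return color;
}

In main() you would then build the ray and the scene as in step 1 and write something like gl_FragColor = vec4(traceSphere(rays[0], spheres[0], light[0]), 1.0);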

4. Hooking up mouse events to our shader

I tweaked the gl_utils.js file that we used in our first shader lesson to add mouse events. I added this line to the gl_init function, similar to how we define uTime.

gl.uCursor = gl.getUniformLocation(program, "uCursor");

We also need to add mouse event listeners to our canvas. I added the following to the start_gl function:

function setMouse(event, z) {
    var r = event.target.getBoundingClientRect();
    gl.cursor.x = (event.clientX - r.left  ) / (r.right - r.left  ) * 2 - 1;
    gl.cursor.y = (event.clientY - r.bottom) / (r.top   - r.bottom) * 2 - 1;
    if (z !== undefined)
        gl.cursor.z = z;
}
canvas.onmousedown = function(event) { setMouse(event, 1); }; // On mouse down, set z to 1.
canvas.onmousemove = function(event) { setMouse(event);     };
canvas.onmouseup   = function(event) { setMouse(event, 0); }; // On mouse up, set z to 0.
gl.cursor = new Vector3();

Lastly, I added this line to the gl_update function to pass the updated cursor values to the shaders.

gl.uniform3f(gl.uCursor, gl.cursor.x, gl.cursor.y, gl.cursor.z); // Set cursor uniform variable.
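On the shader side, the fragment shader needs a matching uniform declaration for these values (alongside uTime, if you are using it), along the lines of:

uniform vec3 uCursor; // x, y come from the mouse position; z is 1.0 while the button is down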

Now we should be all set to fire up our ray tracing program to render a sphere. Click here to see a running version of this. You can check out and download the code from our github repository here.

We will do more ray tracing next time with different shapes, and I’ll introduce the Phong algorithm along with reflections and collisions between these moving shapes.
