Side Of Ray Tracing

Bob Duffy
Published in SideOfCyber · 8 min read · Jan 16, 2020

Ray tracing is a term thrown about these days, especially for advanced computer graphics in video games and film. But what is ray tracing? In this post, I attempt to provide clarity. First, let's talk turkeys.

Have you ever traced your hand in preschool? In the United States it's a common activity around our Thanksgiving holiday: kids trace a hand on a piece of paper to create the outline of a turkey, color it in, and parents proudly post the results on kitchen refrigerators. After creating many hand-traced turkeys, I wondered how one draws the shape of a hand that actually looks like a hand. I saw amazing hands drawn by artists like M. C. Escher and wondered, how did he do that? One day I had an idea: what if I didn't trace my hand where it touched the paper? What if I closed one eye and traced where the edges of my hand appeared to meet the paper below? It was amazing. Suddenly I had drawn the shape of a hand that looked realistic, by tracing the path of light from my eye to the paper rather than tracing the hand itself.

I didn’t know it at the time, but I had stumbled on a perspective tracing technique used by artists across the centuries, one that sits at the heart of what we call ray tracing. In the 16th century one innovative artist, Albrecht Dürer, created a “Perspective Machine” to help artists draw perspective accurately. It consisted of a screened 2D frame placed between the artist and the drawing subject. The artist would establish a line of sight from his eye through the 2D screen to any part of the subject. In front of him sat his drawing paper, marked with a matching grid.

“Perspective Machine” illustration published in Albrecht Dürer’s The Painter’s Manual, 1525.

When using the machine, the artist positions his eye in a fixed location and looks through the screened grid, focusing on just what he sees in one cell. He then draws what he sees in that cell onto the corresponding cell of the grid on his paper. This technique of rendering an image by tracing the path of light through cells of a 2D image plane is called ray casting or ray tracing, and it’s how today’s advanced computer graphics got its start.

This setup may look strange, but take a closer look at the artist gazing through the gridded viewport. For fun, let’s call each cell of that grid a “pixel”. Now imagine watching an animated 3D movie on a computer screen, where each pixel is a direct line of sight to a dot of color, light, or shadow in the 3D world beyond the screen. Would it not look an awful lot like this centuries-old mechanism?

Imagine this mechanism as a computer. What if we continued the line of sight from the artist’s eye, like a ray, through to the subject? Now imagine the ray doesn't stop when it hits a surface. What if it bounced off the subject in front of the artist to other parts of the scene? And what if we did this for every position, or pixel, on the 2D grid? What we'd have are imaginary rays from the artist’s eye, bouncing to everything the artist can see in the scene, including things that are reflected or refracted.

This mimics the physics of optics, but in reverse. In the real world, light rays start from the environment, bounce off objects, then land in our eyes. For rendering, we project the rays in reverse, because we only need the rays that actually reach the camera. And because ray tracing mimics the physics of light and optics, ray-traced graphics can look highly realistic.

In the world of computer graphics, Dürer’s mechanism is recreated in a digital 3D space, with a virtual camera standing in for the artist’s fixed eye position. The virtual camera establishes a 2D viewing plane in front of it, similar to Dürer’s screened frame. The plane’s resolution is set to match the desired render in pixels: a 1920x1080 render has a virtual Dürer grid of 1920x1080 cells.

Virtual camera with grid viewport pointing at virtual 3D objects. Source: SideOfCyber blog by Bob Duffy
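In code, the virtual Dürer grid amounts to computing one ray direction per pixel. Here is a minimal sketch in Python; the function and parameter names are my own illustration, not any particular engine's API:

```python
import math

def generate_camera_rays(width, height, fov_degrees=60.0):
    """Generate one viewing ray per pixel of a virtual Duerer grid.

    The camera sits at the origin looking down -Z; each ray passes
    through the center of one cell of the 2D viewing plane.
    """
    aspect = width / height
    scale = math.tan(math.radians(fov_degrees) / 2)
    rays = []
    for y in range(height):
        for x in range(width):
            # Map pixel centers onto the viewing plane ([-1, 1] range)
            px = (2 * (x + 0.5) / width - 1) * aspect * scale
            py = (1 - 2 * (y + 0.5) / height) * scale
            # Normalize to a unit-length direction vector
            length = math.sqrt(px * px + py * py + 1)
            rays.append((px / length, py / length, -1 / length))
    return rays

rays = generate_camera_rays(4, 3)  # a tiny 4x3 "Duerer grid"
```

A 1920x1080 render would call this with those dimensions and shoot one (or, with anti-aliasing, several) of these rays per pixel.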

For each pixel in the virtual viewing grid, a simulated light ray is shot out into the virtual scene. Rays bounce off virtual objects, hitting other objects and sometimes reaching sources of light. Rays that never reach a light source or lightmap leave their pixels in shadow. If an object is translucent, the ray passes through it, bending toward whatever lies behind. The lights, shadows, colors, and surface properties of every virtual object encountered along the path together define the color and luminance of the pixel.

Illustration of the ray-tracing algorithm for one pixel (up to the first bounce) Source: https://en.wikipedia.org/wiki/Ray_tracing_(graphics)
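To make the per-pixel algorithm concrete, here is a toy Python tracer for a single sphere with one point light: diffuse shading plus a hard shadow ray, with no reflections or refractions to keep it short. The scene, names, and material values are all illustrative:

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def norm(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

SPHERE = ((0.0, 0.0, -3.0), 1.0)   # center, radius
LIGHT = (5.0, 5.0, 0.0)            # point light position
BACKGROUND = (0.2, 0.2, 0.2)       # color for rays that miss everything

def hit_sphere(origin, direction):
    """Distance along the ray to the sphere's near surface, or None on a miss."""
    center, radius = SPHERE
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def trace(origin, direction):
    """Color one ray: diffuse shading plus a hard shadow test."""
    t = hit_sphere(origin, direction)
    if t is None:
        return BACKGROUND
    point = tuple(o + t * d for o, d in zip(origin, direction))
    normal = norm(sub(point, SPHERE[0]))
    to_light = norm(sub(LIGHT, point))
    # Shadow ray: offset slightly along the normal to avoid re-hitting the surface
    shadow_origin = tuple(p + 1e-4 * n for p, n in zip(point, normal))
    in_shadow = hit_sphere(shadow_origin, to_light) is not None
    brightness = 0.0 if in_shadow else max(0.0, dot(normal, to_light))
    return (brightness, brightness * 0.5, 0.2)  # toy orange-ish material

color = trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0))  # a ray aimed at the sphere
```

A full renderer would run `trace` once per camera ray and recurse at each hit for reflection and refraction, which is exactly where the render time goes.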

Beautifully realistic images can be rendered with this method. If you’ve ever watched Pixar’s Cars, or most animated 3D movies since 2006, you’ve seen ray tracing: the entire film was created using ray-tracing software. However, each frame of the film took 15 hours to render.

Source: “Ray Tracing for the Movie ‘Cars’” by Pixar https://graphics.pixar.com/library/RayTracingCars/paper.pdf

To render images like this, virtual objects need to be crafted, from their geometric shape to the physical material properties, called “shaders”, that mimic optical properties such as color, roughness, metalness, specularity, emission, translucency, and refraction. Digital artists spend countless hours giving virtual objects these surface features. Additionally, lights or environmental lightmaps are created to mimic sources of light. When done well, it is possible to create renders indistinguishable from real-world photography. But even today, it takes time to render each frame.

Material shaders rendered in Blender using the Cycles ray tracing engine. Source: “SideOfCyber” Blog by Bob Duffy
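A material shader essentially bundles those optical properties into a set of per-surface parameters. Here is a minimal sketch; the parameter names are loosely modeled on Blender's Principled BSDF inputs, but the exact fields and values are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Material:
    """Toy stand-in for a shader's surface properties."""
    base_color: tuple = (0.8, 0.8, 0.8)
    roughness: float = 0.5     # 0 = mirror-smooth, 1 = fully diffuse
    metallic: float = 0.0      # 0 = dielectric, 1 = metal
    specular: float = 0.5      # strength of specular highlights
    emission: float = 0.0      # light emitted by the surface itself
    transmission: float = 0.0  # 0 = opaque, 1 = fully transmissive
    ior: float = 1.45          # index of refraction for transmitted rays

# Two example surfaces a ray might bounce off of:
gold = Material(base_color=(1.0, 0.77, 0.34), roughness=0.2, metallic=1.0)
glass = Material(transmission=1.0, roughness=0.0, ior=1.5)
```

When a ray hits a surface, the renderer consults these parameters to decide how much light reflects, scatters, or passes through, which is why artist-authored materials matter so much to the final image.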

Real-time ray tracing is the holy grail of computer graphics, and we are getting closer. In fact, in 1986, only seven years after the first paper on ray-traced computer graphics was presented at SIGGRAPH, the Commodore Amiga personal computer showed off ray tracing with a famous demo called “The Juggler” (its frames were ray-traced in advance and played back in real time). I remember being on University Avenue in Palo Alto in 1986 and seeing this demo in a store window. I had to have that computer. I saved money and started my journey down the road of 3D graphics.

1986 “The Juggler” ray-tracing demo for the Commodore Amiga, by Eric Graham, written in SSG, a precursor to Sculpt 3D

Thirty-four years later, you’d think we would have high-quality real-time ray tracing, but unfortunately, we do not. We still have far to go. But with advanced hardware, we are seeing a hybrid version of ray tracing in video games, one that uses a combination of rasterization and ray tracing. Rasterization doesn’t bother calculating the many potential rays of light needed to create a 3D image; it creates an image from the geometry and light sources alone. It does a fantastic job. However, for reflective surfaces, translucent surfaces, or accurate shadows, rasterization doesn’t come close to what ray tracing can do.

Left: Ray tracing render using Blender’s Cycles renderer. Right: Rasterization using Eevee renderer. Source: “SideOfCyber” Blog by Bob Duffy

Thus, in hybrid real-time ray tracing, most pixels are rendered via rasterization, and where accurate shadows, reflections, and refractions are needed, ray tracing fills in the gaps. This dramatically reduces the number of rays needed per frame. The result is real-time 3D scenes close to the quality of fully ray-traced ones, and a fantastic boost to realism in video games. You’ll need advanced hardware for real-time ray tracing, but for gamers wanting that extra dose of reality, it may be a worthy investment. More on that later.
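The hybrid idea can be sketched as a two-pass loop: rasterize every pixel cheaply, then shoot rays only for pixels flagged as needing them. Everything below, including the `Fragment` type and the stub `rasterize` and `trace_ray` functions, is a hypothetical stand-in rather than a real engine API:

```python
from dataclasses import dataclass

@dataclass
class Fragment:
    """What the rasterizer knows about the surface covering one pixel."""
    color: tuple
    reflective: bool = False
    translucent: bool = False

def rasterize(scene, x, y):
    """Stand-in rasterizer: look up what covers this pixel (fast pass)."""
    return scene.get((x, y), Fragment(color=(0.1, 0.1, 0.1)))

def trace_ray(scene, x, y):
    """Stand-in ray tracer: pretend we computed an accurate reflection."""
    return (0.9, 0.9, 1.0)

def render_hybrid(scene, width, height):
    """Rasterize every pixel, then re-shade only the ones that need rays."""
    image = {}
    for y in range(height):
        for x in range(width):
            frag = rasterize(scene, x, y)               # cheap for all pixels
            if frag.reflective or frag.translucent:
                image[(x, y)] = trace_ray(scene, x, y)  # rays fill the gaps
            else:
                image[(x, y)] = frag.color
    return image

scene = {(0, 0): Fragment(color=(1, 0, 0)),
         (1, 0): Fragment(color=(0, 0, 1), reflective=True)}
image = render_hybrid(scene, 2, 1)
```

Because only the flagged pixels pay the cost of tracing, the expensive work is confined to the reflections, refractions, and shadows that rasterization handles poorly.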

For artists and designers, the good news is there’s a lot of excellent ray-tracing software that runs on nearly any modern PC, using either your CPU or GPU. Depending on your system’s performance it may take a while to render, but you can do it.

Ray tracing opens artists, designers, and developers to a world driven by imagination, where one can create and render anything to look as real as a photograph. Blender, an open-source package, gives students, hobbyists, and design studios access to professional-grade modeling and animation tools, and it has even been used in feature films. Maxon’s Cinema 4D is an industry standard for agencies and VFX artists around the globe. V-Ray is a commercial-grade renderer available as a plug-in for most popular 3D tools.

New worlds of light and shadow are opened up by this technology. Below is a sample video called “Apollinarisstr”, created by VFX artist and friend of mine Marc Potocnik of Renderbaron, named after a street in his hometown. Marc uses Maxon’s Cinema 4D to create videos like this.

“Apollinarisstr” by Marc Potocnik of Renderbaron

Marc is an Intel Software Innovator, a Maxon partner, and a true digital artisan, often using complex procedural techniques to create highly organic-looking textures. To learn more about the artistry and creative process behind work like this, watch his FMX 2019 session, “Apollinarisstr. — Ingredients to Realism”.

But if you are interested in real-time ray tracing, the news gets better every year. Microsoft’s DirectX 12 now supports real-time ray tracing on a variety of GPUs, including some that were not originally designed for it. Game engines such as Unreal Engine and Unity have rendering pipelines for DirectX-supported ray tracing. This opens up higher-quality, more realistic environments, with indirect light, shadows, and reflections, to game devs, architects, and filmmakers.

And chances are, if you have a modern desktop GPU, you can play games with real-time ray tracing. More and more game titles support it, but not all games do so equally, so you’ll need to do your homework to understand what you are really getting. Some games may only support simple reflections while rasterizing most everything else, in order to maximize performance across a variety of GPU hardware. Thus, for now, not all “ray tracing” in games means the same thing.

And recall the topic of material shaders: a game or scene is only as good as the artists' work to make its materials realistic. Turning on ray tracing does not create realism by itself; it renders what the game makers and artists crafted into the scene. Ray tracing is only as good as the work developers and artists put into making a scene render well for it.

Eventually, the hardware, software, and pipelines will catch up, and the day will come when fully ray-traced environments are rendered in real time across common platforms.

Meanwhile, give the old hand-tracing trick a try, and think back to Albrecht Dürer’s perspective machine and how tracing the path of light allows artists, and computers, to render with realism.


Bob Duffy

Techno-nerd generalist: ’80s–’90s coder and artist, dot-com-era eCommerce dev, now running Intel’s Software Innovator Program and spending free time in Blender 3D