Ray tracing is a lighting technique that brings an extra level of realism to games. It emulates the way light reflects and refracts in the real world, providing a more believable environment than what’s typically seen using the static lighting in more traditional games.
A good graphics card can use ray tracing to enhance immersion, but not every GPU can handle this technique. Nvidia and AMD have each introduced graphics cards that can render ray-traced effects in real time, which is a humongous task: without dedicated hardware, even a scene lasting a couple of seconds can take days to render with ray tracing.
At first it might be hard to distinguish from a normal, non-ray-traced real-time render, but the technique shows its true power with varying lighting conditions, large numbers of reflective surfaces, and certain materials. Rendering ray-traced video at 60 frames per second in real time takes gigantic computational power, as the renderer has to take every ray that reaches the viewer, trace it back to its actual light source, and then reproduce its contribution.
Ray tracing is not a new technique; it has been around for a long time and has been used extensively in countless animated movies, though never in real time. Some of these renders have taken the better part of a year, even on the supercomputer-class render farms used for some Pixar films. Compared to that, companies like Nvidia and AMD have come a long way in terms of rendering speed. It's an easy concept to understand but a very difficult method to implement.
How does ray tracing work?
In real life, light comes to you. Waves made up of countless little photons shoot out of a light source, bounce across and through a variety of surfaces, and reach your retina. Your brain then interprets all these different rays of light as one complete picture.
Ray tracing functions in nearly the same way, except that everything generally moves in the opposite direction. Inside the software, ray-traced light begins at the viewer (essentially, at the camera lens) and moves outward, plotting a path that bounces off multiple objects, sometimes picking up their colour and reflective properties, until the software determines the light source(s) that would affect that particular ray. Simulating vision backwards like this is far more efficient for a computer than tracing rays from the light source, because the only light paths that need to be rendered are the ones that fall within the user's field of view. It takes far less computing power to display what's in front of you than it would to render the rays emitted from every light source in a scene.

Digging deeper…
To understand just how ray tracing’s revolutionary lighting system works, we need to step back and understand how games previously rendered light and what needs to be emulated for a photo-realistic experience.
Games without ray tracing rely on static, "baked-in" lighting. Developers place light sources within an environment, and those sources emit light evenly across any given view. Virtual models such as NPCs and objects carry no information about any other model, so the GPU must calculate light behaviour during the rendering process. Surface textures can reflect light to mimic shininess, but only light emitted from a static source.
Overall, the GPU’s evolution has helped this process become more realistic in appearance over the years, but games still aren’t photo-realistic in terms of real-world reflection, refraction, and general illumination. To accomplish this, the GPU needs the ability to trace virtual rays of light.
In the real world, visible light is the small slice of the electromagnetic spectrum that the human eye can perceive. It consists of photons, which behave both as particles and as waves. Photons have no real size or shape; they can only be created or destroyed.
Light, then, can be treated as a stream of photons: the more photons you have, the brighter the perceived light. Reflection occurs when photons bounce off a surface. Refraction occurs when photons, which travel in straight lines, pass through a transparent substance and their path is redirected, or "bent." Destroyed photons can be thought of as "absorbed."
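The reflection and refraction described here reduce to two short vector formulas: the mirror equation and Snell's law. Here is a minimal sketch of both; the function names and the refractive index of 1.5 (roughly glass) are illustrative assumptions, not taken from any particular engine.

```python
import math

def reflect(d, n):
    """Mirror an incoming direction d about a unit surface normal n:
    r = d - 2(d . n)n."""
    k = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * k * ni for di, ni in zip(d, n))

def refract(d, n, eta):
    """Bend a ray entering a new medium, per Snell's law.
    eta is the ratio of refractive indices (n1 / n2); returns None
    when the ray undergoes total internal reflection instead."""
    cos_i = -sum(di * ni for di, ni in zip(d, n))
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection: no refracted ray exists
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(eta * di + (eta * cos_i - cos_t) * ni
                 for di, ni in zip(d, n))

# A ray hitting a surface straight on reflects straight back:
print(reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0)))   # (0.0, 1.0, 0.0)
# A grazing ray trying to leave glass (eta = 1.5) is trapped:
print(refract((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), 1.5))  # None
```

A real renderer would call these at every bounce, splitting energy between the reflected and refracted rays according to the surface material.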
Ray tracing in games attempts to emulate the way light works in the real world. It traces the path of simulated light by tracking millions of virtual photons. The brighter the light, the more virtual photons the GPU must calculate, and the more surfaces those photons will reflect off, refract through, and scatter from.
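The backward-tracing loop described above can be sketched in a few dozen lines. This is a toy illustration, not how any production engine works: the scene (one sphere, one point light), the camera setup, and all names are assumptions made for the sketch. Note the key point from earlier: the ray starts at the camera, and the light is only consulted once a surface has been hit.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def norm(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def hit_sphere(origin, direction, center, radius):
    """Distance along the (unit) ray to the sphere, or None on a miss."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # a == 1 because direction is unit length
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def trace(x, y, width, height):
    """Backward tracing: the ray begins at the camera, not at the light."""
    camera = (0.0, 0.0, 0.0)
    light = (5.0, 5.0, 0.0)             # consulted only after a hit
    center, radius = (0.0, 0.0, -3.0), 1.0
    # Map the pixel to a point on an image plane at z = -1.
    u = (x + 0.5) / width * 2 - 1
    v = 1 - (y + 0.5) / height * 2
    ray = norm((u, v, -1.0))
    t = hit_sphere(camera, ray, center, radius)
    if t is None:
        return 0.0                      # ray escaped: background
    point = tuple(o + t * d for o, d in zip(camera, ray))
    normal = norm(sub(point, center))
    to_light = norm(sub(light, point))
    return max(0.0, dot(normal, to_light))  # Lambertian shading

# Render a tiny ASCII image of the lit sphere.
for row in range(12):
    print("".join(" .:-=+*#%"[min(8, int(trace(col, row, 24, 12) * 9))]
                  for col in range(24)))
```

Even this toy version shows why the workload explodes: a real game adds many bounces per ray, many rays per pixel, and millions of pixels per frame, sixty times a second.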
Like any movie or TV show, scenes in CGI animation are typically “shot” using different angles. For each frame, you can move a camera to capture the action, zoom in, zoom out, or pan an entire area. And like animation, you must manipulate everything on a frame-by-frame basis to emulate movement. Piece all the footage together and you have a flowing story.
In games, you control a single camera that’s always in motion and always changing the viewpoint, especially in fast-paced games. In both CGI and ray-traced games, the GPU not only must calculate how light reflects and refracts in any given scene, but it also must calculate how it’s captured by the lens — your viewpoint. For games, that’s an enormous amount of computational work for a single PC or console.
Unfortunately, we still don’t have consumer-level PCs that can truly render ray-traced graphics at high frame rates. Instead, we now have hardware that can cheat effectively.
It is not humanly possible to render each of the billions of photons spread across a scene, so instead a simple yet clever method eases the processing. Rather than trying to map out every single ray of light, the solution is to trace only a select number of the most important rays, then use machine-learning algorithms to fill in the gaps and smooth everything out. It's a process called "denoising."
Rather than shooting hundreds or thousands of rays per pixel, the renderer shoots just a few, or perhaps a few dozen, then uses different classes of denoisers to assemble the final image.
Ray tracing’s fundamental similarity to real life makes it an extremely realistic 3D rendering technique, even making blocky games like Minecraft look near photo-realistic in the right conditions. There’s just one problem: It’s extremely hard to simulate. Recreating the way light works in the real world is complicated and resource-intensive, requiring masses of computing power.
That’s why existing ray-tracing options in games, like Nvidia’s RTX-driven ray tracing, aren’t true to life. They’re not true ray tracing, whereby every point of light is simulated. Instead, the GPU “cheats” by using several smart approximations to deliver something close to the same visual effect, but without being quite as taxing on the hardware. This will likely change in future GPU generations, but for now, this is a step in the right direction.
Most ray tracing games now use a combination of traditional lighting techniques, typically called rasterization, and ray tracing on specific surfaces such as reflective puddles and metalwork. Battlefield V is a great example of that. You see the reflection of troops in water, the reflection of terrain on aeroplanes, and the reflection of explosions across a car’s paint. It’s possible to show reflections in modern 3D engines, but not at the level of detail shown in games like Battlefield V where ray tracing is enabled.
Ray tracing can also be leveraged for shadows to make them more dynamic and realistic looking. You’ll see that used to great effect in Shadow of the Tomb Raider.
Thus, ray tracing is fast becoming a must-have in today's games, and with gaming's audience growing every year, it is a technology that many players are looking forward to. It has the potential to change the overall look and feel of gaming. Hopefully, in the near future, we will see ray tracing enabled in every game!
HAVE A GREAT DAY!