So in the real world, light obeys all kinds of laws of physics. Photons, which are somehow particles and waves at the same time, are emitted from a light source and travel in straight lines until they hit some matter, at which point they either bounce off or are absorbed and re-emitted. Our eyes fairly precisely detect the number and wavelength of the photons coming from the direction we’re looking, which lets us glean information about what objects are out in the world.
Simulating that with a computer takes a lot of math, because you would have to simulate the paths of a LOT of photons. For a very long time, computers, especially ones consumers could afford, just couldn’t do that, especially not in real time for video game graphics.
So through the ’90s and 2000s, video game developers came up with shortcuts for creating reasonable approximations of lighting effects. These shortcuts are a pain to figure out, but they look decent and run much faster than doing the actual lighting physics. By and by, graphics cards started shipping with circuitry specifically to pull off these shortcuts, and the small programs that run on graphics cards to apply these effects are called “shaders.” You may have heard that term if you’ve been around gaming for a while.
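To make “shortcut” concrete, here’s a minimal sketch (mine, not any particular engine’s) of the most classic one, Lambert’s diffuse shading: instead of simulating photons at all, you just dim a surface by the cosine of the angle between it and the light. All the names here are made up for illustration, and the vectors are assumed to be unit length.

```python
import math

def lambert_diffuse(surface_normal, light_dir, light_intensity):
    # Classic shader shortcut (Lambert's law): brightness comes from the
    # angle between the surface and the light, with no photons simulated.
    cos_angle = sum(n * l for n, l in zip(surface_normal, light_dir))
    # Surfaces facing away from the light get zero light, not negative light.
    return light_intensity * max(0.0, cos_angle)

# A surface facing straight up, lit from directly above: full brightness.
print(lambert_diffuse((0, 1, 0), (0, 1, 0), 1.0))  # 1.0
# The same surface lit from 60 degrees off vertical: half brightness.
tilted = (math.sin(math.radians(60)), math.cos(math.radians(60)), 0)
print(lambert_diffuse((0, 1, 0), tilted, 1.0))  # ~0.5
```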
Ray Tracing is the technique of solving the actual optical physics problem to render the graphics instead of using those shortcuts. Like I said earlier, there’s a lot more math involved, but since you’re simulating the laws of physics, you get much more realistic lighting effects this way.
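For a taste of what “the actual optics problem” involves, here’s a toy sketch of the one calculation every ray tracer does millions of times per frame: follow a single ray and work out whether it hits something. This is just illustrative Python, assuming a scene made of one sphere and a unit-length ray direction.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    # Does a ray starting at `origin` and travelling along unit vector
    # `direction` hit the sphere? Solve the quadratic
    # |origin + t*direction - center|^2 = radius^2 for the distance t.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4.0 * c  # a = 1 because direction is unit length
    if discriminant < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(discriminant)) / 2.0
    return t if t > 0 else None  # distance along the ray to the hit point

# A camera at the origin looking down -z at a sphere 5 units away:
# the ray strikes the near surface of the sphere 4 units out.
print(ray_hits_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```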
Things like Pixar movies or Final Fantasy: The Spirits Within used ray tracing techniques to render their animation with realistic lighting, but they took minutes or even hours to render a single frame. It’s also how the graphics in Myst and Riven were made: during production the developers ray traced the scenes, then stored the results as pictures that a home computer of the time could easily display.
More recently, starting with Nvidia’s RTX 2000 series graphics cards, publicly available hardware has become capable of doing all that math fast enough for games to draw very realistic lighting in real time. This promises two things:
1. Better or more realistic lighting effects than are possible with shaders: things like shadows falling on your character’s gun, or everything in the environment that glows casting pools of light and shadow. This has been realized to a point, though there’s still a lot more computation to do, so it runs slower; turning ray tracing on usually comes with a drop in frame rate. (There’s a toy sketch of the ray-traced shadow test right after this list.)
2. Easier development. I’m not sure this has actually been achieved yet, but in theory, once your game engine has ray-traced lighting built in, you should be able to design your scene, populate it with objects and light sources, and have it just work. The problem is that so many graphics cards still in use either outright can’t do real-time ray tracing or do it very poorly, so developers have to support the older shader approach anyway; in practice this has actually complicated, not simplified, game design.
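Here’s what that ray-traced shadow test from point 1 looks like as a rough sketch, reusing the ray_hits_sphere toy from earlier (so again, illustrative, not any engine’s real code): to decide whether a point is in shadow, you fire one extra ray from it toward the light and check whether anything gets in the way.

```python
import math  # uses ray_hits_sphere() from the earlier sketch

def in_shadow(point, light_pos, occluders):
    # Fire a ray from the surface point toward the light; if any occluding
    # sphere sits closer than the light itself, the point is in shadow.
    to_light = tuple(l - p for l, p in zip(light_pos, point))
    dist = math.sqrt(sum(c * c for c in to_light))
    direction = tuple(c / dist for c in to_light)
    for center, radius in occluders:
        t = ray_hits_sphere(point, direction, center, radius)
        if t is not None and t < dist:  # something blocks the light
            return True
    return False

# A sphere floating between the point and the light casts a shadow on it...
print(in_shadow((0, 0, 0), (0, 10, 0), [((0, 5, 0), 1.0)]))  # True
# ...but a sphere off to the side does not.
print(in_shadow((0, 0, 0), (0, 10, 0), [((5, 5, 0), 1.0)]))  # False
```

This works for shadows from anything onto anything, which is exactly why ray-traced shadows land correctly on your character’s gun without any special-case shader work.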
Reflections are a great example of this. Real, calculated reflections are a relatively new thing in video games, but lots of old games replicated the effect by building an “upside down” world, with duplicates of everything, where reflective surfaces are actually windows peering into the upside-down. It’s an OK facsimile, but it requires specific conditions to be met and can’t be applied broadly.
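The trick is easy to see in code. A minimal sketch of the idea, with the floor plane at y = 0 as my assumption (a real game would also duplicate textures, animations, and so on):

```python
def mirror_below_floor(vertex):
    # The "upside-down world" trick: every vertex in the scene gets a twin
    # flipped across the floor plane (y = 0). Render the twins through a
    # cut-out where the mirror surface is and it reads as a reflection.
    x, y, z = vertex
    return (x, -y, z)

# A lamp 2 units above the floor gets a duplicate 2 units below it.
print(mirror_below_floor((3.0, 2.0, 1.0)))  # (3.0, -2.0, 1.0)
```

You can see why it only works under specific conditions: it needs a flat mirror plane, and every reflected object has to be duplicated by hand.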
I would like to add that the way movies use ray tracing (usually called path tracing in that context) is very different from how games use it. While animated movies simulate every ray of light to build the entire image, games typically use ray tracing only for reflections and global illumination, while the rest of the image is still rendered using traditional techniques. (I’m no expert, though I’ve spent a bunch of time using Blender and playing around with Minecraft ray tracing mods.)
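As a rough illustration of that hybrid approach, here’s a toy per-pixel routine built on the earlier sketches: the cheap shader shortcut does the base lighting for every pixel, and an actual ray is traced only for pixels flagged as mirrors. The flag and the one-sphere “scene” are, of course, made up for the example.

```python
# Builds on lambert_diffuse() and ray_hits_sphere() from the sketches above.
def hybrid_pixel(normal, light_dir, is_mirror):
    # Traditional shader pass: cheap, runs for every pixel on screen.
    color = lambert_diffuse(normal, light_dir, 1.0)
    if is_mirror:
        # The expensive part, spent only where it matters: cast one real ray
        # (here, straight into our one-sphere scene) and blend in the hit.
        if ray_hits_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0) is not None:
            color = 0.5 * color + 0.5
    return color

print(hybrid_pixel((0, 1, 0), (0, 0.5, 0.866), False))  # 0.5  (matte pixel)
print(hybrid_pixel((0, 1, 0), (0, 0.5, 0.866), True))   # 0.75 (mirror pixel)
```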