Ray tracing has always been the “holy grail” of computer graphics, says Jason Ronald, program manager for the Xbox game console.
The technique renders a three-dimensional scene by calculating individual rays of light, and promises striking lighting effects with realistic reflections and shadows.
It traces where each ray bounces, gathers information about the objects it strikes, then uses that data to lay down a pixel and compose the scene.
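The core loop is simple enough to sketch. The following is a minimal, illustrative ray tracer, not any engine's actual code: it casts one ray per pixel at a single hard-coded sphere and shades it with a simple Lambertian model. The camera, sphere and light positions are arbitrary assumptions.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # direction is unit-length, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def trace_pixel(x, y, width, height):
    """Cast one ray through pixel (x, y) and shade a single sphere."""
    # Camera at the origin looking down -z; sphere and light are made up.
    px = 2 * (x + 0.5) / width - 1
    py = 1 - 2 * (y + 0.5) / height
    length = math.sqrt(px * px + py * py + 1)
    d = (px / length, py / length, -1 / length)  # unit ray direction
    center, radius = (0.0, 0.0, -3.0), 1.0
    t = ray_sphere_hit((0.0, 0.0, 0.0), d, center, radius)
    if t is None:
        return 0.0                        # ray missed: background, black
    hit = tuple(t * di for di in d)
    normal = tuple((h - c) / radius for h, c in zip(hit, center))
    light = (0.577, 0.577, 0.577)         # unit vector toward the light
    # Lambertian shading: brightness follows the cosine of the angle.
    return max(0.0, sum(n * l for n, l in zip(normal, light)))
```

A pixel aimed at the sphere returns a positive brightness; one aimed at empty space returns zero. A real tracer would follow the bounces recursively for reflections and shadows, which is exactly where the cost explodes.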
While the technique has been around for a long time, “we didn’t have the processing power to deliver this in real time,” says Ronald.
In Hollywood, special-effects houses have used ray tracing for a decade. For an important sequence, computers can churn away overnight rendering a single frame.
To do this for real-time games, that work must be compressed into a sixtieth of a second. Processing speed is now up to the task.
Technology company Nvidia announced last year that its latest graphics processing units (GPUs) will handle real-time ray tracing.
Working with Nvidia, Microsoft developed a ray tracing update for Windows 10.
Microsoft and Sony have announced that their upcoming consoles, the Xbox Series X and PlayStation 5, will support ray tracing. Both systems are based on Advanced Micro Devices (AMD) hardware.
And now the technology is being incorporated into some of the most popular games in the world.
Minecraft, which first appeared in 2009, allows players to build vast complex structures. Developed by the Swedish game studio Mojang, it is currently the best-selling video game in history.
Minecraft’s producers released a ray-traced version of the game on April 16. A general release will follow towards the end of the year.
“It looks very different from traditional rendering mode and looks better,” says PC Gamer hardware editor Jarred Walton.
The big problem for now, he says, will be the price barrier. “The only way to play it is with a PC that has a graphics card that costs at least $300 (£240),” he says.
Until now, developers have used another technique called rasterization.
It first appeared in the mid-1990s and is extremely fast: it represents 3D shapes as triangles and polygons, and the surface closest to the viewer determines each pixel.
Hence, programmers must use tricks to simulate the appearance of lighting. This includes light maps, which calculate surface brightness in advance, says Ronald.
But these hacks have limitations. They are static, so they fall apart when things move. For example, you could walk up to a mirror and find that your reflection has disappeared.
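The “closest surface wins” rule at the heart of rasterization can be sketched as a depth buffer. This is an illustrative fragment, not real engine code; the fragment list is a stand-in for the output of triangle scan conversion.

```python
def rasterize(fragments, width, height):
    """Z-buffer: for each pixel, keep only the fragment closest to the viewer.

    `fragments` is a list of (x, y, depth, colour) tuples, an illustrative
    stand-in for the fragments produced by scan-converting triangles.
    """
    depth = [[float("inf")] * width for _ in range(height)]
    image = [[None] * width for _ in range(height)]
    for x, y, z, colour in fragments:
        if z < depth[y][x]:        # smaller depth = closer to the camera
            depth[y][x] = z
            image[y][x] = colour
    return image
```

Note that nothing here knows about light: each pixel is just the nearest surface's colour, which is why lighting has to be faked afterwards with tricks like light maps.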
Programmable shaders started appearing around 2001. They did a much better job of lighting 3D scenes but required much more computational power.
“If we put all of this into one game, the world’s most amazing processor would have gone, ‘no, it’s too much’,” says Ben Archard, senior rendering programmer at Malta’s 4A Games, the studio behind the 2019 post-apocalyptic game Metro Exodus.
There were ways to get around this. If a programmer wanted to simulate hazy light filtering through fog, instead of calculating every point, they could calculate just a sample of them. (These are called stochastic, statistical or Monte Carlo approaches.)
But with these alternatives, “you quickly lose that realism in a scene,” notes Kasia Swica, senior program manager at Minecraft, based in Seattle.
Ray tracing works best for realistic real-time shadows, or for reflections in water or glass.
“My favorite thing to do with ray tracing is to go underwater,” says Ms Swica.
“You get really realistic reflections and refractions, and even clean beams of light,” she says.
With lockdowns worldwide due to the coronavirus pandemic, the need for people to feel close while isolated “is about to accelerate” advances in the technology, says Rev Lebaredian, vice president of simulation technology at Nvidia in San Francisco.
“With virtual and augmented reality, we are starting to feel together in the same place,” he says.
So coronavirus will drive demand and progress, agrees Frank Azor, chief architect of AMD’s gaming solutions.
A “diabolical problem” for ray tracing involved how shaders can call other shaders when two rays interact, says Andrew Goossen, a Microsoft engineer working on the Xbox Series X.
GPUs work on problems like rays in parallel, and making parallel processes talk to one another is complex.
Solving technical problems to improve ray tracing will be the main task “at least in the next 5-7 years of computer graphics,” says Ronald.
In the meantime, game companies will use other techniques to make games sharper.
Earlier this month, Epic Games, the producer of Fortnite, unveiled its latest game engine, Unreal Engine 5.
It uses a combination of techniques, including a library of objects with hundreds of millions of polygons that can be imported into games, and a hierarchy of detail that treats large and small objects differently to reduce demands on the processor.
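The level-of-detail idea can be illustrated with a toy selector: nearby objects get the finest mesh, distant ones a coarser stand-in. The distance thresholds here are made up; real engines typically choose detail from an object's projected screen-space size rather than raw distance.

```python
def pick_lod(distance, thresholds=(10.0, 50.0, 200.0)):
    """Pick a level-of-detail index: 0 is the finest mesh.

    `thresholds` are illustrative cut-off distances; anything beyond the
    last one gets the coarsest representation.
    """
    for level, limit in enumerate(thresholds):
        if distance < limit:
            return level
    return len(thresholds)
```

Swapping meshes this way spends polygon budget where the viewer can actually see it, which is how engines ship scenes far heavier than any processor could draw at full detail.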
For most game makers, these tricks will be pretty good, Walton says.