Get ready for the ‘holy grail’ of computer graphics

Ray tracing has always been the “holy grail” of computer graphics, says Jason Ronald, head of program management for the gaming console Xbox.

The technique renders a three-dimensional scene by simulating individual rays of light, promising stunning lighting effects with realistic reflections and shadows.

The method traces each ray from the viewer into the scene, finds where it bounces, collects information about the objects it strikes, then uses that information to lay down a pixel and compose the scene.
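That per-pixel process can be sketched in a few lines of Python. This is a toy example assuming a single sphere and a single directional light, not any engine's actual code: one ray is fired from the camera, intersected with the sphere, and shaded by how directly the surface faces the light.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None on a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is assumed normalised, so a = 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, sphere, light_dir):
    """Colour one pixel: brightness proportional to how directly the surface faces the light."""
    t = intersect_sphere(origin, direction, sphere["center"], sphere["radius"])
    if t is None:
        return 0.0  # the ray missed everything: background
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = tuple((h - c) / sphere["radius"] for h, c in zip(hit, sphere["center"]))
    # Lambertian shading: dot product of surface normal and light direction.
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

sphere = {"center": (0.0, 0.0, -3.0), "radius": 1.0}
light = (0.0, 0.0, 1.0)  # light shining from the camera's side
# Fire one ray straight down the -z axis from the origin (the "camera").
brightness = shade((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), sphere, light)
print(brightness)  # the ray hits the sphere head-on, so full brightness: 1.0
```

A real renderer repeats this for every pixel on screen, and recursively for reflected and shadow rays, which is what makes the technique so expensive.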

While the techniques have been around for a long time, “we just haven’t had processing power to deliver all of that in real time”, Mr Ronald says.

In Hollywood, special-effects studios have used ray tracing for a decade. For an important sequence, computers could churn away overnight to produce a single frame.

To do this for games in real time, you need to condense that to 1/60th of a second. Processing speed has now caught up to the task.

Tech company Nvidia announced last year that its latest graphics processing units (GPUs) would handle real-time ray tracing.

Working with Nvidia, Microsoft developed a ray tracing update to Windows 10.

Microsoft and Sony have announced their upcoming consoles, the Xbox Series X and the PlayStation 5, will have ray tracing capabilities. Both of these systems are built on Advanced Micro Devices (AMD) hardware.

And now the tech is being incorporated into some of the world’s most popular games.

Minecraft, which first appeared in 2009, allows players to construct vast, intricate structures. Developed by Swedish game studio Mojang, it is currently the best-selling video game in history.

Minecraft’s makers released an early ray-tracing version of their game on 16 April. A general release will follow near the end of the year.

“It looks very different from the traditional rendering mode, and it looks better,” says Jarred Walton, hardware editor at PC Gamer.

The big problem, he says, will be price barriers for now. “The only way you can play it is with a PC that has a graphics card that costs at least $300 (£240),” he says.

Until now, developers have relied on another technique, called rasterisation.

Rasterisation took hold in the mid-1990s and is extremely quick. It represents 3D shapes as meshes of triangles and polygons; at each pixel, the surface nearest the viewer determines the colour.
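That "nearest surface wins" rule is a depth test, and it can be illustrated with a toy rasteriser. As an assumption for brevity, each shape here is reduced to a pre-computed set of covered pixels plus a depth; a real rasteriser would scan-convert actual triangles.

```python
def rasterise(shapes, width, height):
    """Flatten shapes onto a pixel grid; the surface nearest the viewer keeps each pixel."""
    depth = [[float("inf")] * width for _ in range(height)]
    image = [["."] * width for _ in range(height)]
    for shape in shapes:
        for (x, y) in shape["pixels"]:
            # Depth test: draw only if this shape is closer than what is already there.
            if shape["depth"] < depth[y][x]:
                depth[y][x] = shape["depth"]
                image[y][x] = shape["colour"]
    return image

near = {"pixels": {(1, 1)}, "depth": 1.0, "colour": "N"}
far = {"pixels": {(1, 1), (2, 1)}, "depth": 5.0, "colour": "F"}
image = rasterise([far, near], 4, 3)
print("".join(image[1]))  # prints ".NF." - the near shape wins where the two overlap
```

Notice that no light is simulated at all here: rasterisation only decides which surface is visible, which is why the lighting tricks described next are needed.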

Then, programmers have to employ tricks to simulate what lighting looks like. That includes lightmaps, which calculate the brightness of surfaces ahead of time, says Mr Ronald.
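The idea behind a lightmap can be sketched as a precomputed lookup table. This is a toy model, assuming a simple inverse-square falloff from a single light and a handful of surface points; it is not Minecraft's or any engine's actual baking code.

```python
def bake_lightmap(surface_points, light_pos):
    """Precompute brightness for each surface point, done once ahead of time."""
    lightmap = {}
    for p in surface_points:
        d2 = sum((a - b) ** 2 for a, b in zip(p, light_pos))
        lightmap[p] = 1.0 / (1.0 + d2)  # brightness falls off with squared distance
    return lightmap

# Bake once at build time...
baked = bake_lightmap([(0, 0, 0), (3, 0, 0)], light_pos=(0, 4, 0))
# ...then at run time, lighting a point is just a dictionary lookup.
print(round(baked[(0, 0, 0)], 3))  # prints 0.059, i.e. 1/(1+16)
```

Because the table is computed once, it stays correct only as long as the lights and surfaces never move, which is exactly the limitation described below.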

But these hacks have limitations. They are static, so they fall apart when you move around. For example, you might zoom in on a mirror and find that your reflection has disappeared.