Why it matters: Earlier stages of Nvidia's DLSS pipeline have used AI to reconstruct pixels and generate new frames. Now the company is applying the same technology to approximate rays in ray-traced games, improving both image quality and efficiency. The new functionality supports all RTX GPUs.
This week, Nvidia released DLSS 3.5, introducing a new feature called Ray Reconstruction that targets denoising, a crucial step in the ray tracing pipeline. The technique enhances lighting quality with a minor performance gain or cost, depending on the game.
Casting enough rays for every pixel is too computationally intensive for applications that use ray or path tracing, so they shoot only enough rays to approximate each frame. This undersampling can leave images looking spotty or noisy.
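The noise comes from estimating each pixel from too few random samples. A minimal sketch of the idea (illustrative only; a real path tracer traces rays through scene geometry, not random numbers):

```python
import random

def shade_pixel(samples: int, true_brightness: float = 0.5) -> float:
    """Estimate a pixel's brightness by averaging random ray samples.

    Each 'sample' stands in for one ray's noisy contribution. With few
    samples the estimate scatters widely (a noisy image); with many it
    converges on the true value (clean but expensive).
    """
    total = 0.0
    for _ in range(samples):
        # Each sample lands somewhere between 0 and 2x the true value.
        total += random.uniform(0.0, 2.0 * true_brightness)
    return total / samples

random.seed(42)
# One ray per pixel: estimates scatter widely around 0.5.
low = [shade_pixel(1) for _ in range(8)]
# Many rays per pixel: estimates cluster tightly near 0.5.
high = [shade_pixel(256) for _ in range(8)]
```

Since shooting hundreds of rays per pixel in real time is infeasible, games live with the noisy low-sample estimate and clean it up afterward.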
Numerous denoising techniques try to fill the gaps, but they introduce flaws of their own, such as ghosting, or drop effects like ambient occlusion entirely. Furthermore, upscaling, which is usually needed to offset ray tracing's steep performance cost, can interfere with denoising.
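The basic trade-off behind those flaws can be seen in even the crudest denoiser, a spatial averaging filter: the same averaging that suppresses noise also smears genuine detail. A one-dimensional sketch (illustrative; production denoisers are far more sophisticated):

```python
def box_denoise(row: list[float], radius: int = 1) -> list[float]:
    """Average each pixel with its neighbors to suppress noise.

    The averaging that removes random noise also blurs sharp features,
    which is the core weakness that drives smarter, AI-based denoisers.
    """
    out = []
    for i in range(len(row)):
        lo = max(0, i - radius)
        hi = min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

# A hard shadow edge: bright region, then dark.
edge = [1.0, 1.0, 1.0, 0.0, 0.0, 0.0]
blurred = box_denoise(edge)
# The crisp edge becomes a gradient: noise would be reduced, but so is detail.
```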
Ray Reconstruction combines a game's disparate denoising techniques into a unified step, working with the upscaling process instead of against it to provide more comprehensive ray tracing. With the new technology, games with many ray tracing features, like Cyberpunk 2077, could see slightly higher frame rates. However, titles with comparatively light RT implementations may suffer a minor performance drop.
Nvidia's new feature supports the RTX 2000, 3000, and 4000 graphics card lines. It will debut later this year in Cyberpunk 2077, Alan Wake II, Portal with RTX, Chaos Vantage, and D5 Renderer. The company will unveil Alan Wake II's use of path tracing in a video demonstration of Ray Reconstruction on August 23. The technology will also feature in an in-development path-traced remaster of Half-Life 2.
Another product Nvidia introduced this week is NeMo SteerLM, a toolchain that lets developers use the company's Avatar Cloud Engine (ACE) AI models. Nvidia demoed the ACE large language model at Computex by showing a virtual character holding a dynamic conversation with a user.
NeMo SteerLM allows game NPCs to deliver appropriate responses based on individual attributes. They can also fluidly react to changes in the story and the world. There's no information yet on which developers or games will eventually use the toolchain, but mods have tried to apply the same fundamental concept to titles like Mount & Blade.
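The general idea of steering an LLM-driven NPC with attributes can be sketched as conditioning the model's prompt on a persona and the current world state. This is a hypothetical illustration, not NeMo SteerLM's actual API; all names here are made up:

```python
def build_npc_prompt(persona: dict, world_state: str, player_line: str) -> str:
    """Assemble a prompt that conditions an LLM reply on NPC attributes.

    Hypothetical sketch of attribute-steered NPC dialogue: the persona
    traits and world state are folded into the prompt so the model's
    response stays in character and reacts to the current situation.
    """
    traits = ", ".join(f"{key}: {value}" for key, value in persona.items())
    return (
        f"You are an NPC with these attributes: {traits}.\n"
        f"Current world state: {world_state}\n"
        f"Player says: {player_line}\n"
        "Reply in character:"
    )

prompt = build_npc_prompt(
    {"name": "Mara", "mood": "wary", "faction": "smugglers"},
    "the city gates just closed for curfew",
    "Can you get me inside the walls?",
)
```

The prompt would then be sent to whatever language model backs the NPC; changing a single attribute (say, `mood`) steers the reply without retraining anything.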