Nvidia has announced a gorgeous new update to its Deep Learning Super Sampling technology. DLSS 3.5 will drop in the next few months and promises to improve both the look of a final ray traced scene as well as deliver a welcome little performance bump, too. And how is it doing this? Why, AI, of course.
The addition of Ray Reconstruction to the DLSS 3.5 pipeline isn’t going to be a ring-fenced technology—as Frame Generation is for the RTX 40-series cards—it’s going to be available to anyone using an RTX generation GPU. Though, notably, DLSS itself is still platform-specific, so it’s obviously still an Nvidia-only performance and fidelity boon.
Effectively what it’s doing is replacing the denoisers in the graphics pipeline with this AI-powered Ray Reconstruction element that will work alongside the traditional Super Resolution upscaling workflow.
It’s this removal of the standard denoisers that delivers the slight performance increase, but it’s the improvement in overall image fidelity that marks the most stark difference between the current method and DLSS 3.5.
While ray tracing tracks a huge number of rays firing all over a game scene in real time, today’s GPUs simply don’t have the power to deliver a 1:1 ratio of rays to pixels. That’s where the denoisers come in, taking all those little dots of light, those individual rays, and blending the lighting information between them to fill a scene so it doesn’t look so damned noisy.
But they’re pretty dumb, using heuristics to decide how to smudge the dots together, and even with hand-tuned parameters they can easily lose a whole lot of information and deliver a comparatively blurred final result.
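To make that concrete, here’s a toy sketch of the idea—nothing like a production denoiser, just the core notion of blending sparse ray samples together with hand-tuned parameters (the kernel radius and sigma here are the kind of fixed heuristics the article describes, and widening them is exactly what smears away detail):

```python
import numpy as np

def heuristic_denoise(noisy, mask, radius=2, sigma=1.0):
    """Fill and smooth a sparsely sampled image by blending nearby samples.

    noisy:  2D array of radiance values (zero where no ray landed)
    mask:   2D bool array, True where a ray actually hit
    radius, sigma: hand-tuned 'heuristic' parameters -- a wider kernel
    hides noise better but blurs away real lighting detail.
    """
    h, w = noisy.shape
    out = np.zeros_like(noisy, dtype=float)
    for y in range(h):
        for x in range(w):
            acc = weight = 0.0
            # Blend every real sample within the window, weighted by distance.
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and mask[ny, nx]:
                        wgt = np.exp(-(dx * dx + dy * dy) / (2 * sigma ** 2))
                        acc += wgt * noisy[ny, nx]
                        weight += wgt
            out[y, x] = acc / weight if weight > 0 else 0.0
    return out
```

A real-time denoiser works on the GPU with far cleverer guidance (depth, normals, motion), but the fixed-weight averaging above is the “dumb” part: it has no idea which samples actually belong together.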
It looks better than a noisy scene, especially when it’s in motion, but it can also introduce visual artefacts, or ‘boiling’ images where the denoisers are changing their minds on what a particular low-image-data area of a scene ought to look like.
The AI-based Ray Reconstruction, however, is designed to recognise patterns within a noisy image, using temporal data from multiple frames, to be able to better deliver a consistently clear image.
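The temporal side of that can be sketched as a simple exponential moving average over frames—a toy stand-in for what real pipelines do with motion-vector reprojection, and for the hand-tuned blend that Ray Reconstruction swaps out for a trained network:

```python
import numpy as np

def temporal_accumulate(history, current, alpha=0.1):
    """Blend the current noisy frame into an accumulated history.

    A small alpha trusts the history more, so per-frame noise averages
    out over time. Real pipelines also reproject the history with motion
    vectors so it lines up under camera movement; this sketch skips that.
    """
    return (1.0 - alpha) * history + alpha * current

# Toy demo: noisy samples of a constant signal converge toward the truth.
rng = np.random.default_rng(1)
truth = 0.5
history = np.zeros(4)
for _ in range(200):
    frame = truth + rng.normal(0.0, 0.1, size=4)  # one noisy frame
    history = temporal_accumulate(history, frame)
```

After a couple of hundred frames the accumulated buffer sits close to the true value despite heavy per-frame noise, which is why extra temporal data buys a cleaner, more stable image.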
Inevitably, the example Nvidia has given is of Cyberpunk 2077, from the upcoming Phantom Liberty DLC. That’s going to be one of the first few DLSS 3.5 games in September, with Portal with RTX and Alan Wake 2 following later on. But the difference in fidelity is obvious from the demo Nvidia has given, with the lighting we’ve been used to, even on the path traced Overdrive mode, looking faintly ridiculous by comparison.
It’s most obvious in motion, where the denoisers would previously smooth out a huge amount of the lighting data as you move through a scene. In one scene the camera moves beneath a series of changing, differently coloured lights. Without Ray Reconstruction the reflected light on the floor has a muted reproduction of the colours above as they cycle through them, while the extra temporal data and AI reconstruction of DLSS 3.5 accurately maps the colours and creates a more distinct, vibrant, more realistic effect.
As we mentioned before, the performance increase comes purely from removing the denoiser stage from the pipeline, and it’s going to be game dependent. Path tracing is likely to see the biggest benefit in terms of a performance increase, while more standard hybrid real-time ray tracing might see a limited improvement. It’s worth noting, however, that where Frame Generation can improve frame rates but at a cost to latency, that’s not an issue for Ray Reconstruction.
The only impact you’ll see on latency from this new technology is an improvement commensurate with any frame rate gain. Which basically makes it one of those no-brainer technologies that you’re going to want to have turned on wherever you see the checkbox.
Though, admittedly, that’s not going to be too many games at launch. And it’s not going to be a simple case of dragging and dropping new DLSS 3.5 .dll files into existing games; there’s a change in the pipeline because of that removed denoiser stage, and that takes some per-game work. But I’m sure that won’t stop people trying…