How far away are we from a real-time denoiser for Cycles preview?

I've been playing Quake 2 RTX lately and its denoiser works very well. I've been wondering: how difficult would it be to bring a real-time denoiser to Cycles for preview purposes?

Not long at all:

Define “real-time”? With OptiX viewport denoising it takes a few seconds for the image to clear up on a metallic surface; hair and SSS take a bit longer. It is super useful for hard-surface modeling. E-Cycles RTX already has viewport denoising.

As far as real-time like Quake 2 RTX goes, that's probably 3–5 years away. We need better denoising and more Tensor cores. This year we should see the RTX 3000 series, with more Tensor cores among other things. Maybe even 20 GB on the Ti version.

Well, real-time denoising like in Quake is possible, and we don't need any better hardware at all, as you can see… by playing Quake.

We just need a more real-time method to be added. The denoisers in Blender aren't the same as the real-time one in Quake, which is why they don't work in real time.
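For context on what that real-time method does differently: Quake 2 RTX reportedly uses an A-SVGF-style filter, whose core trick is accumulating samples across frames instead of waiting for one frame to converge. A toy sketch of just that temporal-accumulation idea (the `temporal_accumulate` helper and the `alpha` value are illustrative, not Blender's or Quake's actual code):

```python
import numpy as np

def temporal_accumulate(history, new_sample, alpha=0.2):
    """Blend the new noisy frame into the accumulated history.

    A small alpha keeps more history (smoother result, but more
    ghosting under motion). This is the temporal half of filters
    like SVGF; real implementations also reproject the history
    along motion vectors and add a spatial blur pass.
    """
    return (1.0 - alpha) * history + alpha * new_sample

# Toy example: a constant signal of 1.0 corrupted by noise
# converges toward 1.0 as frames accumulate.
rng = np.random.default_rng(0)
frame = np.zeros((4, 4))
for _ in range(100):
    noisy = 1.0 + rng.normal(0.0, 0.5, size=frame.shape)
    frame = temporal_accumulate(frame, noisy)
```

The point is that each displayed frame only pays for one cheap blend, which is why this scales to 60 fps while per-frame convergence does not.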

Still 3–5 years away for the software (denoising) and hardware to get real-time denoising on a complex Cycles scene. Maybe we'll see it first in a Vulkan version of EEVEE.

Quake isn't really a good example, because it's an old game. You can kind of do real-time denoising, looking at UE4, but it's still very slow; it takes a few seconds to converge on a noise-free image.

The video game Control has denoising along with real-time GI, and it still has slight noise during gameplay.

Can someone clarify the denoising process a bit? Is it just post-processing, so it has nothing to do with the complexity of the scene itself, or is it actually part of the rendering process (in the sense that it uses some data from the rendering process to finish the denoising)?
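It's the latter: denoisers like OIDN and OptiX take auxiliary passes from the renderer (albedo, shading normals) as guides, so they are more than a pure post-process on the beauty image. A toy illustration of why a guide pass matters, using a made-up `cross_bilateral` helper guided by a (scalar, simplified) normal buffer (not Blender's actual filter):

```python
import numpy as np

def cross_bilateral(noisy, normals, radius=2, sigma_n=0.1):
    """Average neighbouring pixels, but only trust neighbours whose
    shading normal is similar, so edges between different surfaces
    stay sharp. The normal buffer comes from the renderer itself,
    which is why denoising isn't a pure post-process.
    """
    h, w = noisy.shape
    out = np.zeros_like(noisy)
    for y in range(h):
        for x in range(w):
            acc, wsum = 0.0, 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        d = normals[y, x] - normals[ny, nx]
                        wgt = np.exp(-(d * d) / (2 * sigma_n ** 2))
                        acc += wgt * noisy[ny, nx]
                        wsum += wgt
            out[y, x] = acc / wsum
    return out
```

If you run this on a noisy image split into two flat regions with different normals, the noise inside each region is smoothed while the edge between them is preserved, because the normal difference drives the cross-region weights to ~0.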

I mean, that’s cool. No one expects or even wants perfection, they just want a decent preview. You’re not gonna be doing final renders with the hypothetical viewport denoiser.

Is there any reason compositor nodes cannot be executed in the viewport at the end of the preview render run?

The compositor denoiser is very fast.

True, but EEVEE is useful for animation and stills; couple that with real-time ray tracing and denoising and you open up new doors for working. I guess it depends on what you mean by final-frame rendering; people were using Unreal Engine 4 for rendering TV shows before ray tracing was introduced. I think we'll see a real-time denoiser in the future, like 3–5 years, once the hardware is faster and the software has had many years of training and improvement.

That's the thing. The moment you do an offline render, even with EEVEE, you have temporal information both ways: not just the current frame and the previous frame, but the next frame too. This would have a profound effect on the effectiveness (lol) of current cutting-edge TAA techniques.

Also, if you're dumping frames, you're not wanting anything in real time anyway. For a real-time solution one could take an OpenGL-compatible shader, plop it into EEVEE, and we'd be done.