DeepDenoiser for Cycles

My goal is that there is only one setting, which would be called something like “Preserve Details”. If the denoiser is uncertain how to denoise some pixels, you would be able to decide whether those pixels should be blurred or whether you prefer to keep some noise. Have a look at the video on this page (“Preserve Details” would be what they call “Lambda”):
http://drz.disneyresearch.com/~jnovak/publications/KPAL/index.html

It would be ideal to have the denoiser as a node in the compositor. This would give the user the most freedom and access to all passes, and the user could even choose whether the noisy or the denoised passes should be used.

1 Like

A node would be great. One could create a mask to determine where to blend in the denoised render.
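For what it’s worth, the mask-blend idea is easy to sketch outside the compositor too. Here is a minimal NumPy illustration; the function name and array layout are my own assumptions, not anything from Blender’s actual API or the denoiser:

```python
import numpy as np

# Hypothetical sketch of what a "denoise blend" compositor node could do:
# mix the noisy and denoised passes per pixel with a user-painted mask.
# Nothing here is Blender's real API; all names are illustrative only.

def blend_denoised(noisy, denoised, mask):
    """Per-pixel blend: mask = 1.0 keeps the denoised pixel,
    mask = 0.0 keeps the noisy one.
    noisy/denoised have shape (H, W, C); mask has shape (H, W)."""
    mask = mask[..., np.newaxis]      # broadcast the mask over color channels
    return mask * denoised + (1.0 - mask) * noisy

# Tiny 1x2-pixel example:
noisy    = np.array([[[1.0, 1.0, 1.0], [0.2, 0.2, 0.2]]])
denoised = np.array([[[0.5, 0.5, 0.5], [0.6, 0.6, 0.6]]])
mask     = np.array([[1.0, 0.0]])     # denoise the left pixel only
result   = blend_denoised(noisy, denoised, mask)
```

A scalar “Preserve Details” slider would be the special case where the mask is a single constant value for the whole frame.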

1 Like

This scene basically crashes Blender, and the camera is pulled way off; even when I render it, it doesn’t look right. Do you have another scene that isn’t as complicated and doesn’t need supporting directories?

Let’s pick one from BlendSwap. Let me know before you start to render it, because I used some scenes for the training which would result in an unfair comparison.

It doesn’t matter much unless you try it on an anime scene…

If you can point me to such a scene for Cycles, that would be great. I am curious to see how well or not it works :slight_smile:

Okay, I’ll look for one; I’m out for a bit now. For the record, I’m not trying to show that your denoiser is not as good. In fact, I think just the opposite is true, but I think it’s an interesting and valid comparison.

1 Like

Those results are seriously impressive!

2 Likes

Thanks a lot!

1 Like

amazing work

1 Like

Now that will make some noise.
Pun intended.

1 Like

On point :smiley:

Be assured that I didn’t get the impression you want to rip this project apart.

I am absolutely certain that there are many cases where the DeepDenoiser performs worse. Right now, I don’t know what exactly those weaknesses are. The earlier I am aware of them, the sooner I can look out for those kinds of cases to improve them. That’s why I welcome those early experiments.

Good, you know me, I’m super pro-AI, but also a stickler for information! I have a couple of scenes of my own, both of which won the weekend challenge, that I thought I might like to compare. But the more I think about how to compare them, the more complicated it gets, because it’s not just quality, it’s time and quality. And I think the time metric is one that needs to be included in any comparison, but of course that depends on your system too…
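One way to make the “time plus quality” comparison concrete is to score each result against a high-sample-count reference render and report that number next to the wall-clock time it took. A minimal sketch using PSNR; all names and values below are illustrative placeholders, not from the project:

```python
import math

# Compare denoising results by quality (PSNR against a reference render)
# and by the time each result took to produce. Illustrative values only.

def psnr(reference, test, peak=1.0):
    """Peak signal-to-noise ratio in dB between two equal-length flat
    pixel lists; higher means closer to the reference."""
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0.0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak ** 2 / mse)

# Hypothetical results: (label, pixel values, seconds to produce them)
reference = [0.2, 0.4, 0.6, 0.8]
candidates = [
    ("noisy, 64 spp",    [0.25, 0.35, 0.65, 0.75], 30.0),
    ("denoised, 64 spp", [0.21, 0.39, 0.61, 0.79], 42.0),
]
for label, pixels, seconds in candidates:
    print(f"{label}: {psnr(reference, pixels):.1f} dB in {seconds:.0f} s")
```

PSNR is of course only one possible metric; the point is just to put quality and time side by side for the same scene.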

1 Like

Right, time is important! While a minute of denoising per frame seems long for motion graphics shots, for example, it’s basically negligible for a VFX shot where render times can be several hours per frame.
So the importance of time depends on the field of use, I’d say.

1 Like

Your work is simply amazing! I haven’t felt this excited about Blender’s denoising progress in a long time. The quality of the restoration is superb.

I wonder if the algorithm somehow knows a pixel is “empty” (not sampled yet) and tries to recreate it. Because the result is so good: areas with a high sample count don’t seem to be negatively affected by the effect, while areas with a low sample count are hugely affected in a good way.

About discovering the weaknesses, I think you could try something with moiré patterns like


Image from here. I think they usually help to show problems with denoising algorithms in general.

Moiré is not a denoising problem but an artifact of using too little output resolution to capture all the input detail: undersampling in the spatial domain, if you like. It’s like trying to capture sound above a recording’s Nyquist frequency. All you can do is low-pass (i.e. blur) the pixels.
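The Nyquist analogy can be shown in a few lines of Python (the frequencies here are arbitrary illustration values, nothing specific to rendering): a sine sampled above the Nyquist rate folds down to a lower alias frequency, just as a too-fine texture aliases into moiré on a coarse pixel grid.

```python
import math

# With 10 samples per second, the Nyquist frequency is 5 Hz, so a 13 Hz
# sine folds to |13 - 10| = 3 Hz and is indistinguishable from a real
# 3 Hz signal. Illustrative values only.

def sample(freq_hz, rate_hz, n):
    """Take n samples of a unit sine wave of the given frequency."""
    return [math.sin(2 * math.pi * freq_hz * i / rate_hz) for i in range(n)]

rate = 10.0                    # sample rate -> Nyquist frequency is 5 Hz
low  = sample(3.0, rate, 20)   # below Nyquist: captured faithfully
high = sample(13.0, rate, 20)  # above Nyquist: aliases down to 3 Hz
```

Once the samples are taken, no algorithm can tell the two signals apart, which is the sense in which moiré is not something a denoiser can undo.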

That’s exactly the reason why moiré may be a good way to surface denoising issues. I’m not saying moiré is a denoising issue, just that scenes with [objects that cause] moiré are rich in high-frequency details and need to be rendered with a high sample count to get good results. So this denoising algorithm could have difficulty handling scenes like this at a low sample count.

The moiré effects aren’t from not having enough samples, though; they come from not having enough pixels.

Moiré issues would be more important in an upscaler than in a denoiser.

You can render a scene with [objects that can produce] moiré at any resolution and get a moiré-free result. That just depends on the [over]sample count.

I see what you’re saying. In the old days of Blender Internal, we needed to render an image with 4, 8, or 16 samples per pixel (AA) to reduce moiré artifacts, and that’s like rendering a higher-resolution image and downsizing it back. Now in Cycles we just have samples… But that’s off topic.

I’m just suggesting trying scenes with high-frequency details with the new algorithm. It doesn’t need to be moiré… :-\
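A toy version of that old AA trick, just to make the “render bigger, size down” point concrete (a 1-D box filter over made-up pixel values, not Cycles’ actual sampling):

```python
# Rendering with extra samples per pixel acts like rendering at a higher
# resolution and averaging back down. Here a 1-D "image" that alternates
# every high-res pixel is box-filtered to half resolution: instead of
# aliasing into a false pattern, it averages out to flat gray.
# Purely illustrative values.

def downsample_2x(high_res):
    """Average adjacent pixel pairs (a 1-D box filter)."""
    return [(high_res[2 * i] + high_res[2 * i + 1]) / 2.0
            for i in range(len(high_res) // 2)]

high_res = [1.0, 0.0] * 4     # fastest-possible alternating pattern
low_res  = downsample_2x(high_res)
```

A blurry-but-stable gray is exactly the low-pass outcome described above: the high-frequency detail is gone, but no false pattern appears at the output resolution.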

3 Likes