DeepDenoiser for Cycles

Thanks for the kind words.

Moiré effects are certainly very difficult, and I haven’t tested them at all so far. If you or someone else sends me an example with high-frequency details that I can just render (and denoise), I will give it a try and post the results here. I would be surprised if the results were as good as the ones I have previously posted.

1 Like

Thanks to @Photox for providing the scene and combining the renders into those images!

WARNING
This is the first time I am publicly showing a direct comparison between the DeepDenoiser and Cycles’ denoiser. I want everyone to be aware that this is an unfair and unrepresentative comparison! It only shows the denoisers on one kind of image with the default settings. I am confident that the results for Cycles’ denoiser could be improved! This comparison also ignores performance and all sorts of other factors that are essential for an accurate picture!



14 Likes

What is the memory footprint of the DeepDenoiser and how is the performance?
Do you think it could run in the viewport/in real time, similar to OptiX?
Would it work with very high resolutions, like 10K x 10K pixels?
Lastly, is there a way to test this on my own?

As far as I know, Cycles doesn’t do any adaptive sampling, so all pixels should have exactly the same number of samples.

@DeepBlender I think it would be good to also show the ground truth (a noise-free result without denoising).

I can’t give you precise numbers on that one yet.
The memory footprint is definitely significant. All passes/AOVs are split into tiles and are denoised separately. Without that, it wouldn’t be possible for me to denoise on my GTX 850M due to memory restrictions. On the CPU, it would work, but it is considerably slower.
All renders I have posted in this whole thread took somewhere between 1 and 3 minutes to denoise. This includes some performance overheads I have to get rid of. I can’t give exact numbers, because I am still running everything in a debug-like mode. I am working on an optimized version, though, which doesn’t contain those ugly overheads.

It could definitely be used in the viewport, but I have no idea yet how fast it would be. Besides the performance overhead, I also haven’t optimized it yet.

Yes. As mentioned, it automatically tiles the passes/AOVs, and because of this, there are no restrictions on resolution.
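To give a rough idea of what the tiling described above could look like in practice, here is a minimal sketch (the tile size, overlap, and `denoise_tile` function are placeholders for illustration, not the actual DeepDenoiser code):

```python
import numpy as np

def denoise_tiled(passes, denoise_tile, tile_size=256, overlap=32):
    """Split all passes/AOVs into overlapping tiles, run the denoiser on
    each tile separately, and stitch the denoised color back together.

    passes:        dict mapping pass name -> float32 array of shape (H, W, C)
    denoise_tile:  placeholder for the network; takes a dict of tile crops
                   and returns the denoised color tile of shape (h, w, 3)
    """
    height, width = next(iter(passes.values())).shape[:2]
    result = np.zeros((height, width, 3), dtype=np.float32)

    for y in range(0, height, tile_size):
        for x in range(0, width, tile_size):
            # Expand the crop by the overlap so the network sees some
            # context around the tile and seams are less visible.
            y0, y1 = max(0, y - overlap), min(height, y + tile_size + overlap)
            x0, x1 = max(0, x - overlap), min(width, x + tile_size + overlap)

            tile_inputs = {name: img[y0:y1, x0:x1] for name, img in passes.items()}
            denoised = denoise_tile(tile_inputs)

            # Keep only the inner, non-overlapping region of each tile.
            ih = min(tile_size, height - y)
            iw = min(tile_size, width - x)
            result[y:y + ih, x:x + iw] = denoised[y - y0:y - y0 + ih, x - x0:x - x0 + iw]

    return result
```

In a sketch like this, only one tile’s worth of data has to sit on the GPU at a time, which is why memory usage stays bounded regardless of the output resolution; the overlap exists so the stitched result doesn’t show visible seams at tile borders.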

I am working on that. It is really painful to use right now. That’s why I am simplifying it, which most likely has the added benefit that I will get rid of some of the overheads as well.

4 Likes

Thanks for pointing that out. I am going to start the rendering right now and will add it to the post.

1 Like

Looks great! The Blender denoiser, at 16 samples, looks like an iguana took a dump in the coffee, whereas the DeepDenoiser is shockingly clean.


Thanks for helping me out with that!

1 Like

The latest coffee renders now have a 10,000-sample converged ground truth shown with them, 6 posts up.

Are all those images rendered with the same settings/color grading?
There are some sprinkles that only seem to appear in the 10,000-sample images, and, more noticeably, the noisy and denoised images all seem to use the Default color grading while the 10,000-sample image uses Filmic.

Is this the case, or do the pixels actually get darker on the whipped cream as you raise the sample count?
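For anyone who wants to rule out a color-management mismatch in their own comparisons, the scene’s view transform can be checked and pinned from Blender’s Python console (a minimal sketch; the available view transforms depend on the Blender version):

```python
import bpy

scene = bpy.context.scene

# Inspect the color management currently applied to the render.
print(scene.view_settings.view_transform)  # e.g. 'Default' or 'Filmic'
print(scene.view_settings.look)
print(scene.view_settings.exposure, scene.view_settings.gamma)

# Pin the same view transform before rendering all comparison images.
scene.view_settings.view_transform = 'Filmic'
```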

2 Likes

Uh oh. The 10,000-sample image was done on my computer. I had packed the file and sent it to DeepBlender, who rendered the other ones. Perhaps DeepBlender should render a 10,000-sample raw image on his system to match the others; then I can update the combined reference images.

I am on it :slight_smile:

@lolwel21, thanks for pointing that out!

Stunning results, thank you for your work. The shadow at 16 spp looks extremely good; I think the artificial thing is showing its intelligence here!?

2 Likes

Man, even with all the caveats and unpolished code, that is a stunning result.

I’ve gotta admit, I was doubtful of the efficacy of an AI-based approach, especially for a one-man programming team. But you have really pulled it off well.

With results like these, you could probably drum up enough excitement to get a little Patreon going. I’d give you $5 a month to keep working on that, and I doubt I’m alone…

2 Likes

Damn … this is stunning.
Your denoiser must have trained Dragon Ball style to get this good!

2 Likes

Hi DeepBlender, thank you for your work!
Is there an estimated time when your denoiser will be available for public use?

P.S. An example of denoising in active shade.

3 Likes

Great work.

1 Like

Thanks to everyone for the positive feedback!

You could easily create a clickbait title out of this sentence :wink:

I thought about that before, but wasn’t sure whether it made sense. I guess I need to reevaluate it.

It is definitely an adventure :smiley:

No. It is a spare-time project on which I am spending a huge amount of time. I can’t make any promises.

1 Like

I would say either Patreon or an add-on for the Blender Market (if you want to try to get some money out of it).

An add-on would be painful to implement and likely not as performant or user-friendly.

2 Likes