DeepDenoiser for Cycles


#1

Due to some technical changes, please don’t create new renders for now!
More details are in this post.

The DeepDenoiser is a denoiser for Cycles which uses deep learning. To make it very clear from the very beginning: This work is NOT FINISHED!

I have been working on this project for a few months and there is slow but steady progress. You can find the code on GitHub: https://github.com/DeepBlender/DeepDenoiser
For deep learning, you usually need a lot of data. Up until now, I have rendered about 60 images, which has been very time consuming. The quality standard has to be very high because the final images have to be as noise free as possible, without any fireflies. This is currently slowing me down the most, and that’s why I thought I would ask the BlenderArtists community for help. If you have some unused rendering power, your help would be very appreciated!
The code is already open source and everything that is required to train the neural network is going to be open content, such that it can be shared without any legal issues.

Requirements:

  • Use Blender 2.79b
  • Only open content!

Here is a short guide:

Seed:
Just leave it at 0.

Resolution:
You can pick any number for X and Y, as long as it is a multiple of 64.
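If you want to double check a resolution, a small snippet like the following snaps X and Y down to multiples of 64. This is just my own sketch using the standard bpy properties; it is not part of the script:

```python
import bpy

# Snap the render resolution down to the nearest multiples of 64.
render = bpy.context.scene.render
render.resolution_x = max(64, (render.resolution_x // 64) * 64)
render.resolution_y = max(64, (render.resolution_y // 64) * 64)
```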

Folder:
The directory in which the folders with the EXRs are going to be stored.

Samples:
Make sure to have two entries as shown in the screenshot and don’t forget to enable the checkboxes. Everything has to match the screenshot, except for the second ‘Samples’ value, which is 1024 in the screenshot. This is the number of samples per pixel for the noise free image. Make sure this render does not contain visible noise or fireflies.

Render:
When you hit the render button, the scene is going to be rendered 8 times with 16 samples per pixel (with different seeds) and once with 1024 samples per pixel, or whichever number you have chosen. For each rendering, a new folder with multiple EXRs is created.
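Conceptually, what happens under the hood is roughly the following. This is a simplified sketch with standard bpy calls, not the actual script: the real one additionally sets up the compositing and writes all passes as EXRs, and the output paths here are made up:

```python
import bpy

scene = bpy.context.scene

# 8 noisy renders with 16 samples per pixel, each with a different seed.
for seed in range(8):
    scene.cycles.seed = seed
    scene.cycles.samples = 16
    scene.render.filepath = "//renders/noisy_%02d/image" % seed  # hypothetical path
    bpy.ops.render.render(write_still=True)

# One render with the high sample count, used as the noise free target.
scene.cycles.samples = 1024
scene.render.filepath = "//renders/target/image"  # hypothetical path
bpy.ops.render.render(write_still=True)
```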

WARNING:
Once you hit ‘Render’, the script is going to modify several render settings and the compositing setup. The script disables all sorts of rendering tricks, like clamping or the denoiser. To verify that the rendered result is not noisy, open the EXR which contains ‘Composed’ in its name.
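To give you an idea, disabling those tricks corresponds roughly to settings like these. This sketch uses the standard Blender 2.79 Python properties; the actual script changes more than this:

```python
import bpy

scene = bpy.context.scene

# Disable sample clamping so noise and fireflies are preserved in the data.
scene.cycles.sample_clamp_direct = 0.0
scene.cycles.sample_clamp_indirect = 0.0

# Disable the built-in denoiser on all render layers.
for layer in scene.render.layers:
    layer.cycles.use_denoising = False
```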

Please create a zip file with the results and send me a download link.
The zip file has to contain the blend file and all additional files like the textures, all the rendering folders, and a txt file with some copyright information (Did you create the model? Where did you get the textures?). It has to be possible to directly reproduce the results by opening the Blender file, so make sure that the paths to the textures are correct out of the box. It is very important that everything is open content! This restriction excludes e.g. any texture from textures.com.
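If you keep the blend file, the textures, the rendering folders and the copyright txt together in one project directory, creating the zip can be as simple as the following (a hypothetical example with a made-up path; use whatever tool you prefer):

```python
import shutil

# Creates my_scene_results.zip from everything inside the project directory.
shutil.make_archive("my_scene_results", "zip", root_dir="path/to/project")
```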

All help is highly appreciated! Don’t hesitate to ask questions or to point out issues. I am going to share more information about the project in the coming weeks, depending on how much time I am going to have.


(English is not my native language) #2

Have you tried the script in Linux? When I run it, only Seed and Resolution items are created.


#3

Thanks a lot for the feedback. It is only tested on Windows (shame on me). I can test it on a Mac later today; hopefully the problem shows up there too.
Did you get any error messages?


(English is not my native language) #4

Wait, the problem is scene related; the items are created in the default cube scene:
http://pasteall.org/blend/index.php?id=49461


#5

Thanks for the reproduction! The issue is now solved, please download the script again.
The world settings were missing. You now get a warning and a button to automatically resolve the issue.


(English is not my native language) #6

Hi.
My internet connection does not have enough upload bandwidth to upload the generated files. Anyway, the scene I shared before is the one I use to test denoisers and how well they preserve textures. Here is the scene with information about where I got the materials:

Edit:
I modified and uploaded the scene again because I noticed there was some terminator problem with the spheres.
Edit 2:
I had made another mistake by saving the file with a Blender build from master (instead of 2.79b), which modifies materials that have something connected to displacement. Here is the final file (I hope):
http://pasteall.org/blend/index.php?id=49470


#7

Thanks for the scene!
If this is the scene you are using to test denoisers, then I should not use it! If I train the neural network with this scene, it is going to learn this particular scene very well and perform unusually well on it, which would make it useless for you as a test scene. Do you still want to contribute it?


(English is not my native language) #8

Yes, no problem. I used the scene to make tests when the Cycles denoiser was being developed. You can use the scene if it is useful for you.
Anyway, you were asking for help with rendering power; sorry for not being able to help with that because of my bad internet connection.


(burnin) #9

@DeepBlender
Hi, check the posts here (cycles denoising AI experiment); if it is of help, I’ll re-upload the images and send you the link.


#10

No problem. Thanks for the contribution!


#11

Thanks for the offer, I really appreciate it! Unfortunately, the requirements for this project are very specific. There is no way around using the script I provided for the rendering. This is the only way to create consistent results which allow me to experiment in different directions. I am currently improving the first post in this thread to be more informative, as well as adding some descriptions to the GitHub repository. Hopefully, those are going to make it clear why those restrictions have to be there.


#12

Following some feedback, I have added a readme file to the GitHub repository to make it clearer what the project is about.
The next goal is to create some actual visual results that I can present here to give everyone a better idea of what this is all about.


(burnin) #13

Have you checked this? https://declanrussell.com/portfolio/nvidia-ai-denoiser/
I had tested it and it works pretty damn nicely on unidirectional PT imagery (Cycles, Lux PT). But it struggles with, or does nothing for, BiDir generated images (Maxwell, Indigo).

So in your case, only Cycles should/must be used? It would be nice if it could be made to work with other engines…


#14

I am aware of Nvidia’s OptiX. It is one of the directions I am experimenting with. Replicating those kinds of results is very high on the list, and that’s one of the reasons why I am asking for help.

That would indeed be nice. For a project which is just getting started, it would be naive to have such a huge scope from the beginning. The first milestone is to get a solution that is known to work with Cycles, be it for preview purposes or more towards production quality.
It is using the Cycles passes, but it could easily be modified to use any kind of passes/AOVs, be it as input or output. So theoretically, it could be adapted to other engines. However, this would require lots of renderings from those engines to train the neural network, which exceeds the scope of this project and is not planned at this point.
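For illustration, consuming different passes on the Blender side mostly comes down to toggling the pass flags on the render layer, e.g. with the standard 2.79 properties. This is not code from the project itself, and “RenderLayer” is just the default layer name:

```python
import bpy

# Enable a few of the passes a denoiser could consume (Blender 2.79 API).
layer = bpy.context.scene.render.layers["RenderLayer"]  # default layer name
layer.use_pass_normal = True
layer.use_pass_diffuse_direct = True
layer.use_pass_diffuse_indirect = True
layer.use_pass_diffuse_color = True
```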


(Photox) #15

The script doesn’t seem to do anything on Linux.


#16

Could you share the blend file, so that I can reproduce and resolve the issue?


(English is not my native language) #17

Hello.
Here are some problems you may be having. With some themes, the two checkboxes on the left of the Samples field are not very visible; they must be checked.
When you hit the Render button, no status indication is displayed; you will only notice that your system is somewhat slow because it is rendering. You should check that the 9 folders with their content have been created in the location you have chosen. You can also monitor GPU activity with the following command from a Linux terminal:
watch -n 0.5 nvidia-smi


(Photox) #18

Linux Mint, Blender 2.78.5

blend on dropbox

The scene uses a blindingly bright sun and an emission plane, produces a lot of noise and fireflies, and I had it set to something like 2 or 3k samples. It produces the raw render for my weekly wec entry. There are no image textures in this blend, and all objects etc. are 100% my own art. Released CC0.


#19

That was quick!

You weren’t kidding about it being easy to implement.


#20

That looks amazing!

You haven’t added any render jobs to the list. It should look something like this:

As mentioned in the first post, the first entry should be “Samples: 16” and “Renders: 8”. It renders 8 times with different seeds, using 16 samples per pixel. Those are the noisy inputs.
The second entry should be “Renders: 1”. “Samples: 512” is just an example; this is the samples-per-pixel count which creates a noise free render. I hope this helps.