Cycles noise reduction with VapourSynth

Thank you to you two, </crappy_english>, I understand now. :smiley:

Cheers, mib

Hello all
I just tested MonoS’s script and it works great!
http://forum.doom9.org/showthread.php?p=1736964#post1736964
For YAFU
This is the new script
https://bpaste.net/show/d2a8f5d2e6f8
This is your sequence
https://dl.dropboxusercontent.com/u/34973756/distfiles/IMAGES.mp4
I think it is better!
PS: use Gentoo!))) It is much simpler than Ubuntu.

looks cool but cba to install.

Hi brothermechanic.
Over at the doom9 forum the people are friendly, and they have shown me some alternatives to avoid banding. The problem in the original script you posted is that one component does not work with 16-bit color depth, so you cannot use 16-bit PNGs. I think you can get a result similar to the ones you show if you remove this line from your original script, even using JPG:

ret = core.fft3dfilter.FFT3DFilter(ret,sigma=2.5, bt=5, bw=32, bh=32, ow=16, oh=16, sharpen=0.7)

The problem with some of the scripts they have shown me in the forum is that they blur the image a bit.
I’m doing several tests to determine which is most suitable for Blender.
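
For example, this is roughly what I mean (a minimal, hedged sketch; the lines around the commented-out filter are placeholders, not taken from your actual script):

import vapoursynth as vs
core = vs.get_core()

# hypothetical 16-bit PNG input path
src = core.imwri.Read('render/%04d.png', firstnum=1, alpha=False)
ret = core.fmtc.matrix(src, mat="709", col_fam=vs.YUV)

# the 8-bit-only step, simply commented out so the rest keeps working at 16 bit:
# ret = core.fft3dfilter.FFT3DFilter(ret, sigma=2.5, bt=5, bw=32, bh=32, ow=16, oh=16, sharpen=0.7)

ret.set_output()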

Duplicated message. Sorry

I used fft3dfilter for a HARD denoise pass; it is a destructive filter and I don’t recommend it for normal source files.
Now I have done some more work and found 3 better ways to do noise reduction:

  1. Comment out the fft3dfilter line in my script http://www.pasteall.org/60466
    This script contains the super cool mvtools DEGRAIN filter and does NOT blur the image!
    But it can’t remove heavy noise grain. https://dl.dropboxusercontent.com/u/34973756/distfiles/1.mp4

  2. Use the modified script from MonoS https://bpaste.net/show/48e6b3e2ad7d
    This script does not add a banding effect,
    but as you can see it has some artifacts https://dl.dropboxusercontent.com/u/34973756/distfiles/IMAGES.mp4

  3. Use the KNLMeansCL filter https://bpaste.net/show/d58ba8761cf6
    Result: https://dl.dropboxusercontent.com/u/34973756/distfiles/KNLMeansCL.mp4
    There is still some grain (a rough sketch of this option is at the end of this post).

I also tried to combine 2 and 3 and add 8-bit PNG image output https://bpaste.net/show/51ec0b1e4059
(video file made from the PNGs) https://dl.dropboxusercontent.com/u/34973756/distfiles/IMG.MOV
Not perfect…
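
For anyone who doesn’t want to dig through the pastes, here is a rough, hedged sketch of what option 3 plus the 8-bit PNG output looks like in general. The input/output paths and the d/a/s/h values are placeholders, not my real settings (those are in the linked scripts):

import vapoursynth as vs
core = vs.get_core()

# placeholder input path
src = core.imwri.Read('IMAGES/%04d.png', firstnum=1, alpha=False)

# RGB -> YUV 4:4:4 so KNLMeansCL gets a luma plane to work on
clp = core.fmtc.matrix(src, mat="709", col_fam=vs.YUV)

# d = temporal radius, a = spatial search radius, s = similarity window,
# h = denoising strength (by default only the luma plane is filtered)
den = core.knlm.KNLMeansCL(clp, d=1, a=2, s=4, h=1.4)

# back to 8-bit RGB and write numbered PNGs (placeholder output path)
out = core.fmtc.matrix(den, mat="709", col_fam=vs.RGB, bits=8)
out = core.imwri.Write(out, "PNG", 'out/%04d.png', firstnum=1)
out.set_output()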

wip

OK.
The problem is in MonoS’s script:

superF1 = core.generic.GBlur(input, 1.0)

This is the prefilter for the mvtools super clip.
And… it is baad)))
fft3dfilter works much better as a prefilter (as far as I found), but then we need to use 8 bit.
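
To make the idea concrete, here is a hedged sketch of that structure, not my actual script (the linked .vpy files have the real settings): the motion analysis runs on an fft3dfilter-prefiltered copy, while Degrain itself reads the unfiltered frames. All parameter values below are illustrative guesses.

import vapoursynth as vs
core = vs.get_core()

src = core.imwri.Read('camera1/%04d.jpg', firstnum=1, alpha=False)   # placeholder path
src = core.fmtc.matrix(src, mat="709", col_fam=vs.YUV)
src = core.fmtc.resample(src, css="420")
src = core.fmtc.bitdepth(src, bits=8)        # the fft3dfilter prefilter path is 8-bit only

# the prefilter only guides motion analysis; Degrain still reads the unfiltered clip
pre = core.fft3dfilter.FFT3DFilter(src, sigma=2.5, bt=5, bw=32, bh=32, ow=16, oh=16)
sup_pre = core.mv.Super(pre, pel=2)
sup_src = core.mv.Super(src, pel=2)

bv1 = core.mv.Analyse(sup_pre, isb=True,  delta=1, blksize=16, overlap=8)
fv1 = core.mv.Analyse(sup_pre, isb=False, delta=1, blksize=16, overlap=8)
bv2 = core.mv.Analyse(sup_pre, isb=True,  delta=2, blksize=16, overlap=8)
fv2 = core.mv.Analyse(sup_pre, isb=False, delta=2, blksize=16, overlap=8)

den = core.mv.Degrain2(src, sup_src, bv1, fv1, bv2, fv2, thsad=400)
den.set_output()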
These are my latest scripts and results.
Original sequence https://dl.dropboxusercontent.com/u/34973756/distfiles/orig.mp4
Old banding result https://dl.dropboxusercontent.com/u/34973756/distfiles/old_result.mkv

1-low-denoise - the preferred way to filter images
https://dl.dropboxusercontent.com/u/34973756/distfiles/1-low-denoise.vpy
https://dl.dropboxusercontent.com/u/34973756/distfiles/1-low-denoise.mp4

1-med-denoise - if 1-low-denoise does not give good results
https://dl.dropboxusercontent.com/u/34973756/distfiles/1-med-denoise.vpy
https://dl.dropboxusercontent.com/u/34973756/distfiles/1-med-denoise.mp4

1-hard-denoise - for mad users)))!!!
https://dl.dropboxusercontent.com/u/34973756/distfiles/1-hard-denoise.vpy
https://dl.dropboxusercontent.com/u/34973756/distfiles/1-hard-denoise.mp4

That’s it
Hope it helps!

Very nice brothermechanic!
Yeah, I saw that FFT3D does an excellent job. It is a pity that it only works in 8 bits. On the forum they recommended that I use DFTTest instead, but I think it is more destructive. I’ve done some tests with:

superF1 = core.dfttest.DFTTest(input, sigma=200)

But I don’t really know which parameters to use in DFTTest; I’ve only used sigma.
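
For what it’s worth, here is a hedged sketch of how it could sit in a script. The sigma/sbsize/tbsize values below are guesses, not recommendations; tbsize greater than 1 is what makes the filter look at neighbouring frames:

import vapoursynth as vs
core = vs.get_core()

# assumed input path, as in the other scripts in this thread
input = core.imwri.Read('camera1/%04d.jpg', firstnum=1, alpha=False)
input = core.fmtc.matrix(input, mat="709", col_fam=vs.YUV)

# sigma = strength, sbsize = spatial block size, tbsize = temporal block size
flt = core.dfttest.DFTTest(input, sigma=16.0, sbsize=16, tbsize=3)
flt.set_output()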

Another issue: do you know if, with the ImageMagick plugin, it would be possible to choose which layers of a multilayer EXR file to apply noise reduction to?

It would be great to be able to choose which layers preserve detail and which get noise reduction, or even to apply different noise reduction levels to each layer.
To be clear, I have never used multilayer EXR; I have only read about it.

Here are some tests with the hard-denoise script and a modified BMW scene. Note that it is only 50 samples!
Since it is only 50 samples, some tricks are needed. In the Blender compositor I applied a very slight Bilateral Blur to the car windows and Despeckle to the ground, and the result is exactly what you see in the first video, “Original_50sampes.mp4” (yes, there is a typo :slight_smile: ). That is why the noise reduction on the headlights did not turn out as well as on the windows:


Amazing! :slight_smile:

WOW!
I wonder if this algorithm/technology could be applied directly in the VSE!

Hello
Here is my latest denoising test.
This is a video from the original sequence https://dl.dropboxusercontent.com/u/34973756/distfiles/camera1-orig.mp4
This is the best (but not perfect) script I could write https://dl.dropboxusercontent.com/u/34973756/distfiles/camera1-04.vpy
This is the result https://dl.dropboxusercontent.com/u/34973756/distfiles/camera1-filter-04.mp4
Can you try to find a better way?

Really amazing! I’m interested in the BMW test because everything was denoised except the headlights, which kept flickering. I have a similar experience with MagicBullet Denoiser, where it worked perfectly on everything but some refractive surfaces. Is it possible to sort this as well?

@brothermechanic: How do these two lines work?

ret = core.fmtc.matrix (ret, mat="709", col_fam=vs.RGB, bits=8)

ret = core.imwri.Write(ret, "PNG48", "camera1-filter/images-%04d.png", firstnum=1, quality=100)

Are they temporary images needed for the script to work, or are they just the output images?
So, can I run “vspipe script.vpy -” from the terminal and then generate the video from the PNGs written to the path indicated in the second line?

@cgstrive, the glass and chrome materials there create too much noise for only 50 samples. Perhaps by applying some Bilateral Blur, as I did with the windows, the noise reduction would work a little better. Another solution could be to separate the objects with difficult materials into another render layer and configure more samples only for that render layer.

Comparing the new script with the hard-denoise script from post #47, using 8-bit PNG.

Here are the original videos:

It seems that the new script is a little better at preserving details/edges and is faster, but the hard-denoise script appears to give slightly stronger noise reduction (see the car windows):
*hard-denoise script:

*new script:

The banding with the new script seems to be a little more noticeable than with the hard-denoise script:
*hard-denoise script:

*new script:

Regarding the color banding, I tested KNLMeansCL alone with 16-bit PNGs, and if I use strong denoise parameters the banding also appears. So I do not know exactly why/when this happens (then again, I’m not sure I correctly loaded the 16-bit PNGs in the script when I used KNLMeansCL alone).

Hi.
As brothermechanic said, you can use VapourSynth from Gentoo Linux.

For users of Ubuntu or Ubuntu-family derivatives, “djcj” (the PPA maintainer) has already solved the problem with “imwri” and the missing filters in the PPA. Thanks djcj!
https://launchpad.net/~djcj/+archive/ubuntu/vapoursynth

sudo add-apt-repository ppa:djcj/vapoursynth
sudo apt-get install vspipe vapoursynth-extra-plugins vapoursynth-editor

It is recommended that you also install “ffmpeg” and “libx264-xxx” (where xxx is the version, depending on your distro).

I have noticed some imperfections in the video with “hard-denoise.vpy”, so it is not recommended. For hard denoising, use the new script in post #51.

When you see a line like this in the script:

src = core.imwri.Read('camera1/%04d.jpg', firstnum=1, alpha=False)

Remember to replace that with the path where you have your images. For example, if you saved your images from Blender as PNG in “/home/YOUR_USER/render images”, the above line should be:

src = core.imwri.Read('/home/YOUR_USER/render images/%04d.png', firstnum=1, alpha=False)

From the terminal opened where I have the script, to get the video I am using the command:

vspipe --y4m script.vpy - | ffmpeg -i pipe: -vcodec libx264 -crf 10 encoded.mp4

This creates a video at 30 fps (I will explain later how to change the frame rate). “crf 10” is the video quality; a lower value gives higher quality and a larger file size (see the ffmpeg manual).
With the script in post #51 you need to make some modifications to get the output with the above command; otherwise you need to configure the output path in a line near the end of the script.
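
The kind of modification I mean would look something like this. It is a hedged guess, not the actual contents of the post #51 script: end the chain with a YUV 4:2:0, 8-bit clip passed to set_output() (which is what “vspipe --y4m” expects) instead of imwri.Write, and stamp the frame rate with std.AssumeFPS. The ‘denoised’ clip and its path below are placeholders.

import vapoursynth as vs
core = vs.get_core()

denoised = core.imwri.Read('camera1/%04d.jpg', firstnum=1, alpha=False)   # placeholder for the script's denoised clip

out = core.fmtc.matrix(denoised, mat="709", col_fam=vs.YUV)
out = core.fmtc.resample(out, css="420")
out = core.fmtc.bitdepth(out, bits=8)                 # YUV420P8 for y4m piping
out = core.std.AssumeFPS(out, fpsnum=24, fpsden=1)    # e.g. 24 fps instead of 30
out.set_output()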

Better run some tests with textured (hi-res and detailed) objects, guys. While those filters might seem OK for simple scenarios, it’s a totally different story with highly detailed images. A good test could be something like detailed human skin (with pores), or grass and foliage at some distance, etc.

God I wish I understood anything that is being discussed in this thread. Would love to try this out.

@Rexono, which operating system do you use?
I have no idea how all this works on Windows, but if you use Windows, install the program from:
http://www.vapoursynth.com/doc/installation.html

Install VapourSynth editor:
http://forum.doom9.org/showthread.php?p=1688477

Then you must install the plugins and filters. On Windows the plugins are apparently DLLs, which should be installed/copied into the VapourSynth plugins path:
http://www.vapoursynth.com/doc/autoloading.html

I think it is also possible to load plugins from the script with “core.std.LoadPlugin”. You can continue investigating by reading the documentation:
http://www.vapoursynth.com/doc/installation.html
http://www.vapoursynth.com/doc/pluginlist.html
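
For example, manual loading from the top of a script would look something like this (the DLL paths here are made up; point them at wherever you copied the plugins):

import vapoursynth as vs
core = vs.get_core()

core.std.LoadPlugin(path=r'C:\Plugins\libmvtools.dll')      # hypothetical locations
core.std.LoadPlugin(path=r'C:\Plugins\fft3dfilter.dll')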

The required plugins are:
imwri (supposedly included by default in the VapourSynth installation), fmtconv (fmtc), fft3dfilter, MVTools.
Optional: ffms2, KNLMeansCL.

And perhaps some others I cannot remember. Anyway, if you analyze the script with the VapourSynth editor (vsedit), it will tell you about missing plugins.

Anyway, all this seems much easier from Linux.

Hmm, is this script only for animated noise? I tried it on some scenes rendered with a static seed and it creates “patches” that remain the same, while other areas are overly blurred. It gets even more smudged in areas where vector blur was used, because they don’t follow the overall noise pattern and, in a way, get blurred twice. YAFU: did you use any type of motion blur on the falling cubes/balls scene?

Yes, it is meant to be applied to animations. In the first messages in this thread I was testing with only one image, but that was because I had no idea what I was doing or how it worked. If you think about how this works, you will notice that the filters analyze more than one frame at a time and take advantage of the changing noise pattern in Cycles. You need to set a random/different noise pattern per frame in Blender/Cycles: I use #frame in the Seed field before Blender 2.75, or the button to the right of the Seed field from Blender 2.75 on.
That is why this method of noise reduction blurs the image much less compared to other noise reduction methods for single images.

I have not used compositing or any tweaked settings in that scene. It is a very simple scene. Note that the cubes’ material needs a lot of samples to appear without noise, and it is set to only 30 samples. This is the scene that I modified by putting a Checker Texture on the spheres to better detect distortions at the edges of the video (as happens with “hard-denoise.vpy” in post #47). Here it is:
http://www.pasteall.org/blend/38094

The simulation in my videos was made with Blender 2.75a. So, if you do testing with this scene, try using Blender 2.75a (I’ve seen that this scene generates less noise in 2.73a).