Can't use denoising filter.

I recently heard about the new denoising option in the latest experimental Blender release, so I went straight ahead and downloaded it. The first time I tried to use it, Blender gave me an error saying there wasn't enough CUDA memory or something along those lines. Since my GTX 570 only has 1280 MB of memory, that made sense to me… I was just wondering if there is anything I can do to still use the denoising filter with this graphics card.

By the way, this is my first post here, so I wasn't sure where to post this. If this isn't the correct place, please tell me where it should go. Thanks!

The only thing I can suggest is smaller tiles. I was running out of memory on my 4 GB cards at 512x512, but I was able to denoise at 256x256. Denoising needs memory on top of what the scene itself is already using.
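If you'd rather set this from the Python console instead of the UI, something like the sketch below should work. This assumes the 2.79-era property layout (tile size on the render settings, denoising on the render layer's Cycles settings); property names may differ in other builds.

```python
import bpy

scene = bpy.context.scene

# Render with smaller tiles so the denoiser has some GPU memory headroom.
scene.render.tile_x = 256
scene.render.tile_y = 256

# Enable denoising on the active render layer (2.79-era location of the setting).
scene.render.layers.active.cycles.use_denoising = True
```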

It is new and still in development, but I like it. Switch to CPU rendering; it is slower, I know, but from what I read you haven't given it a try yet :smiley: If you have too little memory, you can also use some filter curves to get a similar result. My English is really bad, sorry.
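For example, switching Cycles to the CPU can be done from Blender's Python console with something like this (a minimal sketch, assuming a Cycles scene):

```python
import bpy

# Fall back to CPU rendering when the GPU runs out of memory.
# Slower, but the denoiser is no longer limited by VRAM.
bpy.context.scene.cycles.device = 'CPU'
```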

It is, but you should give it a try! It works in all the latest builds, though the timings are strange on the most recent one.
Branch? :thinking:

Hi.
This is an old thread. The Nvidia 5xx series is not even supported in the current experimental/non-stable builds.
For 2.79b, the documentation explains a few relevant things (including what happens when the GPU runs out of memory):
https://docs.blender.org/manual/en/dev/render/cycles/settings/scene/render_layers/denoising.html