Denoising with OpenImageDenoise per tile instead of full image

Hello! I am new in Blender, I am so old, I formerly used DBWRender, TurboSilver, Imagine on Amiga… :wink:

So here is my newbie question:
Denoiser: OpenImageDenoise - tile size for example 512 - Blender 3.0.1

Is there a way to tell Blender to denoise already-rendered tiles while the GPU is still rendering? The 2.9 versions did it that way, but since Blender 3 it denoises the whole picture after rendering finishes, and my CPU gets over 100 °C after the GPU (3090) has put a lot of heat into the system during the render.

It would be great if someone could give me some advice. Thank you very much, have a nice day.


First off, it is not normal for any machine to exceed the boiling point of water unless there is some overclocking going on (at least if you have proper cooling). You can do a few things to reduce the hardware load, such as reducing the number of threads Cycles is allowed to use or switching to the power-save plan in Windows’ Settings app.
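If you want to script the thread cap rather than click through the UI, here is a minimal sketch for Blender’s Python console. It assumes Blender 3.x; `bpy` only exists inside Blender, so the fallback branch just covers running the file elsewhere:

```python
# Sketch: cap the number of CPU threads Cycles may use.
# Assumes Blender 3.x; bpy is only importable inside Blender itself.
try:
    import bpy

    render = bpy.context.scene.render
    render.threads_mode = 'FIXED'  # override the auto-detected thread count
    render.threads = 4             # e.g. allow only 4 threads
    print(f"Cycles limited to {render.threads} threads")
except ImportError:
    # Running outside Blender: bpy is unavailable, so do nothing.
    bpy = None
```

This is the same setting as Render Properties → Performance → Threads, just set from script.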

If you care about the longevity of your hardware though, then please see it as an obligation to get the beefiest cooling system money can buy if an RTX 3090 is used (because the TDP of that thing is among the highest ever seen for PC hardware). If you don’t have that cooling system in place, then I advise avoiding a purchase of the upcoming RTX 3090 Ti unless you want to be running a fire hazard.

Thank you for taking the time to answer, Ace_Dragon!

Yes, there is overclocking, and the CPU (i7-6700K Skylake @ 4.4 GHz) is a very hot-running CPU, but normally it runs fine under load. Cinebench, games, etc. are no issue.

It’s only that spike while denoising, when all cores are fully loaded and the GPU (3090 at 2 GHz stock, not overclocked) has already heated up the system. I have very beefy cooling: a big tower with lots of air, a Noctua CPU cooler, and 3 added Noctua fans for additional airflow.

Restricting Blender to, let’s say, only 4 threads instead of 8 helps somewhat (90 °C), but it is not what I really prefer.

That’s why I want to know how to get Blender’s past behaviour back, when it denoised finished tiles while the GPU was happily rendering the next ones. Maybe there is a setting in the Preferences or in the denoise settings that I don’t know about?

Hope someone has a clue, thank you very much for taking the time to help me, Ace_Dragon!

I’d suggest checking the thermal paste on your CPU. I have the same one, clocked to 4.5 GHz, and it runs around 80 °C max under full load with just a basic Arctic cooler, paired with a 3060 Ti that doesn’t exceed 70 °C. And those new GPUs should never run above 85 °C, in my opinion.

There is also a free add-on called Super Image Denoiser, which is a much better node-based denoiser. I’m sure there is a tutorial on how to use it somewhere on YouTube.


I may be wrong here, but if you use OptiX as a denoiser, isn’t that GPU-based? (As you are running a 3090, I’d expect OptiX to be the thing!)
OpenImageDenoise runs on the CPU.
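If switching denoisers is the fix, it can also be done from script. A minimal sketch, assuming Blender 3.x with an OptiX-capable NVIDIA card (`bpy` only exists inside Blender, hence the fallback):

```python
# Sketch: switch Cycles to the GPU-based OptiX denoiser.
# Assumes Blender 3.x and an RTX-class NVIDIA GPU; bpy only exists inside Blender.
try:
    import bpy

    scene = bpy.context.scene
    scene.cycles.use_denoising = True
    scene.cycles.denoiser = 'OPTIX'  # GPU denoising instead of OpenImageDenoise (CPU)
except ImportError:
    # Outside Blender there is nothing to configure.
    bpy = None
```

This moves the denoising work onto the GPU, so the CPU never sees that post-render spike.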

This is entirely a user-created problem: lack of care for the hardware. Insufficient or bad cooling, a bad case, old thermal paste, or any combination of these. It is not related to Blender, the Blender version, the denoiser version, or basically anything on the software side.

Is there a particular reason why you are rendering with tiles in Blender 3 and above?