Did some tests today on my GTX 1070, and found OptiX working faster than CUDA in one scene… It seems OptiX has gained speed and is close to CUDA now, on a GTX-class card at least. I'm sure it's way ahead on an RTX.
I also noticed I could set a time limit for a render and use denoising, then run CUDA and OptiX, and the visual difference between the two is extremely negligible.

Hey, does anybody else experience a system freeze on Windows 11 when using Open Image Denoise in Blender?

Edit: I tested some more and found out it only crashes when the new prefilter option is set to "Accurate".

Hi, no, but I always use the latest build of Blender 3.1.
What do you use?

Cheers, mib

I am not seeing the bug here; it might be an issue that comes with being an early adopter of Windows 11.

Unless you have an Alder Lake chip preordered, Windows 11 currently does not provide any kind of advantage over Windows 10. I would have waited until it has received at least one or two major updates.

Any news about a MaterialX implementation in Cycles X / Blender 3?


I also always use the latest daily. 2.93 doesn't freeze, although the Open Image Denoiser there denoises tile by tile, in contrast to the Cycles X denoiser.

Has there been any mention of the status of multi-light sampling/path guiding?

looks like no

Hi, is it possible you are running out of memory?
You can also use tiled rendering in 3.1; it was implemented to save memory.
Daily builds are on hold for Windows at the moment; I will check tomorrow and test again.

Cheers, mib

Thanks for the links. Fingers crossed we see some progress soon.

It really is the biggest downside of Cycles X… Anything with more than, say, 2 or 3 lights just struggles for us!


It’s caustics for me, it’s a glaring gap in Cycles’ capabilities that needs to be filled!


Hi, I tested again with blender-3.1.0-alpha+master.3364a5bea6c9-windows.amd64-release and can't reproduce a freeze.

|Processor|Intel(R) Core™ i5-8350U CPU @ 1.70GHz 1.90 GHz|
|Installed RAM|8.00 GB (7.38 GB usable)|
|System type|64-bit operating system, x64-based processor|

Cheers, mib

Light bleed issue.

I have a room where each wall and the roof are 3 separate planes. This wasn't a problem in old Cycles, but in new Cycles light is sneaking in between the edges.


I have to join the objects, add cuts where necessary so all verts have a partner on adjacent faces, and then weld the verts together to make it go away.


I thought maybe the original planes did not have accurately placed verts, but after welding them together and making everything air-tight, it should have been safe to split them into separate objects again. Nope, the light leak comes back.
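For anyone scripting that cleanup: the welding step is essentially Blender's Merge by Distance, which snaps vertices closer than a small epsilon onto a single shared vertex (in Blender itself you would use Mesh > Merge by Distance, or `bmesh.ops.remove_doubles` from Python). As an illustration only, here is the core idea in plain Python, using a naive O(n²) search rather than the spatial hashing a real mesh tool would use:

```python
def merge_by_distance(verts, eps=1e-4):
    """Snap vertices that lie within eps of each other onto one vertex.

    verts: list of (x, y, z) tuples.
    Returns (merged_verts, index_map) where index_map[i] is the index
    in merged_verts that original vertex i was welded to.
    """
    merged = []
    index_map = []
    for v in verts:
        for j, m in enumerate(merged):
            # Compare squared distances to avoid a sqrt per pair.
            if sum((a - b) ** 2 for a, b in zip(v, m)) <= eps * eps:
                index_map.append(j)  # weld onto an existing vertex
                break
        else:
            index_map.append(len(merged))
            merged.append(v)  # keep as a new representative vertex
    return merged, index_map
```

For example, two wall edges whose corner verts differ by a tiny modelling error, such as `(1, 0, 0)` and `(1.00005, 0, 0)`, collapse to one vertex with `eps=1e-3`, which is exactly what closes the gap the light leaks through.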


Just a shot in the dark here but can you try changing these values for the room object(s)?


I’ve seen similar stuff happening in Redshift and Arnold, and this was the cause. Apart from that, it’s always a good idea to model walls watertight and/or give them thickness instead of using single-sided polygons.

I’ve got a scene that’s been plaguing me for a couple of years. In 2.8x and 2.9x the viewport render with denoising was flawless, but the final render had horrible noise in an area that did not make sense, and OIDN only made it worse.

In 3.0, without changing any settings, the final render is now flawless. However, if I increase the strength of the HDRI just enough that the outdoor lighting slightly overpowers the indoor lighting, that same problem returns tenfold. Has anyone else experienced, in any version of Blender, a case where the viewport renders fine but the final render has horrible noise problems made worse by denoising?

The one important odd thing about my scene is that I was following this lighting advice:
I added a point light set to the real-world strength of a candle flame, adjusted the camera exposure relative to that light (around 7 or 8), and then set up the rest of the lighting. It always rendered fine in the viewport but horribly in the final render. Now in Cycles X the same settings render fine in both, but a small increase in the HDRI strength brings the problem back with a vengeance.

Notice how the noise threshold is the same for both Viewport and Render.

Now, if I set the Render Noise Threshold to the default value of 0.01, I get this mess:
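If you want to keep the viewport and final-render sampling in sync while debugging this, both thresholds can also be set from Blender's Python console. This is a sketch against the Cycles scene properties as of Blender 3.0 (property names assumed from that version's API; the threshold value here is just an example):

```python
import bpy

scene = bpy.context.scene

# Adaptive-sampling noise thresholds: one for the viewport preview,
# one for the final render. Matching them makes F12 sample like the view.
scene.cycles.preview_adaptive_threshold = 0.1
scene.cycles.adaptive_threshold = 0.1
```

Setting both to the same value is a quick way to confirm whether the threshold mismatch, rather than something else in the scene, is what makes the final render noisier than the viewport.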


If I had that problem, my solution would be to render it out with no denoiser and then send the result to external software for post-processing. For me that would be AI Gigapixel, since it seems to work as both an AI denoiser and an upscaler at the same time, and you get four different AI models to choose from plus a denoising-strength slider.

Congrats to the developers. With the Beta I can now do the BMW27 scene in 12.88 seconds; with the Alpha it took 15.xx seconds. That's a 130 W RTX 3060 in a laptop.

Nevertheless, I noticed that when I add the 5800H CPU to the OptiX devices in the System preferences, the render gets slower instead of faster (13.80 seconds), despite there being an extra device to help render.
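For A/B-testing that, you can toggle render devices from a script instead of clicking through the preferences each time. This is a sketch against the Cycles add-on preferences API (device names and the exact list will differ per machine):

```python
import bpy

# Cycles render-device preferences live on the add-on, not the scene.
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"
prefs.get_devices()  # refresh the detected device list

for dev in prefs.devices:
    # Enable only non-CPU devices, i.e. render on the GPU alone.
    dev.use = dev.type != "CPU"
    print(dev.name, dev.type, "enabled" if dev.use else "disabled")
```

Flipping `dev.use` for the CPU entry between runs makes it easy to time GPU-only versus hybrid renders on the same scene.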

I think this should be looked into.

Did you try giving the walls thickness?

Sounds like a bottleneck. I wonder what the results would be on, say, a next-generation AMD Zen 4 with 3D V-Cache and DDR5. It might then be able to keep up with the insanely fast GPU… but I'm only guessing; I don't know the inner workings of the code or anything.