CPU tiles vs GPU tiles - unusable difference

Hi guys,

I'm working on a big project and the mix of CPU and GPU tiles is a total disaster. It looks like a blocky JPG.
Do I need special settings to render properly, or do I just have to accept much longer render times and render CPU-only to get an even result?
How is it even possible that the same code is treated differently across devices? Or is it really just a setting after all?

Show an image of the issue. There are no problems with hybrid tiled rendering in general, so you must be having an isolated issue with your scene setup.

There have been bugs and issues in the past with GPU+CPU rendering related to denoising, I think - basically, tiles rendered on one device being denoised by the other, leaving blocky, tile-sized areas that look weird.

Having said that… if I had a problem with rendering on GPU+CPU, my solution would be to use GPU only… not CPU only.
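
If you want to rule the denoiser out (or test GPU-only) without hunting through the UI, something like this in the Python console should do it - just a sketch, assuming the 2.93 Cycles settings:

```python
import bpy

scene = bpy.context.scene

# Turn off render denoising for a test render (2.93 Cycles setting)
scene.cycles.use_denoising = False

# Or switch the render device to GPU-only instead of the CPU+GPU mix
scene.cycles.device = 'GPU'
```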


Here’s a snapshot of the issue - check the dark surfaces. It happens in the interior as well, especially on dark, reflective surfaces, like a TV screen.

I'll try switching off denoising and denoising with Topaz in post instead - though, as you can see, I can't really imagine such massive differences being caused purely by the denoiser. But what do I know :sweat_smile:

It looks like someone’s playing Tetris there! Edit: I used 16 px tiles to accommodate the CPU.

(I'm on my phone atm, hope I didn't screw up the post too much)

Can you tell us what CPU, GPU, OS, and Blender version you have?

But either way, that should NOT be impacting the renders - so this is most likely worth a bug report.

I'm also guessing that if you render at larger tile sizes, the "Tetris" issue just gets bigger? (There's a quick console sketch below for testing that.)

Lastly, if you render with each device separately, which one renders the lighter version and which renders the darker one? This would be useful for the bug report.

If you know someone who also has Blender, can you get them to run it as well, to see whether this is a potential material issue or a general bug?
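
If it's quicker than re-clicking through the UI for each test render, the tile size can also be set from the Python console - a sketch, assuming 2.93 (tile sizes go away in the 3.0 Cycles X builds):

```python
import bpy

render = bpy.context.scene.render

# Try a few sizes to see whether the "Tetris" squares scale with the tiles
render.tile_x = 64  # was 16 px to accommodate the CPU
render.tile_y = 64
```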

I'm running Win 10, an EVGA GTX 980 Classified, and an AMD FX-8350. I've been working on this project across all releases of 2.93 LTS (.1-.4).
I've also rendered this with 64 px tiles (early render in this reply) - you can see it in dark reflective materials as well as on the sofa.


I have more examples if needed.

The lighter patches, I believe, are from the GPU, since the CPU tends to render 8 tiles in a block.

I have only one computer with a (workable) GPU; the second one is identical and doing slow renders on the side, so I can't test, and I also don't know anyone using Blender. I do have a very old Win 7 laptop, though - I could let it run for a couple of days…

Have you tried GPU only? The reason I ask is that your GPU should be rendering much faster than your CPU, according to a quick look at Blender Benchmark data. Using the CPU seems pointless, and if adding it causes these issues, it's best left out of the mix.

Based on the technical feedback, it sounds like a bug, and I'd recommend raising a bug report.

Before doing that, I would create a simple scene with a few basic cubes and the suspect material applied (see the sketch below).

Test to confirm the issue is still there, and if it is, share it here and I'll double-check on my setup.

If I don't see it, it's a bug related to your hardware setup; if I can confirm it, it's a more general bug.
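
A rough sketch of such a repro scene, run from the Text Editor or Python console - "SuspectMaterial" is just a placeholder for whatever material shows the blocks in your file:

```python
import bpy

# Placeholder name: swap in whichever material shows the blocks in your file
mat = bpy.data.materials.get("SuspectMaterial")

# A row of basic cubes, all sharing the suspect material
for i in range(4):
    bpy.ops.mesh.primitive_cube_add(location=(i * 3, 0, 0))
    cube = bpy.context.active_object
    if mat is not None:
        cube.data.materials.append(mat)
```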

Yes, I'm re-rendering everything on GPU as we speak. Slow, but at least decent-looking. Still a lot slower than CPU+GPU, though.
Before this "adventure" I was rendering GPU-only, but after a couple of tile-size tests, and with other people claiming how much faster it was, I switched to the mix. On my identical, GPU-less other computer, an animation frame takes 31 minutes that takes 6 minutes with the mix - but I have no other choice atm.

All right, I'll do that tomorrow, or whenever my way-past-deadline images are done.

The materials came from different sources (BlenderKit, Archipack Pro, etc.), so I don't think this can be a material issue, but then again, what do I know :tired_face:

This may have been reported before: I found another thread on this very issue from a couple of years ago, and another one where everybody seemed to agree that if you calculate 1+1 on a GPU, the result has to be 3, since it's a different device than the CPU.

Whatever - the "solution" was always to dump half your resources and render GPU-only :man_shrugging:t5:


Just out of interest… what does your Preferences > System setup look like? Also, what does your Render Properties panel look like?
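
If screenshots are a hassle from your setup, a console snippet can dump the relevant settings as text instead - a sketch, assuming the 2.93 API:

```python
import bpy

# Cycles compute device preferences (Preferences > System)
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.get_devices()  # refresh the device list

print("Compute device type:", prefs.compute_device_type)
for dev in prefs.devices:
    print(f"  {dev.name} ({dev.type}) - enabled: {dev.use}")

# Per-scene render settings (Render Properties)
scene = bpy.context.scene
print("Render device:", scene.cycles.device)
print("Tile size:", scene.render.tile_x, "x", scene.render.tile_y)
print("Denoising:", scene.cycles.use_denoising, "-", scene.cycles.denoiser)
```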

I know the benchmark screenshot I posted above doesn't show your exact CPU… but you can clearly see that the Classroom benchmark times on 2.93 for vaguely similar CPUs are in the thousands of seconds, while those for your GPU are in the hundreds.

GPU-only for you should not be a lot slower than GPU+CPU. It should be marginally slower, and in many cases about the same. When the CPU renders 10x slower than the GPU, what happens is you just end up waiting for the CPU to slowly finish its last tile.

Here are my settings:



My system is 10+ years old, but at least AMD made sure I could upgrade my 15-year-old stuff with 10-year-old used parts more than a decade later :joy:
The board is PCIe 2.0, the GPU is 3.0 - I've read it doesn't make a huge difference.

This is why I've always shied away from Blender/3D in general. Great enthusiasm, until the hardware barrier hit me like an oncoming truck. Since 2005, I've watched everything from ivy generators to Cycles renders and the Eevee viewport run at least 10x faster for everyone else than on my own computer, which always worked perfectly for graphic design and photography. But since the "pandemic" killed off half of my old clients, I'm forced to push on in a tortuous, torturous 16-18 hour grind with 2 lame-ass rigs burning through my energy bill :sweat_smile: Far off, but within reach, I see a gleaming light: Ryzen… no, Threadripper… a 3090… no, 4x 3090! Or wait: that's an insurance bill right there :joy: :joy: :joy:

Maybe try the recent 3.0 nightlies - they have the new Cycles X render engine. I'm curious whether you'd still hit those blocks with the updated engine.

I’m just gonna render all CPU now. I’m sick of it.

@Grzesiek I've now tested CPU+GPU with and without denoising: I get a checkered result either way, see attachments.
GPU-only would work, but looks like it'll take 2x+ longer than CPU+GPU - I canceled after 9 hours, with a projected 18 more to come, at the recommended tile size of 256.

So, I’m giving up on GPU rendering for now!

@kkar Cycles X… I don't know. I haven't read about it in depth, but it seems more like a cheat with good intentions than a quality renderer. But I'll try - it's one more experiment, and I can't even get a RenderStreet addon for 2.8+ anyway.

What a DRAG.