How does Blender use RAM and VRAM during rendering?

For context, I have 32GB of RAM and a Radeon RX 560 with 4GB of VRAM.
I subdivided a cube 11 times (25 million polygons) and rendered it in Cycles with nothing but a light (in 2.93, because my GPU isn’t supported anymore).
Now, when rendering this with the CPU, Blender’s total RAM usage spiked at 14GB, whereas with my GPU (both CPU+GPU and GPU only), it went up to 21GB (with 98% VRAM usage). When rendering with the GPU only, the render did not seem to start for several minutes, but I did not get an error message.
So essentially my question is this: if I buy a GPU with 6GB of VRAM, at what point should I expect to run out of memory, given that the limit for Cycles does not appear to be 6GB, and how does hybrid rendering affect this?
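For what it’s worth, the 25 million figure checks out: each Catmull-Clark subdivision level quadruples the face count, and a cube starts with 6 quads.

```python
# Each subdivision level quadruples the face count; a cube starts with 6 quads.
faces = 6 * 4 ** 11
print(f"{faces:,}")  # 25,165,824 — the ~25 million polygons mentioned above
```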


Hey, but what are your rendering settings? It’s probably because you are using 2.93, which is pretty old at this point, and there are a bunch of performance improvements in the latest version.

I can’t test newer versions on GPU and 3.2 crashes when opening the file, but in 3.1 with CPU, memory usage spikes at 13GB. No special settings that would affect the outcome as far as I can tell…

untitled-3.1.blend (118.6 KB)

Mmm, can’t check it right now, but I will when I get back to my desktop.

And is the memory usage from edit mode or object mode?

I’m in object mode, but I am testing memory usage during rendering, so I have the modifier turned off in the viewport.

I have a relatively modest RTX 2060 Super; however, my card came with 8GB of VRAM. Make no mistake: when buying a GPU, don’t settle for less than 8GB. I’ve run out of memory occasionally, but overall I’m OK.


Have you ever had a situation where you ran out of memory, but using hybrid rendering made it possible to render with the GPU? I fear 8GB might be a bit limiting for me…

Are you sure it doesn’t crash 3.1 too?
In my case the .blend file crashes Blender 3.1.0 release and 3.2 alpha. I can open it in Blender 3.0.1 release.

I can actually reproduce the crash in 3.1 and 3.2 from scratch. Start with the default cube and add two Subdivision Surface modifiers. In the last modifier in the stack, set the Viewport levels to 0 and it will crash.

So apparently it’s a Viewport GPU subdivision bug (which you may have disabled in 3.1)

Edit:
Reported here:
https://developer.blender.org/T97091

I don’t use hybrid rendering with OptiX. You can always get 12GB with an RTX 3060 or with more expensive and more powerful cards. Not sure why you need hybrid rendering so much, though.

In this case, the software is using memory to store data to cut down calculation time; it uses whatever is available. Your concern for a new GPU should be big scenes with a large number of textures and UDIMs. Run some tests with a scene like the Disney Moana Island Scene or Animal Logic’s USD ALab. If Blender just shuts down before rendering even starts, that means you have a memory issue. Normally CPU rendering solves it, but if the problem persists, it’s time for better gear.

Blender demo files are also good for testing purposes.

With the scene you sent, on my notebook (i7-4700MQ, GT740M 2GB, 16GB RAM), hybrid rendering crashed and CPU rendering took 1:01 to finish.

Now, the problem with that is it’s not much of a normal real-world test. When it comes to rendering, it’s not so much the number of polygons that consumes VRAM (relatively speaking); it’s the shaders and texture maps that eat up GPU memory.

Your 25-million-polygon cube with no textures is nothing compared to a 2-million-polygon scene with 100 2K texture maps.
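A rough back-of-envelope sketch of why (the per-vertex sizes and sharing ratio below are illustrative assumptions, not Cycles internals, and they ignore BVH overhead, normals/UVs, and texture compression):

```python
# Illustrative VRAM estimate: raw geometry vs. uncompressed textures.
# These numbers are assumptions for a rough comparison, not Cycles internals.

tris = 25_000_000                 # the subdivided cube, triangulated
verts = tris // 2                 # rough vertex-sharing assumption
geometry_bytes = verts * 12 + tris * 3 * 4   # float3 positions + 32-bit indices

# 100 uncompressed 2K RGBA textures at 8 bits per channel
textures_bytes = 100 * 2048 * 2048 * 4

print(f"geometry ~{geometry_bytes / 2**30:.2f} GiB")   # well under 1 GiB
print(f"textures ~{textures_bytes / 2**30:.2f} GiB")   # several times larger
```

Even under these generous assumptions, the texture set alone dwarfs the raw geometry, which is why texture-heavy scenes are the usual VRAM killer.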
