Why am I getting a CUDA out of memory error when the scene is only half my VRAM?

Hey guys,

I'm getting a CUDA out of memory error when I try to render my scene.
The scene is 7 GB (peak), which is way less than my video card's memory (an 11 GB GTX 1080 Ti).

Am I missing something here? Shouldn't my 11 GB card be able to render a 7 GB scene without running out of memory?

I'd say definitely maybe.

The peak value Blender reports does not work well for GPUs. You should observe memory consumption with an external program, e.g. GPU-Z on Windows or nvidia-smi on Linux.
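For example, here is a minimal way to poll actual VRAM usage while a render runs (a sketch assuming nvidia-smi, which ships with the NVIDIA driver, is on your PATH):

```python
import subprocess
import time

# Poll real VRAM usage once per second while Blender renders.
while True:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader",
    ])
    print(out.decode().strip())
    time.sleep(1)
```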

Also, some of the available memory is used for displaying the desktop. On Windows 10 it is consumed from all graphics cards because of WDDM. A 4K display takes a good chunk of memory, and other apps take some too, e.g. web browsers.

Someone correct me if I’m wrong, but those 7 GB are only what Blender allocates. On top of that, the driver allocates more memory for the CUDA kernel, and that’s per CUDA core. While the Cycles kernel is only just over a megabyte, on a GPU with over three thousand cores, that goes into multiple gigabytes.
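Taking those figures at face value (both inputs are illustrative assumptions, not measured values), the back-of-the-envelope math looks like this:

```python
# Rough estimate using the figures from the post above.
kernel_mb = 1.2      # "just over a megabyte" per core (assumption)
cuda_cores = 3584    # CUDA core count of a GTX 1080 Ti
overhead_gb = kernel_mb * cuda_cores / 1024
print(f"~{overhead_gb:.1f} GB of driver-side overhead")  # ~4.2 GB
```

If that estimate is anywhere near right, it would explain how a 7 GB scene can exhaust an 11 GB card.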

Neither Windows nor NVIDIA provides any way to monitor this extra kernel memory usage.

Help yourself - diagnose using GPU-Z

I ran into a similar problem recently. I was working on a relatively huge scene and rendering without problems, but once the scene got big I hit a CUDA error. After a bit of searching I found advice to update the driver, which I did, but the problem persisted (all this on Ubuntu with Blender 2.79, which doesn't have the option of rendering with CPU + GPU).
I didn't have enough time to keep searching, so I switched to Windows and finalised the work.

Yes, that's what I have been doing. It was a commercial project, so I didn't have time to figure it out and just set it to CPU.

I was wondering: in GPU+CPU mode, can you still get a CUDA out of memory error? Or will the CPU just take over as soon as your GPU runs out?

I am working with projects that exceed my GPU RAM quite a lot (I have a GTX 970M with 3 GB), and with the 2.79.5 nightly build and hybrid GPU+CPU (OpenMP) turned on it is no problem, but I had to uninstall the GeForce Experience software. So yes, the CPU takes over as soon as the GPU runs out. By the way, tiles are set small (16x16) with GPU+CPU, the same as with CPU-only rendering.
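For reference, here is roughly how those settings look from Blender's Python console in a 2.79-era build (a sketch; the property names match the 2.79 API and may differ in other versions):

```python
import bpy

scene = bpy.context.scene

# Render with Cycles on the GPU; on hybrid builds the CPU joins in too.
scene.cycles.device = 'GPU'

# Small tiles, as mentioned above: with CPU+GPU the CPU-friendly
# 16x16 tile size is used for both device types.
scene.render.tile_x = 16
scene.render.tile_y = 16
```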

Yeah, that's what I thought. I'm really happy that CPU+GPU has been introduced. It makes a big difference!

Thanks!

Yeah, it is a blast! For me it makes a significant difference since I don't have that powerful a GPU (and yes, it is way faster than CPU alone), and even with powerful cards like a 1080 Ti with 11 GB, the GPU alone might not be enough for some huge projects (archviz exteriors, for example). By the way, in Blender 2.8 CPU+GPU is standard, so that's that.

Cheers

Ah, I didn't know that. Good to hear! I have a 16-core Threadripper, so that should make a big difference.

Do you know when Eevee comes out officially?

Officially no one knows yet, but the beta is pretty close now. I am dropping Blender 2.79 as soon as the beta goes live.
And Threadripper looks really promising; I have heard that from a couple of sources, so I am switching to AMD soon as well.

Here are two different things in this thread that can be confusing.
Blender from master (buildbot builds) has two new features: the possibility of hybrid CPU+GPU rendering in Cycles, and the possibility for NVIDIA GPUs to use system RAM in some circumstances when GPU vRAM is not enough (I understand that AMD and OpenCL were always able to do this). It is not the CPU+GPU feature itself that will solve your CUDA out of memory problem; it is the other feature, which is also only found in master builds.
The render time gain from CPU+GPU is not always so advantageous for every combination of hardware. It is usually most advantageous when the CPU and the GPU each have a similar render time on their own.
And with the use-system-RAM-when-vRAM-is-not-enough feature there is a render time penalty, so the optimal thing would be to optimize your scene so that it fits into vRAM. I understand that you can still get a CUDA out of memory error even with this feature, because not all data can be kept in system RAM (I'm not sure exactly how it works).
And as others have said, monitor Blender's and the system's total vRAM usage with an external program like GPU-Z.
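To make the CPU+GPU part concrete, here is a minimal sketch of enabling all CUDA devices from Blender's Python console (the property path matches the 2.79-era Cycles add-on preferences; treat it as an assumption for other builds):

```python
import bpy

# Cycles device preferences live in the cycles add-on (2.79-era API).
prefs = bpy.context.user_preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'

# Enable every detected device; on hybrid-capable builds the CPU is
# listed here too, and ticking it is what turns on CPU+GPU rendering.
for device in prefs.devices:
    device.use = True

bpy.context.scene.cycles.device = 'GPU'
```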

I've been having a somewhat similar problem, but even worse, I think.
A 7 MB scene is crashing Blender on my 8 GB GTX 1080.
This occurs with both 2.79b and the 2.8 Alpha.
When I render with the CPU it renders fine.

Well, I am not entirely sure how CPU+GPU works either, but I am sure that it worked and helped me in my case (my GPU alone couldn't handle a scene), and I am sure that it was significantly faster than CPU alone. Splitting_Atoms should probably test it; it won't do him any harm.

Blender should never, ever crash - out of memory or not. Report a bug at developer.blender.org. Include all the info you can about your system, and the offending file. For best results, remove everything from the file that isn't needed to reproduce the crash.

Have any of you tried rolling back to a former driver that worked flawlessly?

Is that in the official release version? Have you tried GPU-only with the buildbot versions?
https://developer.blender.org/D2056

Well, at first I used the latest official release (where CPU/GPU is not supported) and got CUDA out of memory issues, so I was stuck with the CPU until someone pointed out the nightly build with CPU/GPU. I downloaded that experimental build and happily started hybrid rendering, until I got different CUDA errors in the middle of a render. I fixed that by removing the NVIDIA GeForce Experience software (someone pointed that out on Reddit), and since then I have been able to render with CPU/GPU with no errors whatsoever. But if you are implying that they might have enabled a feature that allows the GPU to use system memory when GPU memory is not enough (which is pretty much what that posted link suggests), that I didn't try, because I didn't know about it :). I am definitely going to try it out on a scene that I successfully rendered with CPU/GPU.
Thanks for pointing that out!

If removing GeForce Experience solved your problem, then that's what I will do the next time I get the CUDA problem. @Splitting_Atoms asked about the possibility of running out of memory even when using the CPU + GPU option: yes, it happened to me, but an NVIDIA driver update solved it. Even after that, though, I had a rendering problem where it stopped in the middle of rendering some scenes containing high-poly models (they weren't visible in the view, but still caused the problem). Thank you for sharing.

Perhaps removing GeForce Experience helped because the card was in fact recording for ShadowPlay or some such? Blender is, after all, a 3D application.