Render failed: system is out of GPU and shared host memory

Hello,

I tried rendering a still image from my scene today, and got the error message shown in the title.

I’m using the Cycles render engine, Experimental feature set, GPU compute. My hardware is an Intel i7-3770K @ 3.50 GHz, 16 GB DDR3 RAM, and an NVIDIA GeForce GTX 1070 Ti (8 GB VRAM, 8 GB shared GPU memory, 16 GB total GPU memory).

Any help is appreciated. Thank you.

Hi.
Adaptive subdivision can sometimes use a lot of memory. You can use GPU-Z to monitor the real VRAM usage while rendering.
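
If you want to check or change the relevant settings from Blender's Python console instead of the UI, here is a minimal sketch; it assumes a recent Cycles build, and the exact property names can differ between Blender versions:

    import bpy

    scene = bpy.context.scene

    # Switch Cycles back to the Supported feature set, which turns off
    # adaptive subdivision (an Experimental-only feature).
    scene.cycles.feature_set = 'SUPPORTED'

    # Alternatively, stay on Experimental but raise the dicing rate so
    # adaptive subdivision generates fewer micro-polygons and uses less memory.
    # scene.cycles.dicing_rate = 2.0  # default is 1.0; higher = coarser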


Thanks. I turned off the Experimental feature set and it finished rendering. I also found that my CPU was disabled in the System > CUDA device settings. Not sure if that had something to do with it.

That setting is for hybrid rendering, i.e. rendering with the CPU and GPU at the same time (when the GPU device is selected in the Render tab). It is useful when you have a CPU and a GPU whose individual render times are similar. If one of the devices is much faster than the other, hybrid rendering is not recommended; you would not get much better render times.
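
For reference, the devices Cycles uses can also be toggled from Python. This is only a rough sketch assuming CUDA and the current layout of the cycles add-on preferences; adjust it to your setup:

    import bpy

    prefs = bpy.context.preferences.addons['cycles'].preferences
    prefs.compute_device_type = 'CUDA'
    prefs.get_devices()  # refresh the device list

    for device in prefs.devices:
        # Enable only the CUDA GPU; leave the CPU off so there is no hybrid render.
        device.use = (device.type == 'CUDA')
        print(device.name, device.type, device.use)

    # The scene itself must also be set to GPU Compute.
    bpy.context.scene.cycles.device = 'GPU'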


If you are using Blender on Windows, enabling virtual memory (the page file) on your disk can sometimes help. The system may use virtual memory to back your GPU memory and shared memory so the render can keep working.
If the error is something like “CUDA context or illegal address…”, switching off the OptiX denoiser can fix it.
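
If you prefer to switch the denoiser without hunting through the UI, a small sketch for recent Blender versions (3.x); in older versions the denoising options live on the view layer instead, so treat the property names as an assumption:

    import bpy

    scene = bpy.context.scene

    # Use OpenImageDenoise instead of the OptiX denoiser.
    scene.cycles.denoiser = 'OPENIMAGEDENOISE'

    # Or disable render denoising entirely.
    # scene.cycles.use_denoising = False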

Doesn’t Windows use about 2× your RAM for virtual memory by default?
At least that’s what I can see in my settings, which I have never changed.

I just came across the same situation, where I ran out of GPU memory because of a million subdivisions. I realized I had the viewport set to final render in Cycles. I swapped it to the Eevee viewport and it cut the GPU memory usage from 16+ GB down to about 4 GB, which gave me enough memory to actually render my image.
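
In case anyone wants to script that step, the 3D viewport can be dropped out of Rendered shading before starting the final render, so Cycles frees its preview data first. A small sketch using the current bpy property names:

    import bpy

    # Switch every 3D viewport in the current screen out of Rendered shading.
    for area in bpy.context.screen.areas:
        if area.type == 'VIEW_3D':
            for space in area.spaces:
                if space.type == 'VIEW_3D':
                    space.shading.type = 'MATERIAL'  # or 'SOLID'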