I’m running into a really bizarre situation with a scene I’m working on. I can render my scene in the 3D viewport just fine, but the moment I hit the “render” button the whole thing crashes with a “CUDA error: Out of memory in cuMemAlloc()” error. I have no idea why: everything is broken up into individual render layers, and the memory stats for the render peak at 101.1M (which my card has definitely handled before). All of my models are linked in from other files, so I don’t know an exact poly count, but it’s a product render, so I know the counts are nothing outrageous. I have 5 or 6 4K JPG textures, but they’re all between 1 MB and 5 MB each on my hard disk. I’m running an NVIDIA GeForce 560 Ti.
Note that the GPU (I believe) uses uncompressed textures, so their size in VRAM will be much larger than the original JPG files on disk.
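As a rough back-of-the-envelope sketch (assuming Cycles unpacks each texture to 8-bit RGBA, which is an assumption on my part), a single 4096×4096 texture costs about 64 MB of VRAM regardless of how small the JPG is on disk:

```python
# Rough VRAM cost of one uncompressed 4K texture,
# assuming 8-bit RGBA storage (4 bytes per pixel) — an assumption.
width, height, bytes_per_pixel = 4096, 4096, 4
size_mb = width * height * bytes_per_pixel / (1024 ** 2)
print(f"{size_mb:.0f} MB per texture")  # -> 64 MB
# Six such textures: roughly 384 MB, before geometry, BVH, and render buffers.
```

So five or six of those JPGs alone could claim a few hundred MB on a card like the 560 Ti.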
Check for modifiers that only take effect at render time (e.g. a Subdivision Surface modifier whose render level is higher than its viewport level); see the script sketch after this list.
Check for objects that are only visible at render time (the camera icon in the Outliner).
Simplify the scene/materials/textures in steps to narrow down the cause of the error.
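If it helps, here is a minimal script sketch for the first two checks. It assumes the Blender 2.8+ Python API property names (older versions use obj.hide instead of hide_viewport); run it from Blender’s text editor or Python console:

```python
# Sketch: flag modifiers and objects that behave differently at render time.
# Assumes Blender 2.8+ property names (hide_viewport / hide_render).
import bpy

for obj in bpy.data.objects:
    for mod in obj.modifiers:
        # Modifiers enabled for render but disabled in the viewport.
        if mod.show_render and not mod.show_viewport:
            print(f"{obj.name}: modifier '{mod.name}' only applied at render")
        # Subdivision Surface often subdivides more heavily at render time.
        if mod.type == 'SUBSURF' and mod.render_levels > mod.levels:
            print(f"{obj.name}: Subsurf renders at level "
                  f"{mod.render_levels}, viewport at {mod.levels}")
    # Objects hidden in the viewport but still rendered.
    if obj.hide_viewport and not obj.hide_render:
        print(f"{obj.name}: hidden in viewport but visible at render")
```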
Blender does not show the real VRAM usage, so you should monitor VRAM from an external application: GPU-Z on Windows, or “watch -n 2 nvidia-smi” on Linux.
For example, open the scene in Blender alongside GPU-Z and note the VRAM in use before you start the render. Then start the render and watch for the peak VRAM reached in GPU-Z.
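If you’d rather capture the peak automatically, here is a small polling sketch that wraps nvidia-smi’s standard query flags (it assumes nvidia-smi is on your PATH; stop it with Ctrl-C once the render finishes):

```python
# Sketch: poll nvidia-smi every 2 seconds and record the peak VRAM used.
import subprocess
import time

peak = 0
try:
    while True:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        used = int(out.strip().splitlines()[0])  # MiB used on the first GPU
        peak = max(peak, used)
        print(f"used: {used} MiB, peak: {peak} MiB")
        time.sleep(2)
except KeyboardInterrupt:
    print(f"Peak VRAM observed: {peak} MiB")
```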
How much memory does that 560 Ti have?
I think I was able to narrow it down to a texture issue after all. I played around with different layers until I found the one that crashed it, and now I think it’s working OK. I’ll have to use an external app to monitor my GPU for now; I wish Blender gave more feedback when it comes to GPU errors.