OS X - CUDA - "out of memory in cuMemAlloc/cuLaunchKernel"

Hi

I’ve recently been facing some issues with CUDA.
I hadn’t used Blender for a few months, and a few days ago, when I loaded some of my old scenes, I suddenly got “out of memory” messages from CUDA (cuMemAlloc or cuLaunchKernel).
I have tried to load and render (both viewport and F12) several scenes, and they randomly generate the messages below, even when the scene is not very heavy…
All my scenes were working fine a few months ago; I had never seen such messages before.
I can only suspect that over the last few months CUDA and/or the OS X Nvidia drivers were updated, and one or both of them are causing these problems.
I have an MSI GTX 580 3GB, and that card was working great with OS X for 1.5 years… until now.
I have tested Blender versions from 2.71 to 2.73a; all of them throw “out of memory”. 2.70 crashes at startup, so I couldn’t test that version. All my scenes are set to the Supported feature set / GPU mode.

I have updated to the latest CUDA but still have these problems.

Any clue?

Br,
Rafal

I’m having CUDA crashes too and I have (intentionally) not updated my drivers.

I’m also using an older GPU (GTX 580) with 1.4GB of RAM. I’m finding that scenes which would render fine on the GPU a few months ago constantly crash now.

I’m at a point where my GPU is almost useless for anything but the most basic scene <sigh>.

Yes, same here: scenes that aren’t too big, and that rendered fine on my Mac with CUDA before, now crash or show “out of memory” like you describe… Actually, hardly anything renders on the GPU for me now. I have 2GB on my Nvidia card. Makes me mad, as I spent the extra money to get the 2GB.

OK, so if I’m not alone in having this problem and it’s not a matter of drivers, then it must be an issue in Cycles: something in the engine consumes too much memory, or there is a memory leak in the code.

Yesterday I tested all my scenes, and I got “out of memory” in a scene that used 167MB at peak… 1/20 of the 3072MB of VRAM.

Now, the question is: should we report this problem to someone? :slight_smile:

Hi, can you post such a scene?
I can test the VRAM consumption on Linux; maybe there is a GPU-Z equivalent for Mac so you can test it yourself.
Blender does not show the VRAM usage before you render.
Make sure you are not using the “Experimental” feature set; it needs a huge amount of memory.
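
If you have Python with PyCUDA installed, a minimal sketch like this can ask the CUDA driver directly how much VRAM is free (this assumes PyCUDA is available on your machine and that the GTX card is device 0):

```python
import pycuda.driver as cuda

# Initialise the CUDA driver API and grab the first GPU
cuda.init()
ctx = cuda.Device(0).make_context()  # assuming the GTX card is device 0
try:
    # mem_get_info() returns (free, total) VRAM in bytes
    free_b, total_b = cuda.mem_get_info()
    print("free: %d MB / total: %d MB" % (free_b // 2**20, total_b // 2**20))
finally:
    ctx.pop()  # release the context so other apps can use the card
```

Run it once before rendering and once while a render is failing; if “free” is already tiny before Cycles even starts, something else is holding the VRAM.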

Cheers, mib

Same thing here too: can’t render with the GT 730 2GB, out of memory. I thought it was me doing something wrong… maybe not, for once. Not getting crashes, though. I’m running Windows Pro, an i7 64-bit on an ASUS P7P55-LX mobo with PCIe 2.0 slots, plenty of room. I’m reporting it, but I don’t know how they’ll reproduce the errors without being on my machine. If I can’t get this working this week, I’m sending it back.

Complete disaster. I’m working on a simple object, a watch, 104k faces at the moment.
No materials, no textures, just a mesh and one light. Feature Set: Supported.

Just throwing it out there, guys: I’m having the same problem, but on WINDOWS. After hours of research I’ve found it’s an OpenGL driver that’s causing the problem on a Windows system; maybe it’s the same for OS X. The problem is that the display driver has an issue with an OpenGL call taking too long to do its job, which causes the operating system to reset the graphics card, resulting in the errors being reported. I have since given up on Nvidia, but that’s just me, on a Windows operating system. Hope it helps you guys figure this out on your systems. Oh, one more thing: on WINDOWS it all boils down to what’s called a “TDR” (Timeout Detection and Recovery) event. It’s a timer in the OS (Windows for me): if the graphics card takes longer than 2 seconds to do its work, the OS resets the card = CRASH.
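
For what it’s worth, that timeout is controlled by the TdrDelay registry value, which Microsoft documents under HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers. A minimal Python sketch that raises it from the 2-second default to 8 seconds could look like this (run as administrator; a reboot is needed before it takes effect):

```python
import winreg

# Documented location of the GPU Timeout Detection and Recovery settings
KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

# Open (or create) the key with write access; requires administrator rights
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    # Raise the timeout from the 2-second default to 8 seconds
    winreg.SetValueEx(key, "TdrDelay", 0, winreg.REG_DWORD, 8)
```

Of course, a longer timeout only helps with long-running kernels; it does nothing for genuine out-of-memory errors.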

You are not the only one having this issue… It also happens to me when I try to use particle systems for rendering some grass, which is strange, considering that I’ve just got a GTX 970… Also, I’ve found this, but it doesn’t remove any of my doubts.
https://developer.blender.org/T43200

Hello, I got the same issue on a quad-core + GTX 970. It was working for a while, and then suddenly the error mentioned above appeared.
For now nothing runs with CUDA, and no message displays anymore…
It’s a very strange issue. I’ve only had the Nvidia card since yesterday, and it looks sad; I’m disappointed.
I went into Regedit (W7) and changed the TdrDelay to 8 (as in the sketch above), but nothing changed…

Still getting lots of crashes. The latest one had no hair or particles in it, only 300K vertices with 2 light sources and reduced bounces. It still crashed to a Blue Screen. This time it was after I rendered a 50-sample image: about 20 seconds after the render finished, presumably while the GPU was clearing RAM, the BSOD occurred.

This is getting very frustrating indeed.

I’ve changed no Windows or GPU drivers; the only change is Blender 2.72 to 2.73.

Hello.

With the latest CUDA driver, cuda_6.5.14_mac_64, you can use Blender blender-2.71-OSX_10.6-j2k-fix-x86_64; or revert the drivers to version cuda_6.0.37_mac_64, and then blender-2.73a-OSX_10.6-x86_64 works fine.
Hope it helps.
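
If you’re not sure which CUDA driver version is actually loaded, a quick check from Python (assuming the driver registers its usual kernel extension, com.nvidia.CUDA, as the 6.x drivers on OS X do; the version appears in parentheses in the matching line) is:

```python
import subprocess

# List the loaded kernel extensions and keep only the Nvidia CUDA driver
out = subprocess.run(["kextstat"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if "com.nvidia.CUDA" in line:
        print(line)  # e.g. "... com.nvidia.CUDA (6.0.37) ..."
```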

Hi there, I’ve been having the same issues since I updated to Yosemite. My GTX 470 with 1GB suddenly does not render anything anymore. For Yosemite, I had to install a more recent version of CUDA than the 6.0.37 recommended above.

Did anyone of you find a fix for this in the meantime?

So, I’m in the club too :slight_smile:
A scene I had rendered with 2.69 before now gives the error when I render it with 2.69 again.

I don’t remember any CUDA driver update, so it should still be 6.5.18 as before.
Only OS updates, and only for 10.8.5.

NVIDIA GeForce GTX 780M 4096 MB

The first render layer used 230 MB, and the second is exactly the same except that, to get a correct Z-pass, a simple material is set for the whole render.
So it should use even less, but the calculation stops at “distribution” with memory at 5380 MB :slight_smile:

I have never been so frustrated while working with Blender as I am now… I don’t know what happened to the Cycles memory management, but there is something evil inside the engine that drains the video memory of my GPU. I need an exorcist!

I had a problem on the weekend using the latest Nvidia driver for Linux; rolling back to the previous version seemed* to help.

*There were other issues that contributed to out-of-memory errors after rolling back the driver, but these seemed to happen when GPU RAM was full, as opposed to half full with the latest driver. I see there is a new beta driver available that fixes some bugs; not sure if any of this relates to your problem.

Cheers

Blender release 2.74 seems just as bad for GPU crashes as 2.73, maybe even worse. Simple renders cause crashes now. Getting very depressing.

Same thing here: “out of memory in cuLaunchKernel”. Yesterday I switched from 2x GTS 450 1GB in SLI to a GTX 580 1.5GB and loaded an old scene that rendered fine on the GTS cards. When I tried to render it on the GTX 580, Blender shows the error. I tried Blender 2.74 and 2.73. On version 2.71 the scene renders OK.

Feature Set at Supported or Experimental: neither of them works. It happens on OS X Yosemite 10.10.3.

Sorry to bring up an old(ish) thread. Just wondering if anyone has found a solution to this on OS X? I’m trying to render a really simple scene but getting a CUDA out-of-memory error on an EVGA GTX 980 4GB on OS X 10.10.4. Blender 2.75 reports a peak memory of 550MB when I’m about to start rendering, so I should have a heap of free memory.

No change for me.
Still the same out-of-memory issues on the GPU: Blender 2.75a, OS X 10.10.4, both the internal and the Nvidia drivers…
I haven’t been doing anything in Blender for months because of these problems…