I recently upgraded my PC with a GTX 580 with 1.5 GB of VRAM (previously I used 2x GTS 450 1 GB in SLI) and I'm having weird problems rendering in Blender with Cycles on the GPU.
I created a scene and used the Branched Path Tracing integrator; with the GTS cards everything worked fine. After switching to the GTX 580 1.5 GB I get this CUDA error almost every time:
CUDA error: Out of memory in cuLaunchKernel(cuPathTrace, xblocks , yblocks, 1, xthreads, ythreads, 1, 0, 0, args, 0)
It happens on both OS X and Windows 7 x64. I tried installing the newest NVIDIA drivers - still getting the error; I downgraded to the v301.xx drivers - still no luck. Reinstalling the CUDA drivers on OS X didn't help either.
Without knowing much more about your project: do you get the same error when you turn off Branched Path Tracing and just use normal Path Tracing?
When you rendered previously on your GTS cards, were they actually doing the rendering, or did Blender fall back to your CPU? Have you tried a different Blender project with a very simple scene? I doubt 500,000 tris is too much for 1.5 GB of VRAM, but you never know - perhaps try rendering just a cube with all of the same settings and see if that works. At least you'll start to eliminate some issues.
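The "500,000 tris shouldn't be too much" hunch can be sanity-checked with a back-of-envelope estimate. A minimal sketch - the bytes-per-triangle figure here is a rough assumption for illustration, not Cycles' actual internal layout:

```python
# Back-of-envelope VRAM estimate for a scene like the one above.
# The per-item byte costs are rough assumptions, not Cycles internals.

def estimate_vram_mb(tris, textures_px=0, bytes_per_tri=144, bytes_per_px=4):
    """Rough GPU memory estimate: triangle/BVH data plus
    uncompressed 8-bit RGBA textures."""
    geometry = tris * bytes_per_tri          # verts, normals, BVH overhead
    textures = textures_px * bytes_per_px    # RGBA, 1 byte per channel
    return (geometry + textures) / (1024 * 1024)

# 500,000 triangles with no textures: well under 1.5 GB.
print(round(estimate_vram_mb(500_000)))   # ~69 MB
```

Even with generous per-triangle overhead, untextured geometry alone is tens of megabytes, so textures, render buffers, or the integrator are likelier suspects.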
Do other people have benchmarks for this card and CPU combination?
Have you tried running any other GPU-intensive programs (games, online video streaming, etc.) to see if they show playback or quality issues? That might point to a faulty card, install, or motherboard/GPU combination - with custom PCs there are always a lot of variables.
IkariShinji - I also wrote a post in that thread. In my opinion it's a bug in Blender or Cycles with the compute capability 2.0 CUDA kernel (the GTX 580 is CC 2.0, the GTS 450 is CC 2.1), as specified here: https://developer.nvidia.com/cuda-gpus
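To make the per-card distinction concrete: Cycles compiles a separate CUDA kernel per compute capability (NVIDIA's `sm_XX` naming), so a bug can hit one card family and not another. A small sketch, with the capability values taken from NVIDIA's list linked above:

```python
# Compute capabilities of the cards discussed in this thread,
# per NVIDIA's table at https://developer.nvidia.com/cuda-gpus.
COMPUTE_CAPABILITY = {
    "GTS 450": 2.1,
    "GTX 560 Ti": 2.1,
    "GTX 580": 2.0,
}

def kernel_arch(card):
    """Map a card to the CUDA kernel architecture Cycles builds for it."""
    major, minor = str(COMPUTE_CAPABILITY[card]).split(".")
    return f"sm_{major}{minor}"

print(kernel_arch("GTX 580"))   # sm_20
print(kernel_arch("GTS 450"))   # sm_21
```

So the GTS 450 and GTX 580 run different kernel binaries, which is consistent with a bug affecting only the sm_20 build.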
doublebishop - I didn't know that Branched Path Tracing takes up more memory; I'll remember that. However, on the 2x GTS 450 1 GB everything was OK. Even in SLI they only have 1 GB of available VRAM each - less than the GTX 580's 1.5 GB - and they still worked.
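For intuition on why Branched Path Tracing costs more memory: it splits each camera sample into several branch samples per bounce, so the kernel keeps more per-pixel state in flight for a tile. A purely illustrative sketch - the branch multiplier below is an assumption, not a number from Cycles:

```python
# Illustration of why Branched Path Tracing needs more memory than
# plain Path Tracing: more rays (and their state) active per tile.
# The multiplier is an assumed example value, not Cycles internals.

def rays_in_flight(tile_x, tile_y, branches=1):
    """Rays the GPU keeps active for one tile of the render."""
    return tile_x * tile_y * branches

plain    = rays_in_flight(256, 256)               # one ray per pixel
branched = rays_in_flight(256, 256, branches=4)   # split into 4 branches
print(branched // plain)   # 4x more state per tile
```

This also hints at a workaround: smaller tiles mean less state in flight, which can get a borderline scene under the VRAM limit.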
And another test with an almost blank scene but identical render settings - still crashing:
MattyZ - normal Path Tracing works fine, see above. When I was using the GTS cards everything worked and they were utilized - no fallback to the CPU. With two GPUs I had two threads rendering the scene; I verified that because render times were shorter than on the CPU.
As for testing the GTX 580 itself: I ran some tests under OS X (Cinebench, Unigine Heaven, etc.) - no artifacts, smooth FPS. Under Windows 7 x64 the situation was identical: games, benchmarks, and CUDA/PhysX tests all ran with no lag, smooth FPS, and no artifacts.
I sold my GTX 560 Ti with 1.28 GB because I couldn't even render the default cube with the experimental kernel.
They are working on splitting the Cycles kernel, which should reduce its memory usage, but it still needs a lot of work.
A 1.5 GB card for both display and rendering is too small these days anyway.
Buy a small card (a GT 620 or so) for display and use the big card for rendering only; this saves 300-400 MB for rendering.
Also close all other windows - a browser alone can take hundreds of MB.
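The savings from a dedicated render card are easy to put in numbers. A rough budget sketch - the display and browser figures are the estimates from the advice above, not measured values:

```python
# Rough VRAM budget for a 1.5 GB GTX 580. The display/app costs are
# the ballpark figures mentioned above (300-400 MB for display,
# hundreds of MB for a browser), not measured values.
TOTAL_MB = 1536

def free_for_render_mb(display_mb=0, apps_mb=0):
    """VRAM left for Cycles after display and other apps take their share."""
    return TOTAL_MB - display_mb - apps_mb

shared    = free_for_render_mb(display_mb=350, apps_mb=200)  # card drives monitor too
dedicated = free_for_render_mb()                             # GT 620 handles display
print(dedicated - shared)   # 550 MB reclaimed for rendering
```

On a card this size, reclaiming roughly a third of total VRAM can be the difference between a render completing and an out-of-memory error.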