Cycles Render with GTX 550 Ti Not Rendering

Hi All: I have bought a new Nvidia GTX 550 Ti with 4GB and am attempting to render in Blender 2.64. Viewport rendering seems to work just fine with GPU enabled, but I cannot get a final render. Blender runs through the normal steps of building the BVH, writing, etc., but only renders black. When I check the system console I see the following errors (attached as pics). Also attached are pics of what I am trying to render and the viewport render.

Any help is appreciated.


The error messages seem to say that you connected nodes which aren’t supposed to go together… and “somebody” down the line choked on that setup.

I’d look closely at the material nodes of these yellow windows which appear black in the render, or any material which doesn’t render as expected.

IIRC there is no 4 GB GTX 550 Ti, and you have a CUDA “Out of Memory” error.
Is this a high-poly model? Try hiding some parts of your model from rendering in the Outliner.

Cheers, mib.

Disable progressive, use more tiles.

I could not believe it either, but there it is:

If this card really has 4 GB, then check your node setup.
Some hardware reports 4 GB when in reality there is only 1 GB; in that case it might be too many textures going on.
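To get a feel for how quickly textures eat VRAM, here is a rough back-of-the-envelope sketch (not Blender API code; it simply assumes an uncompressed 8-bit RGBA image costs 4 bytes per pixel on the GPU, which is a reasonable approximation for Cycles image textures of that era):

```python
def texture_vram_bytes(width, height, channels=4, bytes_per_channel=1):
    """Rough GPU memory estimate for one uncompressed texture."""
    return width * height * channels * bytes_per_channel

# A single 4096x4096 8-bit RGBA texture:
mb = texture_vram_bytes(4096, 4096) / (1024 * 1024)
print(f"{mb:.0f} MB")  # 64 MB
```

So just a handful of 4K textures, plus geometry and the BVH, can easily exhaust a 1 GB card even though the driver claims 4 GB of “total” (shared) memory.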

I deleted all materials and got rid of the shader error (yeah!), but I still get the CUDA memory error (ugh!) and it won’t render in the viewport with CUDA enabled. I attached my Nvidia card specs for reference. Apparently 1GB dedicated graphics memory… and 4GB total.

Any ideas?

What is considered a high poly count? And how do I find out what my poly count is?


Poly count is a messy question! It depends on what you want and what your computer can do. Now (I hadn’t noticed before, lol, but my info also says 4 gigs of available RAM) I only have 2, AND that’s all I can use, lol. Big textures take up lots of RAM, so if you have lots of big textures they will eat all the RAM!! So if you can, reduce the number and the size of your texture maps.

PS: the “header bar” at the top tells you the poly count.

There is what is called the “London Project” and CC-licensed models of it, especially one called Goose. As I remember, that is a helluva complicated model with “fill 'er up” textures on it. Thanks to Dolf Veenvliet, Ian Hubert, Nathan Vegdahl and others for sharing!
On a 1GB nv9800 it was rendering in Cycles; I did not test all the textures, but it did render.
So it looks like you have something else going on in your file. Too high a subsurf level? What else? What’s the polycount number?
Now the long and boring guesswork begins - either you cut your model apart and try to debug it yourself, or let it be done here.
One more thing - if you have a bunch of small textures on the model, the total number of textures you can use is limited to around 100. But at a guess, that’s not the case here…

Interestingly, I have a similar problem with a 550 Ti, but in my case it is with using HDR maps in Cycles. It starts loading the HDR map and then I get a “connection” error of some type, saying the connection to the GPU has been lost. This is an error from the graphics card, not Blender, but it crashes Blender. Strangely, it seems to be worse with 2.64: I can get the same scene to render sometimes in 2.63, but not in 2.64. And sometimes when I restart Blender it will render. I can’t find a pattern in what is going on.

Maybe I just need to upgrade to the 580 :wink:

I have no textures (images) in use… only plain material nodes. Poly count: attached is a screenshot of my Blender window header (not sure which number is the poly count) and my render settings. Also, viewport rendering experiences the same issues when I try to render with the same layers turned on. I can get viewport rendering for individual layers, and even many combined layers, but when I turn on all the layers I want to render, I get the GPU memory error. Same when doing a regular render. It sure seems like this should work.

Frustrated and cursing Blender :wink:


One other note… I can do viewport rendering with all the layers on with the CPU just fine, just not the GPU. I cannot do a normal render with those same layers in either GPU mode (memory error) or CPU mode (Blender crashes).

You’re running out of memory. If it works in the viewport but not in the final render, make sure you don’t have a modifier like Subsurf or Multires creating a high number of polygons at render time, as these are not taken into account for viewport rendering. Are you working on a laptop? I’ve never seen a discrete graphics card with different amounts of dedicated GDDR and total available memory. 1GB is not a lot of memory, especially for a modern GPU; it will fill up very quickly as soon as you start adding textures.
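To see why a render-time Subsurf level hurts so much: Catmull-Clark subdivision splits every quad face into four, so the face count grows roughly 4x per level. A quick estimating sketch (plain Python, not the Blender API; the base face count is a made-up example):

```python
def subsurf_faces(base_faces, levels):
    """Approximate face count after Catmull-Clark subdivision:
    each level splits every quad face into four."""
    return base_faces * 4 ** levels

base = 50_000  # hypothetical quad count for one object
print(subsurf_faces(base, 2))  # viewport at level 2 -> 800,000 faces
print(subsurf_faces(base, 4))  # render at level 4 -> 12,800,000 faces
```

So a mesh that looks harmless in the viewport at level 2 can demand 16x as many faces at render time at level 4, which is exactly the kind of jump that fits in CPU RAM but overflows a 1 GB card.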