GPU too weak?

So I am trying to render a scene that is pretty memory-intensive, with over a million polygons and a big smoke simulation. I cannot render the scene because of this…


What does this mean? Is my GPU simply too weak? Is there a GPU that can handle this much memory, and if so, can you give me the name? I'm concerned because I thought this is what a GPU is for: handling heavy scenes better than the CPU. Please help! Thanks :slight_smile:

How are we supposed to know? It's not like you gave us any real info to work with: CPU, GPU, RAM, OS, Blender version, render settings, full scene specs, and GPU driver version, for starters!

The GPU is usually faster than the CPU, but the CPU uses system RAM, so it's more likely to handle bigger scenes. In Blender's case, GPU rendering also tends to lag behind the CPU in efficiency and supported features.
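If it comes to that, you can flip Cycles between GPU and CPU from the Python console instead of digging through the UI each time. A minimal sketch using Blender's Python API; the property paths below are from the 2.7x-era API, so treat them as assumptions for your version:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Render on the GPU while the scene still fits in VRAM...
scene.cycles.device = 'GPU'
# ...or fall back to the CPU (and system RAM) for heavier scenes:
# scene.cycles.device = 'CPU'

# The compute backend (CUDA/OpenCL) lives in the user preferences; the exact
# path differs between Blender versions, so this assumes the 2.7x API:
cycles_prefs = bpy.context.user_preferences.addons['cycles'].preferences
cycles_prefs.compute_device_type = 'CUDA'  # 'OPENCL' for AMD cards, 'NONE' for CPU only
```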

Haven't checked for a while, but to my knowledge 12 GB is the maximum amount of memory you can buy for GPU rendering. Single-GPU cards with 12 GB like the Titan, or dual-GPU cards with 24 GB total (24/2 = 12 GB per GPU). Adding cards doesn't increase the amount of available memory.
https://docs.blender.org/manual/en/dev/render/cycles/gpu_rendering.html#would-multiple-gpus-increase-available-memory

Those also have limitations on texture sizes and the number of textures.
https://docs.blender.org/manual/en/dev/render/cycles/gpu_rendering.html#supported-features-and-limitations
http://docs.nvidia.com/cuda/cuda-c-programming-guide/index.html#features-and-technical-specifications

I'm guessing you have about 1-1.5 GB of video RAM on your card and you ran into the dreaded out-of-memory error. Larger/more complex scenes will cause this to happen, and eventually you will have to fall back on CPU rendering of the scene. OpenCL, which can also use system memory, could be a solution when rendering on the GPU; other than that, you might want to look into a new GPU. Keep in mind that gaming cards, while their performance is very good, have a smaller amount of video RAM, maybe 3-4 GB, while professional cards are released with huge amounts of video RAM, some up to a TB, but they come at a premium price, usually more than my entire system put together.

Please link to one.

nvm. https://pro.radeon.com/en/product/pro-series/radeon-pro-ssg/

Interesting development. Nice that someone is addressing the issue and also good that it’s AMD for a change. Nvidia and its puny 12GB cards need a smack.

The largest rendering projects are processed in data centers with thousands of CPUs, running Linux. Big studios have their own. For individuals, getting access to such hardware means render farm services; there's also an interesting development related to that: https://www.blendernation.com/2017/09/09/cycles-new-network-rendering-640-threads/

https://www.newegg.com/Product/Product.aspx?Item=N82E16814105088

The Quadro P6000 and the AMD Radeon Vega Frontier Edition both have over 12 GB - 24 GB and 16 GB respectively. Though obviously both are waaay out of the average user's price range, and neither is particularly practical for various reasons.

Yes, haven't looked in a while. The P6000 seems to be a single-GPU card with 24 GB of memory.

A bit more than a typical Nvidia upgrade, but still typical that they milk the prosumer market first and slowly add features to the lower-priced cards, which are still expensive compared to a typical gaming card. An all-new Titan Y with 12 GB of memory and 1 extra byte of GDDR5X for $1,500, and the price doesn't include a working driver. Thanks, Nvidia.

I'm sure there's a better thread you can take this to. I don't see how discussing pro cards is getting the OP any closer to solving the issue at hand.

Well, actually the OP asked for it: “Is there a GPU that can handle so much memory and if so can you give me the name?”

Since we know almost nothing about the OP's scene and its actual memory consumption, nor about the amount of memory his current GPU has, I guess thinking big is completely in order… :wink:

Also, no one has addressed the OP’s basic misconception.

johnnygibbs, if you don't have enough VRAM you can possibly complete the render using the CPU and system RAM. Or if you have the money, then by all means go shopping. :slight_smile:

I am so sorry. It was foolish to not post any specs. My GPU is an NVIDIA GeForce MX150. My computer is an HP Spectre x360 with 16 GB of installed RAM. I think the GPU memory size is 2 GB, but I'm not sure. The scene stats are here, with memory peaking at 1216.90M when rendering…


Thank you all for the help so far. I hope this can help find a better solution.

Also, the scene can be rendered on the CPU. I just wanted to see if I can fix this now, since my projects are getting bigger and more memory intensive.

There's really nothing to fix. A 2 GB GPU won't handle resource-intensive scenes better than the CPU and system memory, because it doesn't have enough resources. 2 GB of VRAM is simply not enough.

To render more memory-intensive projects you need to have more memory. Or optimise the scene so it fits into the roughly 1.5 GB of VRAM that's actually available (the display and OS already take a share of the 2 GB).
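On the optimisation side, Blender's Simplify settings can claw back a lot of memory from subdivision levels and oversized textures. A rough sketch from the Python console; the property names are from the 2.79-era API, so check them against your version:

```python
import bpy

scene = bpy.context.scene

# Simplify has to be enabled for the limits below to apply.
scene.render.use_simplify = True

# Clamp subdivision levels at render time (a big saver on dense meshes).
scene.render.simplify_subdivision_render = 2

# Cycles can also downscale textures at render time; '1024' caps every
# texture at 1024x1024 no matter how large the source image is.
scene.cycles.texture_limit_render = '1024'
```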

Dang… So what's the max memory you can have on a GPU card?

Blender often doesn't show the correct amount of memory it is using. Many times Blender has said it is using 1 GB while the Nvidia settings showed it was actually using 2 GB.
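If you want a number straight from the driver instead of Blender's status bar, NVIDIA's NVML library can report total and used VRAM. A small sketch using the pynvml bindings (this assumes they're installed, e.g. via pip, and that an NVIDIA card is present):

```python
import pynvml  # Python bindings for NVIDIA's NVML (package: nvidia-ml-py / pynvml)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # totals reported in bytes

print("GPU:", pynvml.nvmlDeviceGetName(handle))
print("Total VRAM: %.0f MB" % (mem.total / 1024 ** 2))
print("Used VRAM:  %.0f MB" % (mem.used / 1024 ** 2))
print("Free VRAM:  %.0f MB" % (mem.free / 1024 ** 2))

pynvml.nvmlShutdown()
```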

That may not work the way you think. VRAM is integrated with the card, unlike system memory, which you can upgrade separately up to the supported limit. The amount of VRAM available depends on the card, and once you get it, that's it.

As others noted, getting something “big” is going to cost a lot of money for something that's going to be virtually idle most of the time. If this is just hobby work, then you're probably looking at a card in the 6-8 GB range for cost-effectiveness. If you're still coming up short with that (after optimizing), then it's time to start looking at online render farms for the final pass, and simplified settings for checking. If this is professional work, you'll need to check your budget to see if you can recover the additional cost of the higher-priced cards.

Regardless of whether or not a scene will work on your card, it's still good to get into the habit of optimizing. Maybe not to the nth degree (diminishing returns and all that), but keeping things efficient will reduce problems.

Max memory can go to the point where you need to sell your house, your car, and an arm and a leg to afford one, not to mention the power consumption required to run them. However, most people get by with dual cards like the GTX 1080 Ti. And now that Blender is starting to bring in decent support for AMD, you might find a cheaper solution with one of AMD's comparable cards.