GPU Rendering with Cycles - Complete Guide

Yes, Windows does recognize both cards. Make sure you install your drivers, or at least updated ones if available for your cards. So far with my older GTX 580 Classified and my GTX 780 Ti, the same drivers work for both. Windows doesn’t have a problem with that.

Don’t quote me on this, but I think you can even use a cheap ATi card only for displays, and then have an unconnected-to-displays Nvidia card just for rendering. (Could possibly be conflicts with drivers, be careful).

I haven’t had any problems switching the displays from card to card. Windows just does it, no need to tell any software to do anything. Obviously you’ll get a black screen when you unplug the displays, for me the screen came back on pretty much as soon as I plugged the cable into the other card.

I should also add that with the Nvidia cards at least, when you use multiple cards, through the Nvidia panel you can select one of the cards as the PhysX processor for games or applications that support it, so you can offload some computation from the display card to the secondary card. At least when I do play some games, in some way both cards are being utilized.

You will see all tiles being rendered together, even from the unconnected card. When I render something using Cycles GPU-CUDA setting, with one card I would see only one tile rendering, with both cards I see two tiles rendering, and overall the scene renders much quicker because of this. You do have to set Blender to use both cards, and I believe it is recommended that you DO NOT use SLI for this.

I should also note that when I used to have just the GTX 650 SC (display card) and GTX 580 Classified (render card), that when I used them both together with the BMW Benchmark, it actually took much longer to complete. The 650 slowed everything down because Cycles doesn’t combine GPU from both cards to render a single tile, instead it queues each tile to the availability of either card – in this case the 580 did most of the work because it completed its tiles much faster. And while the 580 finished rendering most of the scene in a few seconds (rendering 3 of the 4 tiles) I still had to wait about 2 minutes for the 650 to finish its one and only tile.

What this means is that with multi-GPU, your overall render will probably only be as fast as your weakest card can finish rendering its tile(s).
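The queuing behavior described above can be sketched with a toy scheduler. This is only an illustration with made-up per-tile times, not Cycles' actual scheduling code: each tile simply goes to whichever card is free, so the total render waits for the slowest card to finish its last tile.

```python
import heapq

def simulate(per_tile_seconds, num_tiles):
    """Greedy per-tile scheduling: each tile is handed to the card that
    becomes free earliest. Returns the time the last tile finishes."""
    # Heap of (time_when_free, card_index)
    heap = [(0.0, i) for i in range(len(per_tile_seconds))]
    heapq.heapify(heap)
    finish = [0.0] * len(per_tile_seconds)
    for _ in range(num_tiles):
        free_at, i = heapq.heappop(heap)
        done = free_at + per_tile_seconds[i]
        finish[i] = done
        heapq.heappush(heap, (done, i))
    return max(finish)

# Hypothetical numbers: a fast card at ~10 s/tile (580-ish) paired with a
# slow card at ~120 s/tile (650-ish), 4 tiles. The fast card clears three
# tiles in 30 s, but the render still waits ~120 s for the slow card's
# single tile -- slower than the fast card rendering all four alone.
print(simulate([10.0, 120.0], 4))  # 120.0
print(simulate([10.0], 4))         # 40.0
```

This matches the BMW-benchmark anecdote above: adding a much slower second card can make the whole render take longer, because no single tile is ever split across cards.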

My original idea for using a weaker, cheaper card wasn’t to increase rendering speed (I already knew it would be too slow), but to run three concurrent displays (and to solve a CUDA error with my 580). It was simply a bonus that offloading my displays from my 580 allowed it to render significantly faster, since it’s now dedicated solely to rendering.

And one more note. While my 580 did see a significant increase in render speed on the BMW bench (from 48 seconds to 30) when it was no longer being used to drive displays, this trick didn’t work as well with my 780 (from 47 seconds to 42). So depending on the series of card you end up with, you may or may not see any real increase by offloading display duties.
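For reference, the relative gains from the numbers quoted above work out like this (simple arithmetic on the poster's own benchmark times):

```python
def speedup_pct(before_s, after_s):
    """Percent reduction in render time, rounded to one decimal place."""
    return round(100 * (before_s - after_s) / before_s, 1)

# GTX 580: 48 s -> 30 s once displays were offloaded (a big win).
print(speedup_pct(48, 30))  # 37.5
# GTX 780: 47 s -> 42 s (a much more modest win).
print(speedup_pct(47, 42))  # 10.6
```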

Hi!!! I am a newbie, but does Blender work with these graphics cards?

3 GB NVIDIA® Quadro® K4000 (2DP & 1DVI-I) (2DP-DVI & 1DVI-VGA adapter)
4 GB NVIDIA® Quadro® K5000 (2DP & 1DVI-I & 1DVI-D) (2DP-DVI & 2DVI-VGA adapter)

Which is the best?

Thank you very very much!!!
Carlos

Hey, I made some tests regarding different versions of Blender.
And what I found holds for every version of Blender:
The stand-alone version from the homepage immediately gives me GPU options, but if I install Blender through the terminal I only have my CPU to work with.
My question now is: why???
Greetings

!!!I would like to post this as a warning!!! On (CUDA) GPUs (at least; maybe CPU and AMD as well) you can only dedicate at most half of your VRAM to geometry/textures! If, for example, your geometric data exceeds half of your VRAM, the render will fail. Apparently there is a fixed 50/50 split between textures and geometry. I didn’t know this, and it caused me a lot of frustration and bewilderment. I post it here so that others do not fall for it as well!!!
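The rule of thumb above can be sketched as a quick budget check. This reflects the poster's observed behavior, not a documented CUDA limit, and the fixed 50/50 split is the assumption being modeled:

```python
def fits_in_vram(vram_mb, geometry_mb, textures_mb, split=0.5):
    """Assume a fixed split of VRAM between geometry and textures:
    each category must fit within its own share of the card's memory."""
    budget = vram_mb * split
    return geometry_mb <= budget and textures_mb <= budget

# A 2 GB card with 1.2 GB of geometry fails even though ~0.8 GB is
# nominally still "free" overall:
print(fits_in_vram(2048, 1200, 300))  # False
print(fits_in_vram(2048, 900, 300))   # True
```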

Hello Blenderartists: A preface: I am very new to Blender and I would also say that my computer IQ is well below the median here. All that being said, I am very committed to becoming an advanced user, so please bear with me, and if you can help me out please reach out.

Computer info: I have a Dell Precision T3610 with 8GB (4x2GB) 1600MHz RAM, an NVIDIA Quadro K600 graphics card, and a 500GB ATA drive. (If I am leaving out something please let me know and I will provide it!)

I am just beginning to work with the node editor and Cycles. My problem is I have two monitors running off the NVIDIA card. When I am in Blender (using Cycles to render) and switch to Rendered view mode, my screens go black for a second and Blender crashes on me… I am guessing that the GPU can’t do all this at the same time. Therefore, before I sell a kidney to buy another GPU card… Is this really what is going on, or is it something else? Is there a way around this? I feel as though I am leaving something out, but I don’t know what. Anyone able to help me?

Hi Blenderartists:

I just started to work in Blender less than a month ago. My computer IQ is far below the median here, so I hope that you all can bear with me and provide me with some guidance…

Computer info (taken directly from the packing slip): I have a Dell Precision T3610 with 8GB (4x2GB) 1600MHz DDR3, a 1GB NVIDIA Quadro K600, and a 500GB 3.5-inch Serial ATA (7200 rpm) hard drive, running Windows 64-bit.

I also currently have my computer hooked up to two monitors using the graphics card.

My problem is that when I go into Blender (using Cycles to render) and turn on rendered image mode, about 5 seconds in the screens go black for a second and then Blender crashes.

The error I get is:
“NVIDIA OpenGL Driver: The NVIDIA OpenGL driver detected a problem with the display driver and is unable to continue. The application must close. Error code: 3. Would you like to visit http://www.nvidia.com/page/support.html for help?”

I have updated the driver off the NVIDIA site but still have the same problem. I am guessing that this is because my little GPU can’t handle the screens and the render all at once. Is this a correct assumption? If so, is there a way around this? Or do I need to buy and install a new GPU card? I have read the guide at the beginning of this post but I am still a bit puzzled…

Help?


Hi,
So… I’ve been trying to get Blender to detect my GPU with no result whatsoever. I can’t get my head around what the problem is.
The specifications and info from this thread say that it shouldn’t be a problem, so my guess is that I have major brain damage, but I’m asking you in case I don’t.
I’m running Windows 8, 64-bit, with the latest driver (337.88) on a:
GeForce GT 640M LE
I’ve set the GPU to all its standard settings, but with no luck.

Is there anyone here who can relate to the problem or, even better, has a solution?
…or is it time to call the boys in white :wink:

/ Sweed

Good to know. So if you had a three-SLI card config and you wanted speedy viewport rendering while making your scene and then speedy rendering for the final product you would have to change your settings from SLI to non-SLI? This can be done? Is 3-tile GPU rendering possible or does it need to be multiples of 2? Thanks.

Edit: enabling SLI may not even be necessary to do multi-GPU CUDA. Still researching…

For what it’s worth, I did the Mike Pan BMW test with my rig and this is what I got:

The link to the article is bad. Gives a 404

OOPS, already posted

Complete noob here. I don’t know what I’m doing, but here is what I’ve observed, and it has me baffled. With all of my models and most of the ones I’ve gotten from the Internet, my new Nvidia GTX 660 doesn’t appear to be doing the rendering. When I select “Rendered” in “Viewport Shading”, it takes the same amount of time whether I have “None” or “CUDA” selected in “User Preferences”. For instance, it takes 5 seconds to render the default cube that you get when opening Blender. However, the BMW test model and a couple of other models I found on the Internet are definitely using the GTX 660. Could someone please point me in the right direction? Thanks much.

Mike Fields

I’ve done an update now that I’ve installed all four of my Titans. I’m sure with some tweaking of the settings, and if I overclock, I can get this even faster.

I found my problem. Click on the camera button (Render) and select Device: GPU Compute. Thanks.

Mike Fields

I managed to build a new desktop with dual GTX 980s, and the problem is that Blender won’t render on the two 980s or on a single 980. Does anybody have a solution to this?

I’ve already tried the “set to Experimental” thing, but it still doesn’t work.

To the host: broken link!!!

I got a 500 internal server error when trying to access systemagnostic. The information available about the best GPU for Blender is confusing. My understanding is that GPU rendering is faster than CPU with Cycles, but I haven’t found this to be the case. I wouldn’t mind spending a little for a faster card, but I have no clue what would be best for my Windows machine.

GPU rendering is supposed to be faster - a lot faster - with Cycles, as it normally is with any other unbiased raytracing engine. More VRAM is better, i.e. 6GB is far better than 1GB, but speed is really based on how many CUDA cores the card has. Generally, the more GB your card has, the more CUDA cores it should have as well. And Cycles loves CUDA cores!

But it isn’t simply a case of getting a better-grade card. You must also ensure that the rest of your computer’s components are compatible with it. I have four GPUs, and the obvious component that needs to match is a PSU that can power them as well as all the other components.