Dual Nvidia Quadro K5200 Not Good Enough?

I have read about this topic in several forums, and I have tried to read them all and follow every link of information I could find so far. It just seems strange to me that I am having this issue at all.

I have a dual-Xeon x64-based system with Windows 8.1 Pro 64-bit and two Nvidia Quadro K5200 cards (compute capability 3.5) with the latest drivers (416.78). I have Blender 2.78 64-bit and Blender 2.80 beta 64-bit installed, and neither will show my graphics cards under the Cycles CUDA devices, just “No compatible GPUs found”.
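In case it helps anyone reproduce this, a minimal check from Blender’s Python console might look like the sketch below (it assumes the 2.80 preferences API; in 2.78 the path would be bpy.context.user_preferences instead):

```python
import bpy

# Print whatever compute devices the Cycles add-on can see (Blender 2.80 API).
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'
prefs.get_devices()  # refresh the device list

for d in prefs.devices:
    print(d.type, d.name)
```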

At this point, I am not sure what else to do. I have read (in the GPU Rendering section of the manual) about deleting the local CUDA kernel/host config file (but I think I would need to create my own Blender build for that?) or creating a build with a newer CUDA developer toolkit, but these are steps I am not sure I should pursue at this point. Somewhere else I read that installing the CUDA toolkit could help, but my C: drive is nearly full and the toolkit is huge and meant for developers only.

Looking in the Nvidia Control Panel, I see where I could Maximize 3D performance by enabling SLI and PhysX(?) between the two cards. I have ordered an SLI connector and will give that a try Monday.

These Quadro cards cost several thousand dollars when purchased years ago, and it seems they should still be good to go for Blender GPU rendering.

Does anyone know how to get Blender to “see” these cards? Should I combine the cards with SLI or not? What is Nvidia’s PhysX and does it help?

Thanks for any answers!

SLI will not help you, unfortunately; SLI is really only used for games. Your cards are based on the Kepler chip, so they should be supported. It’s probably that your Cycles build doesn’t have a kernel for compute capability 3.5. You could compile your own kernel for that version, or there will probably be someone here who could do it for you. Then you just need to copy it into your Blender installation and you should be good.
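If you do go the custom-kernel route, one quick sanity check is to see which precompiled kernel cubins your install already ships. A minimal sketch (assuming the usual 2.7x/2.8x install layout for the cycles add-on; the install path here is just a placeholder):

```python
import glob
import os

# Placeholder install path - adjust to wherever Blender actually lives on your machine.
BLENDER_DIR = r"C:\Program Files\Blender Foundation\Blender"

# Precompiled Cycles CUDA kernels are assumed to sit under
# <version>/scripts/addons/cycles/lib as kernel_sm_XX.cubin files.
pattern = os.path.join(BLENDER_DIR, "*", "scripts", "addons",
                       "cycles", "lib", "kernel_sm_*.cubin")
for cubin in sorted(glob.glob(pattern)):
    print(cubin)
```

If a kernel_sm_35 cubin is already in there, the problem is more likely the driver or a control panel setting than a missing kernel.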

Just for your information, those cards are getting old and might not be supported much further into the future. If I were you, I would spend the money on GeForce or RTX cards instead of Quadros. For the price of one Quadro card you can get about three of the gaming cards, and they will be faster as well. Unless you really need the extra features that Quadros have?

I am sorry you overpaid for so-called workstation cards.

They are good for high-res mesh display, but not for much more than what we use gaming cards for.

These cards came with my “work” workstation about five or six years ago, and at the time I was glad to have them. Mostly AE work and Lightwave.

Wouldn’t SLI combine the two cards’ memory so Blender sees both cards as one big card? (EDIT - I just read that it does not :frowning:)

We are talking about upgrading cards soon. What is the best card to get now? If I got dual or triple cards, does Blender take advantage of them as is or do they need to be linked? Thanks.

BTW, I am loving what I am seeing in 2.80. I am really excited for Blender. I am finally going to move from Lightwave to Blender. I was going to learn more C4D because of AE but not worth it now. Does, or will, Blender have a plugin for AE?

Quadro drivers come with a control panel with various performance settings. Make sure that the cards are set to allow compute use. I believe that there’s a setting for graphics-only - that’s the one you should avoid.


Thanks, Stefan. I did find this setting, and it was set to “Graphics and compute needs”.

A great find in there as well was the nifty GPU Utilization graph! That will come in handy as I troubleshoot. Thanks again.

I’m going to compile my own build and I’ve downloaded Ton’s benchmark files. I’ll let you guys know how it turns out. Thanks again everyone.

This is a frequent question; please have a look here:

You don’t have to take further action. Within Blender you can simply mark which cards it should use to render.
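Not official wording from the manual, just a rough sketch of doing the same thing from the Python console (Blender 2.80 preferences API; the assumption that every CUDA entry should be ticked is mine):

```python
import bpy

# Point Cycles at CUDA and refresh its device list (Blender 2.80 preferences API).
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'
prefs.get_devices()

# Tick every CUDA device that was found, then tell the scene to render on the GPU.
for dev in prefs.devices:
    dev.use = (dev.type == 'CUDA')

bpy.context.scene.cycles.device = 'GPU'
print([d.name for d in prefs.devices if d.use])
```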

Just so you know 1: you can even mix different models; I just recommend making them all either NVIDIA or all AMD.
If you’d like to hear my opinion, I’d get as much as you can from the current NVIDIA cards.

JSYK 2: All of the above is true for Cycles. When using Eevee, it will render on the card that is driving the display, although you can, e.g. for animation, start a headless render job for each card and let them work on different frame ranges.
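Not something from this thread’s files, just a minimal sketch of that frame-range idea for Cycles (the blend file name, frame ranges, and device indices are made up; for Eevee, which GPU a headless instance ends up on is decided by the OS/driver rather than by this preference):

```python
# Minimal sketch: split an animation over two headless Blender jobs, one per GPU.
import subprocess

BLEND_FILE = "myscene.blend"  # hypothetical file name
jobs = [
    {"device": 0, "start": 1,   "end": 125},
    {"device": 1, "start": 126, "end": 250},
]

for job in jobs:
    # --python-expr enables exactly one CUDA device inside that Blender instance
    # before rendering (Blender 2.80 preferences API).
    expr = (
        "import bpy; "
        "p = bpy.context.preferences.addons['cycles'].preferences; "
        "p.compute_device_type = 'CUDA'; p.get_devices(); "
        "cuda = [d for d in p.devices if d.type == 'CUDA']; "
        "[setattr(d, 'use', i == %d) for i, d in enumerate(cuda)]; "
        "bpy.context.scene.cycles.device = 'GPU'" % job["device"]
    )
    subprocess.Popen([
        "blender", "-b", BLEND_FILE,
        "--python-expr", expr,
        "-s", str(job["start"]), "-e", str(job["end"]), "-a",
    ])
```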

Hope this helps