Multiple, mismatched GPUs?

I could use some quick advice before I spend a chunk of money on a video card.

I’m a bit confused about how Cycles copes with multiple video cards in one system that aren’t the same model. Specifically, if one is an RTX card and one isn’t. I’m looking to buy a new video card, and I don’t want to throw money at it in a way that won’t work as expected.

I have a GTX 1080. I get that if I throw, say, a 1060 in my machine along with the 1080, both will work in parallel and I’ll get faster render speed (but not double the speed, because the 1060 is slower). I’m also limited by the smaller of the two VRAM amounts. I actually did this as a test a while ago.

I know that an RTX card is required to use features like Optix denoising. But here is the key question… if I get an RTX 20xx card, and put it into my system along with the GTX 1080 I already have, will Blender be smart enough to do what it can on both cards, and then just use the RTX card for Optix?

Or will having one of my two cards not be an RTX card just tank the whole process and make me unable to use features like Optix at all?

I haven’t found a satisfactory answer to this, because all of the discussion of multi-GPU systems I can find involves cards from the same generation.

To use both cards on the same render, they’ll have to use the same compute backend, i.e. CUDA. You’ll also be limited to the VRAM of the card with less memory.
If you want Optix, you’ll only be able to use the 20xx and incompatible cards will be disabled. I think some of the unofficial builds enable Optix for 10xx cards, but I wouldn’t rely on that.
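For reference, here’s a minimal sketch of how backend and device selection looks through Blender’s Python API (property names as exposed by the Cycles add-on in the 2.8x series; treat the exact behaviour with mixed cards as something to verify in your own build):

```python
import bpy

# The Cycles add-on preferences hold the compute backend and the device list
cprefs = bpy.context.preferences.addons['cycles'].preferences

# One backend is chosen for the whole render: with 'CUDA' both the GTX and
# the RTX card show up; with 'OPTIX' only RTX-capable cards are listed in 2.8x
cprefs.compute_device_type = 'CUDA'

# Refresh the device list and enable everything the chosen backend exposes
cprefs.get_devices()
for dev in cprefs.devices:
    dev.use = True
    print(dev.name, dev.type, dev.use)

# Render on the GPUs enabled above
bpy.context.scene.cycles.device = 'GPU'
```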

Ah, I see. So there isn’t a way to, say, do the normal CUDA rendering on both cards, but pass all of the Optix Denoising onto the 20xx? Like how Blender can render on GPU and CPU simultaneously, but only uses CUDA on the GPU?

I can’t speak with certainty about how it functions with mixed cards, but the Optix denoising options are visible in the latest 2.83 build even when CUDA is selected as the compute type.
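For what it’s worth, in the 2.8x series the denoiser is a per-view-layer setting, separate from the compute backend. A minimal sketch, assuming the 2.81–2.83 property names (the setting moved to a denoiser enum in later releases, so double-check in your build):

```python
import bpy

# Denoising is configured per view layer in 2.81-2.83
vl = bpy.context.view_layer
vl.cycles.use_denoising = True        # enable denoising for this layer
vl.cycles.use_optix_denoising = True  # use the OptiX AI denoiser (needs a compatible card)
```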

Huh. Well, looks like I may have to just get the thing and see what happens. At the very least, I’ll still have shorter render times with good old CUDA, with two decent cards working together.

Thanks for the reply. :)

I’m thinking of doing exactly this. How did it work out?

I never did end up doing it. It seemed as if it was going to be a big pain in the rear and was unlikely to actually work.

Fair enough. A friend of mine is giving me a GPU identical to the one I’m using, so I might not be mixing different GPUs now either.