Two-GPU setup: one GPU renders, the other GPU lets me do background tasks?

I’m building a new PC. My main GPU will be a GTX 1070, but I might also drop in my old GTX 650 to keep it company.

If I set Blender to use only the 1070 and leave the 650 alone, can I plug my monitors into the 650 and, while Blender renders, do other stuff in the background unhampered? (Photoshop, watching YouTube, and whatever else that needs some GPU power.)

Could I accomplish the same thing by plugging my monitors into the 1070 instead, but still having Blender ignore the 650? Or would the 650 be completely useless in that scenario?

I could easily test these setups myself, but I’m still waiting on my parts to arrive. :smiley:

Yep. Exactly what I do.

Does it also work if you plug your monitors into the primary (rendering) GPU, or do you have to plug them into the secondary (non-rendering) GPU?

AFAIK the non-rendering GPU has to drive the monitor, otherwise you would still have the issue of a maxed-out GPU also having to draw the UI. And yes, this means you have to switch which card the monitor is connected to between rendering and e.g. gaming, which makes this a bit tedious.

OK, good to know. And ah yeah, good point about the gaming.

Maybe I could use a KVM switch (keyboard, video, mouse). It’s a box that you hook up your monitors and video cards to, and you hit a button to switch between various inputs. In this case, I’d switch back and forth between one video card and the other, whichever I wanted my monitors to use at any given time. Back in the day, I had a Mac Mini sitting on top of a PC tower, and I used the same keyboard, monitor, and mouse for both machines, because everything was hooked up to a KVM switch. Worked great.

EDIT: wait, you can’t tell your games which GPU to use?

No, you can't tell games which card to use.

Oh wait. I could probably plug each monitor into both cards at the same time, using multiple cables/inputs (DVI, HDMI, DisplayPort), and then effectively switch between the two cards using the input-select buttons on each monitor.

Lol KVM.

It’d be great if Blender let you set a limit on how much of the GPU Cycles could use, so that you don’t have to play GPU musical chairs. E.g., give it 90%, and leave 10% for anything else. Apparently other GPU renderers let you do this.

All right, finally got around to testing this, and for some reason, this setup doesn’t seem to be working for me.

I’m on Windows 10. I’ve got monitor 1 hooked up to my new GPU, a GTX 1070. I’ve got monitor 2 hooked up to my old GPU, a GTX 650. Blender is on monitor 1. Its prefs are set to use only the GTX 1070 for CUDA. The GTX 650 is unchecked. I’ve verified and re-verified this before / during renders. If I open up Task Manager during the render, it says the GTX 1070 is at about 100% usage, the GTX 650 is at about 1%, and the CPU is at about 30%.
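(For anyone who wants to double-check or script this, the same device selection can be done from Blender's Python console. This is just a rough sketch, assuming a recent Cycles preferences API; the device-name check is only illustrative.)

```python
import bpy

# Sketch: enable only the GTX 1070 for Cycles CUDA rendering.
# Assumes a Blender 2.8x-style preferences API; the name string is illustrative.
cprefs = bpy.context.preferences.addons['cycles'].preferences
cprefs.compute_device_type = 'CUDA'
cprefs.get_devices()  # refresh the detected device list

for dev in cprefs.devices:
    dev.use = ('GTX 1070' in dev.name)  # leave the GTX 650 (and CPU) unchecked

bpy.context.scene.cycles.device = 'GPU'
```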

And yet, during rendering, when I’ve got other apps open on monitor 2 (browser windows, etc), those apps are basically unusable: very choppy, slow to respond, etc.

Any idea what’s wrong?

Here’s something weird: if I stop the render, and I try to use apps on monitor 2, and I look at Task Manager, the GTX 650’s usage spikes up to 25% or so, coinciding with my use of those apps. But if Blender’s rendering, the GTX 650 hits only like 3% while trying to use those same apps on monitor 2. Is the Nvidia driver basically committed 99% to the GTX 1070 while rendering, making the GTX 650 temporarily useless?
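(If Task Manager's numbers seem suspect, nvidia-smi, which ships with the NVIDIA driver, can report per-GPU utilization directly. Rough sketch below; the output lines are just illustrative, and on Windows you may need the full path to nvidia-smi.exe.)

```python
import subprocess

# Sketch: poll per-GPU utilization with nvidia-smi (installed with the NVIDIA driver).
# Older GeForce cards may report "[N/A]" for some fields.
out = subprocess.check_output(
    ["nvidia-smi",
     "--query-gpu=index,name,utilization.gpu",
     "--format=csv,noheader"],
    universal_newlines=True,
)
print(out)
# Illustrative output:
# 0, GeForce GTX 1070, 99 %
# 1, GeForce GTX 650, 3 %
```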

I suspect this is due to your dual monitor setup.
I assume you use those as an extended desktop? Wouldn’t that mean Windows has to composite the output of both graphics cards into one single workspace? And if so, wouldn’t lag from the GTX 1070 also affect the combined image?

Try switching off the monitor the GTX 1070 is connected to. Does the display on the GTX 650 speed up? AFAIK you would normally not want the rendering card to drive any monitor at all: one card does the rendering in the background, the other card does all the display work.

facepalm

I forgot that you’d already mentioned earlier that the non-rendering GPU should be the one to control the monitors, not the rendering GPU. My bad :smiley:

I tried turning off monitor 1, but Windows didn’t seem to care: monitor 2 still performed like crap.

After plugging both monitors into the GTX 650, the setup works as expected: Blender uses the GTX 1070 to render, and the Windows UI is buttery-smooth on both monitors, while rendering.

Bonus: some games seem to use the 1070 even though monitor 1 (my big main monitor) isn’t plugged into it. I’ll look into whether I can use Nvidia’s control panel to force the other games to use it too. EDIT: nope, you guys were right again. The settings are available, but they don’t seem to actually work.

Anyway, thanks again.