EEVEE SLI/Crossfire possibilities

Hi,

I was wondering whether it’s possible to use SLI/Crossfire setups with EEVEE. Games seem to require customized drivers from Nvidia’s/AMD’s end to make things work. Does (or will) something like this exist for Blender? If so, how does it scale performance-wise?

I know it’s early to ask this while 2.8 is still in development, but the GPU market is quite frustrating right now, and picking up two older cards seems more reasonable than buying a new model.

As far as I know, EEVEE only works on the graphics card the monitor is connected to. I have 3x GTX 1080 Ti myself, not in SLI but on risers.

I found it in the Release Notes:

  • No multiple GPU rendering. Eevee will use the graphic card used by the rest of Blender’s UI.
  • Headless systems (without a display) are not supported currently. Background rendering when there is a display is supported.

Hm, that’s somewhat disappointing.

Reading the Wikipedia entry on SLI, and according to LordRaven’s answer in this thread, it might be possible though. As far as I understand it, Blender would just need to preconfigure the SLI settings it wants to use, and the actual “splitting” of work would be nothing more than the driver dividing the image into halves/thirds or whatever.

I owned an SLI setup several years ago and I think it worked fine with Cycles (in SLI mode). It just obviously didn’t make sense to use it that way, since the driver always adds overhead.

Even if it doesn’t work, the required developer effort should be relatively small - from what I understand, which isn’t a lot :'D

Could someone who owns an SLI bridge just test this and give us some clarity?

Considering how fast EEVEE already is on a single card, it would be amazing to cut render times even further without spending a fortune.

If you are on Windows with Nvidia cards, you can render your animations on two cards. You have to start two instances of Blender via right-click on the exe; there you can choose which OpenGL device each instance should use. Then you simply give each instance a different frame range and let them work simultaneously.
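If you prefer to script the frame-range split instead of setting it up by hand each time, here is a minimal sketch. Assumptions on my part (not from this thread): the file is called scene.blend, the animation spans frames 1-250, and blender is on the PATH. Which GPU each instance actually uses still has to be assigned through the Nvidia driver (the right-click option mentioned above) - the script only divides the work.

```python
# Minimal sketch: split one animation across two Blender instances by frame range.
# Assumptions: "scene.blend" exists, frames 1-250, blender is on the PATH.
# GPU assignment per instance is NOT handled here; set it in the Nvidia driver.
import subprocess

BLEND_FILE = "scene.blend"   # hypothetical file name
FRAME_START, FRAME_END = 1, 250
midpoint = (FRAME_START + FRAME_END) // 2

def render_range(start, end):
    # -b: background, -s/-e: frame range, -a: render the animation.
    # Note: -s and -e must come before -a, since Blender parses arguments in order.
    return subprocess.Popen([
        "blender", "-b", BLEND_FILE,
        "-s", str(start), "-e", str(end),
        "-a",
    ])

# Launch both halves in parallel (one per instance/GPU), then wait for them.
jobs = [render_range(FRAME_START, midpoint),
        render_range(midpoint + 1, FRAME_END)]
for job in jobs:
    job.wait()
```

Since both instances write into the same output path from the scene settings, the frames just interleave by number; if you want to keep them apart, you could pass a different -o per instance.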

Sounds great. I just saw this was discussed in a different thread already. One more question though: Running two instances would double the amount of system RAM required, right?

Never really thought about it, but yes, it should - each instance loads its own copy of the scene into system RAM.

Following your prompt, I tried this out. I tested models with and without the SLI bridge between two 1080 Tis.

Some more detailed info for anyone who stumbles across this topic while doing their research: I had some strange behavior after installing my second card. Some of my Blender models had trouble loading, and I repeatedly got the same error:
"a graphics card and driver with support for opengl 3.3 or higher is required"
This was despite having updated my drivers just the day before, right after installing the card.
At first I thought it was surely an SLI issue, so I removed the bridge, but the error persisted.
When I disabled one of the cards in Device Manager, the models would load fine. Strangely, it worked no matter which card I disabled, so my workaround was to disable one card, load the model, save it, and then re-enable the card. Now everything works with either or both cards, with or without SLI. This was reproducible for multiple models. I still have no idea why - very strange.
Hope this helps somebody dealing with a similar issue.

Hey, original poster here. I changed my Google account.

Huge thanks for the effort, Tyler! But you have to tell me - how does it impact performance? EEVEE seems hard to benchmark, so I threw together a small test scene with a lot of hair. Would you be so kind as to try it out?

Total render time per frame should be roughly 2:00-2:20 minutes on a single 1080 Ti and (hopefully) around 1:20 when running SLI. Can you confirm this?

https://mega.nz/#!DEMglajb!xSMEVqlAZ-9oC6NxRyYAjr-Rk_cwRmtTsB8gMOzLDs8