Two GPUs with no SLI

Hi, I have a GTX 950 in my computer and I am getting a 1070 soon. I heard you can have two cards running without SLI and they will just render two tiles at a time. I was curious because I thought I saw somewhere that it will only use the lowest VRAM of the two cards, which would be pointless. Hopefully someone here can give me a better understanding of it. Thanks.

Using SLI will give poorer performance than not using SLI.
The card with the least video memory determines how much video memory is available across both cards.

Yes, but that is in SLI mode. I was wondering what happens when they are separate.

Separate what?
I assume you cannot use the 950 and 1070 in SLI anyway.
The amount of available memory has nothing to do with using or not using SLI.

The amount of available memory is the amount on the card with the least.

Repeated from the link
1 card with 8GB + 1 card with 1GB = 1GB memory available
1 card with 2GB + 1 card with 2GB = 2GB memory available
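The rule above amounts to taking the minimum over the cards in the bridge; a trivial sketch (the GB figures are just the examples from this thread):

```python
def sli_available_vram(card_vram_gb):
    """In SLI each card mirrors the same frame buffer, so the usable
    VRAM is capped by the smallest card in the bridge."""
    return min(card_vram_gb)

print(sli_available_vram([8, 1]))  # 8GB + 1GB pair -> 1
print(sli_available_vram([2, 2]))  # 2GB + 2GB pair -> 2
```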

As above… basically, the scene to be rendered will be built, prepared, and distributed to fit the card with the least amount of RAM available.

SLI mainly matters for gaming performance.
Some users have also seen better viewport performance in their DCC apps (e.g. 3ds Max 2017).
For pure computing/simulation work, SLI is less efficient.

When I configure two graphics cards in SLI mode, do the graphics cards work together to create double the memory size?

No. In SLI mode, each graphics card uses its own frame buffer memory to render a 3D application. The operating system will report a graphics card frame buffer memory size that is found on a single graphics board.

More on this in the SLI FAQ.

Here is an example: https://www.youtube.com/watch?v=dreR2z8Kgyk (skip to 2:25, where he switches to two separate cards not in SLI). That's what I was trying to ask: if I did it like that with an 8GB GTX 1070 and a 2GB GTX 950, would they each use all their own VRAM? I know it's not combined, but the 1070 would render one tile and the 950 another. Hope that's not confusing, the way I worded it.

When the two cards have significantly different capabilities, usually one is used for rendering and the other for the display. While they can be used together, once you factor in the hassle of fitting everything into the smaller card's VRAM, driver quirks, etc., there's usually more benefit to using the less capable card for the display so you can keep working while the other renders.
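For what it's worth, using both cards for Cycles without SLI is just a matter of ticking both devices in Blender's user preferences. A minimal Python sketch, assuming the 2.7x-era bpy API (the exact attribute path varies between Blender versions, so treat this as illustrative):

```python
def enable_all_cuda_devices(cycles_prefs):
    """Switch Cycles to CUDA and enable every CUDA device found,
    so e.g. a GTX 1070 and a GTX 950 each get their own tiles."""
    cycles_prefs.compute_device_type = 'CUDA'
    enabled = []
    for dev in cycles_prefs.devices:
        if dev.type == 'CUDA':
            dev.use = True
            enabled.append(dev.name)
    return enabled

# Inside Blender this would be called roughly like:
#   prefs = bpy.context.user_preferences.addons['cycles'].preferences
#   enable_all_cuda_devices(prefs)
```

Note that even without SLI, each card still needs the whole scene in its own VRAM, so the 2GB card remains the limit on scene size.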

Hi kless, they do work together if the scene fits into the 950's memory, but the 1070 is roughly 7 times faster than the 950 (a guess).
The 950 may drag out the render time, depending on the tile settings, if the 950 gets the last tile.
What dgoresman mentions makes a lot of sense when the cards are this different.
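mib's point about the slow card holding up the last tile can be seen with a toy scheduler simulation (the 7x speed ratio and tile sizes here are made-up illustration numbers, not benchmarks):

```python
import heapq

def render_time(num_tiles, tile_cost, speeds):
    """Greedy tile scheduler: each card grabs the next tile as soon as
    it is free. Returns total wall-clock time; 'speeds' are relative
    (e.g. 7.0 for the fast card, 1.0 for the slow one)."""
    free_at = [(0.0, i) for i in range(len(speeds))]  # (time free, card)
    heapq.heapify(free_at)
    finish = 0.0
    for _ in range(num_tiles):
        t, card = heapq.heappop(free_at)
        done = t + tile_cost / speeds[card]
        finish = max(finish, done)
        heapq.heappush(free_at, (done, card))
    return finish

# Few big tiles: the slow card grabs one early and everything waits for it,
# so the pair can finish later than the fast card would on its own.
print(render_time(4, 1.0, [7.0, 1.0]), render_time(4, 1.0, [7.0]))
# Many small tiles: the imbalance shrinks and the pair wins.
print(render_time(32, 0.125, [7.0, 1.0]), render_time(32, 0.125, [7.0]))
```

Smaller tiles let the scheduler keep both cards busy right up to the end, which is why tile size matters so much for mismatched GPUs.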

Cheers, mib