I bought an Nvidia GTX 1070 a few days ago, and I also have a GTX 960, so both cards are now in my PC. I've noticed that Blender uses the GTX 960 for rendering, and in my current project that produces a CUDA out-of-memory error. When I disable the GTX 960 in the preferences, Blender uses the GTX 1070 and renders my project fine. Does anyone know how I can set up Blender to use the GTX 1070 for rendering by default and the GTX 960 for the viewport?
a.) The GPU that has monitors plugged in will drive your display(s).
b.) The GPU set under User Preferences > System > Cycles Compute Device will be used for computing (rendering). More: GPU Rendering — Blender Manual
You can't split the computing job (interactive + final render) between cards; it's all or nothing.
PPS (computing jobs only)
Well, you can work around this by having two Blender instances of the same scene open with the GPUs set differently. Or, more simply, switch the compute device when it's time to do the final render.
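If you switch devices often, you can script the preference change instead of clicking through the UI. Below is a hedged sketch using Blender's Python API, run from Blender's Python console or a startup script; the API paths are for Blender 2.80+ (in 2.7x, `bpy.context.user_preferences` replaces `bpy.context.preferences`), and matching on `"1070"` in the device name is just an illustrative assumption for this particular pair of cards.

```python
# Sketch: enable only the GTX 1070 as the Cycles CUDA compute device.
# Must be run inside Blender (the bpy module is not available standalone).
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"
prefs.get_devices()  # refresh the detected device list

for device in prefs.devices:
    # Enable rendering only on the 1070; the 960 stays free to drive the viewport.
    device.use = "1070" in device.name

# Print the resulting configuration for verification.
for device in prefs.devices:
    print(device.name, device.use)
```

You could save this as a small script and run it with `blender --python set_gpu.py` (script name assumed) so the right card is selected every time you start a render session.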