Built myself a GPU render-farm out of mostly used mining gear


(joseph raccoon) #21

I don’t see why not. You’ll have some bottleneck on the CPU and a fairly low RAM cap, but for a GPU-based workflow that should do just fine.

And Linus Tech Tips did an episode not so long ago on modding video card drivers to work with mining cards.

(Lalaland) #22

Yes, are you referring to the 8 GB max RAM on that mobo?
Not sure if that’s an issue when it’s strictly for rendering.


(joseph raccoon) #23

I am, and honestly this would do quite well for many mid-sized projects. I was using an i3 dual-core with four 960s for a while and it held up fine.

But if you are going to scale up that much for GPU rendering, you may want to look into liquid cooling the cards. Being able to dump all that heat outside helps quite a bit in any month that isn’t the middle of winter. You can find water pumps on Amazon for about 5 dollars each and set one up per card (though I would play with one of those for a weekend first to get a feel for them).


(Lalaland) #24

Yes, I saw the Linus episode; that might work.

Initially I thought the OP was using mining cards, but now I see they are regular cards.
Are those drivers definitely needed? Do mining cards not work as-is?


(Casey) #25

Can’t really expand on the answers already given, but I do get some system instability with all those GPUs installed. Not sure if it’s the Nvidia drivers conflicting with the AMD drivers, EMI issues, or what, but right now I leave everything unplugged unless I’m doing a render, otherwise I get BSODs, and Unreal Engine will not run with it hooked up without crashing non-stop.

I’m likely going to abandon this idea and do something a tad more traditional, like an HEDT Threadripper (later this year when the 3xxx series comes out) with four higher-end GPUs.


(joseph raccoon) #26

I would look to rule out power supply problems first, actually. It can be a logistical nightmare to provide enough power to each GPU; I ended up just buying DC-DC buck converters, but they do make dedicated cases just for this.

But truthfully… you do need a real case for something like that. It’s not just the EMI you need to worry about: any cable getting knocked around can inject deal-breaking noise into the motherboard. The PCIe bus connects to a controller and all sorts of other black-magic electronics on the motherboard, and some of them respond VERY poorly to that kind of noise.

My current render system is 7 cards in an older server; each card sits on a PCIe x4-to-x16 riser and is water-cooled... because otherwise I could not fit them all in there.

(Mario Estrella) #27

When I was working on my project, I had some problems combining the AMD and NVIDIA drivers on the same system. Eventually I just got another motherboard and put the NVIDIA cards in one machine and the AMD cards in another.

Then again, I was doing this on Linux, so I’m not sure how Windows fares with both sets of drivers. I split them mainly to be on the safe side, since I was also using these systems for automated testing.


(Lalaland) #28

I am now using a mining motherboard with 4 cards on risers, so all on 1x lanes.
It works fine. The loading times are noticeable, though. I do a lot of small animations (knobs for VSTs etc.), and my single 1080 Ti is faster than the 4-card rig (a 980 Ti, a 1060, and 2x 980). For large images the 4-card rig is much faster, but it’s not as fast as I was expecting, so I might need to do some benchmarking with 1 card, then 2, and so on (see the sketch below). But… it works fine and no crashing so far.
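For that benchmarking, here is a minimal sketch of what I have in mind, assuming this is Cycles on Blender 2.8x with NVIDIA cards; the script name and the way it takes the GPU count are just illustrative:

```python
# bench_gpus.py -- hypothetical helper, run as:
#   blender -b scene.blend -P bench_gpus.py -- N
# Renders the current scene with only the first N CUDA devices enabled,
# so timing N = 1, 2, 3, 4 shows how well the render actually scales.
import sys
import bpy

argv = sys.argv
n_gpus = int(argv[argv.index("--") + 1]) if "--" in argv else 1

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"   # assuming NVIDIA cards / CUDA backend
prefs.get_devices()                  # refresh the device list

enabled = 0
for dev in prefs.devices:
    if dev.type == "CUDA" and enabled < n_gpus:
        dev.use = True               # keep this GPU
        enabled += 1
    else:
        dev.use = False              # disable the CPU and the remaining GPUs

bpy.context.scene.cycles.device = "GPU"
bpy.ops.render.render(write_still=True)
```

If the small frames barely speed up as cards are added, that would point to per-frame overhead (scene sync, BVH build) dominating the render time, which would also explain the single 1080 Ti winning on the knob animations.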