Building a render server

Hello all of you,

I'm getting more and more CG jobs and I'm looking into building a separate Cycles render machine, but one that can also handle some scenes in Redshift or Octane.

I'm really unsure what to do. I have several options. I already have a good workstation: a 5900X with a 3090 and 128 GB of RAM. The issue is that rendering stills sometimes takes a lot of time and the turnaround is fast. Also, I can't render and work on scenes at the same time.

I can't and don't want to pay for all of this new, but these are my options, and I just don't know what the best idea is.

1. Put another 3090 in my system. Everything gets faster, but there's still the issue of not being able to work while rendering.

2. Buy a decent second-hand Threadripper system with 128 PCIe lanes and a board with a minimum of four x16 PCIe slots, and then start putting more cards in there.

3. Same as option two, but with a board with up to seven PCIe slots, and keep adding cards over time.

The issues and doubts I have are the following:

  • How do you manage power if you have three or more 3090s, for example?
  • Does Cycles X even work with, say, four GPUs?
  • How do I send renders to this render node? Ideally it's not in my home but in my studio (way cheaper power).
  • Can you mix different NVIDIA GPUs, for example a 3090, a 3080 and a 2080 Ti, or any other card? Does it all work together or not? And is your render's memory usage bound by the GPU with the least memory?
  • Sometimes I see boards with like seven PCIe slots, which seems great, but then they're only two slots apart, so thick cards will never fit. Can you use mining-style riser extenders to get around that and still connect that many cards?
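On the power question, a rough budgeting sketch can help. This is a hedged back-of-envelope calculation, not a definitive spec: the ~350 W board power per 3090, ~280 W for the CPU plus the rest of the system, and the 30% transient headroom are all assumptions you should check against your actual hardware.

```python
# Rough PSU sizing for a multi-GPU render node.
# Assumed figures: ~350 W per 3090, ~280 W for CPU + rest of system,
# and ~30% headroom for transient power spikes. All are assumptions.
GPU_WATTS = 350
SYSTEM_WATTS = 280
HEADROOM = 1.3

def recommended_psu_watts(num_gpus):
    """Steady-state draw plus headroom, rounded up to the next 50 W step."""
    draw = num_gpus * GPU_WATTS + SYSTEM_WATTS
    target = draw * HEADROOM
    return int(-(-target // 50) * 50)  # ceil to a 50 W step

for n in (1, 2, 3, 4):
    print(n, "x 3090 ->", recommended_psu_watts(n), "W PSU")
```

By these assumed numbers, three 3090s already lands above what a single consumer PSU comfortably delivers, which is why multi-GPU rigs often use server PSUs or dual supplies.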

All of these things and more almost make me not want to do it. Letting a render farm do the work is not an option for me, though. I see those prices, and within a couple of jobs you'd have an extra GPU saved up. Also, I like to make multiple render versions, and I don't want to pay every time.

So I can buy a decent Threadripper for around 2000 euro or less, depending on the model. Cheaper ones have PCIe 3.0 rather than 4.0, but I don't know how much of an issue that is.

I looked for an answer on the forum, but the topic I found was 10 years old.


Running your own render farm will cost as much as or more in power than doing it through a service. You might spend less if you live in certain parts of the US, but if you live anywhere in Europe, your power bill will rapidly surpass the pennies you'd spend per render with an affordable render farm. For example, Barista will charge you all of four cents for a single render of the classroom scene.
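To put rough numbers on that comparison, here is a minimal sketch. Every figure in it (the system's power draw, the render time, the electricity tariff, and the farm fee) is a made-up placeholder for illustration; plug in your own values.

```python
# Back-of-envelope: local electricity cost per render vs a farm's fee.
# All figures below are assumptions for illustration only.
system_watts = 600   # workstation drawing ~600 W while rendering (assumed)
render_hours = 0.5   # half an hour for one still (assumed)
eur_per_kwh = 0.40   # an assumed European electricity tariff
farm_fee_eur = 0.04  # e.g. the four-cent classroom render mentioned above

kwh_used = system_watts / 1000 * render_hours
local_cost = kwh_used * eur_per_kwh
print(f"Local power cost: {local_cost:.2f} EUR vs farm fee: {farm_fee_eur:.2f} EUR")
```

With these assumed numbers, the electricity alone for one local render costs several times the farm's fee; a cheaper tariff or solar power shifts the balance, which is exactly the point worth checking.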

I’m not saying you shouldn’t build a render farm, but if your concern is cost, you’d be surprised: building your own is not necessarily the cheapest option.


Hey, thanks for the extra option :stuck_out_tongue: I charge clients extra for rendering, and I know you're absolutely right; I live in Europe and the prices are awful. But there are two things that help my case:

I have a studio where the power is cheaper.
Also my house itself is getting solar power.

But I do agree with you, it's something I need to look into. I can still get an extra 3090; that's the cheapest option for now. And send the longer render jobs to a render farm.

I just really like to keep certain things in my own hands, and if I want to render an animation five times to test things out, I'm more flexible. What I hate most right now is that I can't work while rendering.

So if anyone still knows the technical answers to my questions, that would be nice.

A couple of thoughts. Based on my recent testing, if you are doing animation, then don’t mix your render devices. The output from a CPU render vs a GPU CUDA render vs a GPU OptiX render will all be a bit different. In fact, a GPU OptiX render on a non-RTX card can be a little bit different from the same scene rendered via OptiX on an RTX card.

Having said that, if it’s just still images, then it likely doesn’t matter; on general viewing, no one will know the difference.

The next thing to consider is whether what you render will fit (and always fit) within the VRAM of a GPU. If so, then the CPU/system can be much simpler (lower core count, less system RAM, etc).

I haven’t tested this, so someone else may correct me if I’m wrong, but I don’t think you’ll need a Threadripper with all those PCIe lanes. Once the data is loaded into the GPU, pretty much all the work happens there and very little data is passed over the PCIe bus. It’s not like a game, with a constant stream of data at 100+ frames per second.

I tend to think of it a bit more like those crypto mining setups, where they need very few PCIe lanes and very little bandwidth: it’s a single load of data, then they process away till done, then load some more data. But like I said, I may be wrong; not something I’ve tested.

You could just put another 3090 into your current system (assuming you have a 1200+W PSU). That will likely halve the PCIe bandwidth, so PCIe x8 for each GPU, but if your system supports PCIe Gen 4, then it would make no difference in speed.

What you can then do, assuming that what you want to work on isn’t the same scene/file you are rendering, is start up a second instance of Blender, then in the preferences select only the second GPU for rendering (the one that doesn’t have a monitor connected to it) and set the render going.

Then in the first instance of Blender, you have only the first GPU selected in the preferences, and you load up whatever else it is you want to work on. The render will still take the same time (only one GPU is being used), but it’s largely self-contained and should have very little impact on other work, much the same as you are doing now.
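The per-instance GPU selection described above can also be scripted instead of clicked through the preferences. This is a hedged sketch using Blender's Python API (it must run inside Blender, e.g. from its Python console or via `--python-expr`; the `"OPTIX"` backend and the device index 1 are assumptions for a two-GPU box):

```python
# Hedged sketch: in the render-only Blender instance, enable exactly one
# GPU in the Cycles preferences so the other card stays free for work.
# Must run inside Blender; the import guard only lets this file execute
# outside Blender without erroring.
try:
    import bpy
except ImportError:
    bpy = None  # not running inside Blender

def enable_second_gpu_only(backend="OPTIX", gpu_index=1):
    """Enable only the GPU at gpu_index for Cycles; disable CPU devices."""
    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.compute_device_type = backend
    prefs.get_devices()  # refresh the device list
    gpus = [d for d in prefs.devices if d.type == backend]
    for i, dev in enumerate(gpus):
        dev.use = (i == gpu_index)  # the card without a monitor attached
    for dev in prefs.devices:
        if dev.type == "CPU":
            dev.use = False  # keep the CPU free so the desktop stays responsive

if bpy is not None:
    enable_second_gpu_only()
```

The mirror image (index 0, for the working instance) would go in the other Blender's startup; which index maps to which physical card is something to verify on your own machine.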

Personally, if it was me, and especially if you also have to buy a new PSU, I’d almost be more inclined to just build another (smaller) PC. Nothing too big or fancy: a case that fits the 3090, an 850W PSU, a general ATX or ITX motherboard, a general 6C/12T CPU, 32GB RAM and enough storage for the OS, Blender and working files; a basic 512GB M.2 drive would likely do the trick.

That keeps the processes separate (nothing worse than having a 2-hour render almost done in the background, then crashing the system while working on something else and having to start all over again). You only need to turn the system on when you need it to render, and you also have a bit of a backup system: should something happen to your main PC, you can likely at least keep doing some work on the ‘render node’ PC while the main one gets fixed.


Thanks mate, this is super helpful. Okay, so don't focus too much on a Threadripper. I did that mostly to be able to expand later on. So start with one 3090 and just keep adding them over time.

I understand about the VRAM limit now, thanks. Also, I'll make sure I have all RTX cards.

I am curious if someone can tell me more about your x8 or x16 point. I do have PCIe 4.0, so that's nice.

And I agree: if I buy a 3090 plus a PSU, I can just as easily buy a second-hand system with all of it for a bit more.



Well, I can.

Each generation of PCIe is basically twice as fast as the previous one per lane.

So for example, my now-old Intel 6700K PC is PCIe Gen 3, while my current system is an AMD 5900X, which is PCIe Gen 4. Both of those have a primary PCIe slot for the GPU which uses 16 lanes (so x16). However, PCIe Gen 4 is twice as fast, so I could operate that PCIe slot on my Ryzen system at x8 and it would be exactly the same speed as the PCIe Gen 3 x16 slot on the 6700K PC.

Now where this really matters is that cards, and hence slots, are backwards compatible: you could take a GPU with a PCIe Gen 4 x16 interface and plug it into a PCIe Gen 3 x16 motherboard slot, and it will still work just fine, only with half the bandwidth available.

I know what you are now thinking: that’s not so good then, it will slow the GPU down. However, for the most part, it won’t. Even though PCIe Gen 3 x16 is half as fast, it’s still pretty darn fast. It can move a fair amount of data; in fact, it can transfer basically as much data as the 3090 can handle or process.

Some people tested this: they put a 3090 Ti into a PCIe Gen 4 x16 slot and then slowed it down to PCIe Gen 4 x8 (which, as I have just pointed out, is the same speed as a PCIe Gen 3 x16 slot). The difference in FPS in a number of games was like 1-3%, so borderline margin of error.

As such, and this happens with a lot of ‘normal’ desktop systems, if you plug GPUs into both the first and second PCIe slots of the motherboard, it will drop the bandwidth down to x8 for each. But of course, if the system is already PCIe Gen 4, then that’s basically two slots running at PCIe Gen 3 x16 speeds, which is more than enough for a pair of 3090s.
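The "each generation doubles" rule is easy to sanity-check with simple arithmetic. A minimal sketch, using the commonly quoted effective per-lane rates (~0.985 GB/s for Gen 3 and ~1.969 GB/s for Gen 4, after encoding overhead):

```python
# Approximate one-direction PCIe bandwidth per generation and lane count.
# Per-lane figures are the usual quoted effective rates (assumptions).
GBPS_PER_LANE = {3: 0.985, 4: 1.969}

def pcie_bandwidth(gen, lanes):
    """Approximate one-direction bandwidth in GB/s for a PCIe link."""
    return GBPS_PER_LANE[gen] * lanes

gen3_x16 = pcie_bandwidth(3, 16)
gen4_x8 = pcie_bandwidth(4, 8)
print(f"Gen 3 x16: {gen3_x16:.1f} GB/s, Gen 4 x8: {gen4_x8:.1f} GB/s")
```

The two values come out essentially identical (~15.8 GB/s each way), which is exactly why two Gen 4 GPUs running at x8 lose nothing compared to Gen 3 x16.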

I guess this then also brings up one final point to consider. If you go for a second PC and you think you might like a third 3090 at some point in the future, it could be worth doing a little bit of planning ahead and spending a bit more money up front (mostly on the PSU) to be able to add a second GPU to that second PC at a later date.

Outside of getting a larger PSU (like a 1200W) and a case with some room and good cooling, you just need to make sure it’s a motherboard with plenty of space between the two main PCIe slots, and that you don’t get 3090s that are super thick, so ideally under three slots. The CPU can likely be much the same, along with general RAM, etc.

That way you already have everything in place, should you want to add another GPU at a later date.

Thanks man. It's a lot to consider, but most of my doubts and nagging questions about PCIe x16 and x8 and 3.0 or 4.0 are now solved; it just doesn't matter that much.

And yes, I would love to be more future-proof and have a system with enough space to put multiple GPUs in, hence the Threadripper with, let's say, four x16 slots. You can buy those cheaply enough together with a good motherboard.

I'm not in a huge hurry, so I can wait a bit till the right system comes along. But now I do have the information to make an informed decision.

Also, I do feel triggered by the remark above about a render service.

Are you referring to my reply? I’m sorry that I’ve caused you emotional distress by suggesting you compare power costs against a render farm’s cost; I have no idea how that happened, but it definitely wasn’t my intention.

Lol no man, just that it's something to consider; it's a good point you made.


Apologies for dragging up an old conversation, but I enjoyed the thread and was just wondering where you landed with it all in the end, as I'm looking for a decent setup for working whilst rendering and I just don't love the render farm options.