Netrender Script to run 2 nodes on one with temperature control feedback

Posting this here, maybe it will be useful to someone out there in Blenderland.

I have a homebrew computer with both a good render card and a good CPU: a Hackintosh with an i7 4790K and a GTX 780 Ti. I thought to myself, “self, I could probably render two images at the same time”. Indeed, a short experiment proved that to be the case. I rendered two of the benchmark BMW scenes simultaneously (one on the CPU, one on the GPU) in about 2 min 59 s, only 10 seconds longer than rendering just one of them on the CPU. (A single CPU render takes 2:49, and a single GPU render takes 1:20. Optimizing tile sizes reduces those times to 2:40 for CPU and 1:01 for GPU.)

Then I thought, “self, I could use this with Netrender.” I made the script (attached), and indeed I can use both GPU and CPU simultaneously.

In actual use it is equivalent to 1.5 to 1.75 render nodes, because the feedback mechanism I made steps down the thread count on the CPU when needed, and also drops the GPU node back to CPU rendering when the GPU gets too hot for my comfort. I find that a CPU render takes roughly 3x as long as a GPU render. When the GPU node shifts to CPU, the GPU gets a break for the equivalent of 3-7 frames, depending on what I am rendering. During that time the CPU is doing twice as many jobs (usually just two), and they take twice as long as if I rendered them individually, so I lose some efficiency while the GPU cools off.

If I had better cooling, I could shave even more time off the jobs. My temp threshold is 60 C, but I could probably go up to 65 C without too much risk.
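The cool-down behavior described above can be sketched as a simple hysteresis loop. This is a hypothetical outline rather than the code from the attached DeviceManager.py: the `choose_device` function, the resume temperature, and the way the temperature is obtained are all assumptions, with only the 60 C threshold taken from the post.

```python
# Hypothetical sketch of the temperature-feedback idea: when the GPU
# passes the threshold, the GPU node falls back to CPU rendering until
# the card cools below a lower "resume" bound. The gap between the two
# bounds (hysteresis) prevents rapid flapping around a single value.

GPU_HOT_C = 60     # stop using the GPU above this temperature (from the post)
GPU_COOL_C = 50    # assumed resume temperature; tune for your hardware

def choose_device(current_device, gpu_temp_c,
                  hot=GPU_HOT_C, cool=GPU_COOL_C):
    """Return 'GPU' or 'CPU' for the next job based on GPU temperature."""
    if current_device == "GPU" and gpu_temp_c >= hot:
        return "CPU"          # too hot: give the card a break
    if current_device == "CPU" and gpu_temp_c <= cool:
        return "GPU"          # cooled off: switch back
    return current_device     # otherwise keep rendering as-is
```

The same step-down idea would apply to CPU thread counts; the gap between the two bounds is what would produce the 3-7 frame cool-down cycles visible in the usage graphs.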

I have written up additional assumptions, notes, etc. inline in the attached script, “DeviceManager.py”. Of course, it is written for OSX in my case, but it can easily be modified. To use it you will need a helper app and two copies of Blender: one in a folder called “BlenderCUDA” and another in a folder called “BlenderCPU”.
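As a rough illustration of the two-copies setup, here is a hedged sketch of how the two slave instances might be launched. The folder names follow the ones above, but the base path, the `-b` flag usage, and the `build_slave_command` helper are assumptions for illustration, not taken from DeviceManager.py.

```python
import os

# Hypothetical launch commands for the two Blender copies described
# above: one in "BlenderCUDA" (the GPU node) and one in "BlenderCPU"
# (the CPU node). The path layout is an OSX-style guess; adjust it for
# your own install.

def build_slave_command(folder, base="/Applications"):
    """Build the argv list to start one headless Blender instance."""
    blender = os.path.join(base, folder, "Blender.app",
                           "Contents", "MacOS", "blender")
    # -b runs Blender without a UI; each copy would have the netrender
    # addon enabled and configured (master address, slave mode) in its
    # own saved preferences, so the two nodes stay independent.
    return [blender, "-b"]

gpu_node = build_slave_command("BlenderCUDA")
cpu_node = build_slave_command("BlenderCPU")
# Each list could then be handed to subprocess.Popen to start a node.
```

Keeping two separate Blender folders is what lets each instance carry its own preferences (CUDA device vs. CPU, thread count) without the two nodes fighting over one config.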

See the images below. The first shows the log and the CPU and GPU usage graphs just after Netrender has started; the graph and log show how the GPU node instance switched to CPU rendering to let the GPU cool. The second shows a detail of the CPU and GPU usage graphs with the render well underway, showing cycles of GPU usage and cool-down. You can also see that the actual CPU temps are up near 70 C even with my 60 C threshold. This second shot is after 25 minutes of use.



Attachments

DeviceManager.py.zip (4.37 KB)

Hi Jim H,

I’m analyzing your explanation and translating it into French so I can understand it.
This kind of job interests me a lot…
Thanks for sharing.
Bye bye.

My pleasure, hope it is helpful to you.

Hi Jim,

I am on Linux, but today I must look at your code…
I have a bi-quad Q9600 and an Nvidia GT 630 with 4 GB. Not the same as yours, but I want to try it.
Do you use “soften NetRender”, please?
I made the translation into French and I understand you: very useful!

I use Blender’s built-in netrender addon, if that is what you mean. I am unsure what you mean by “soften netrender”. Do you mean “software” or something else?

No, I made a mistake. I meant to ask:
Do you use the NetRender addon often?
With a lot of other people, I imagine… I don’t use it myself because it’s quite difficult alone…

Netrender is only helpful if you have the following two things:

  • a lot of work to render
  • at least one extra computer to render on (hopefully more than one)

Netrender comes bundled with Blender, and it is not that hard for me to use… but I have been using it a lot, so I am used to it.

But I think it is not too hard to use. Do you think it is difficult? Do you need help?

Oh! A little difficult, because I can’t work with several other computers.
(At home) I can use another computer (Linux) with Blender, but it doesn’t have a good graphics card.
Let me explain: I have another computer with Ubuntu 10, and Blender doesn’t run on it because its graphics card is too weak.

On my main computer all is fine. So I thought: over the network, I could use the other one for CPU rendering only.
What do you think about that?

So, can you reverse it?

My main computer is a laptop with no graphics card. I model and test on the laptop, and I use my more powerful computer for rendering with netrender: I send renders to the netrender queue. If I want to, I can also make my laptop a netrender slave, even though it is not powerful.

I use my weak computer as my main workstation, not the strong one. Maybe you could do the same.