Netrender with GPU

First off, I’m a noob when it comes to Blender.

After a lot of playing around I FINALLY got netrender to work, or should I say I finally understand how it works… sort of. I got netrender to work using Cycles… OK, good.

But what I really want is for the slave machine to use its GPU, not the CPU. I “think” I have set everything up correctly (although the fact that it’s not working is my first clue that I didn’t). I have successfully rendered locally with the GPU on the “slave” machine, but when I send something from a client machine it seems to use the CPU. This is based on the fact that it takes a long time to render and in the system monitor both cores of the CPU sit at 98–100% usage. I may be missing something simple, but any input would be greatly appreciated.
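
In case it helps narrow things down, here is roughly what I run in Blender’s Python console on the slave to see which device Cycles is set to. The property names are what I understand the 2.6x Python API to use, so treat this as a rough sketch rather than a definitive check:

```python
import bpy

# User preferences for the compute device (2.6x API names, assumed).
sys_prefs = bpy.context.user_preferences.system
print("compute_device_type:", sys_prefs.compute_device_type)  # 'CUDA' if the GPU is enabled
print("compute_device:", sys_prefs.compute_device)            # e.g. 'CUDA_0'

# The scene also has its own CPU/GPU switch.
print("scene device:", bpy.context.scene.cycles.device)       # 'GPU' or 'CPU'
```

My question, I guess, is whether a netrender job even respects these settings on the slave.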

I have a simple renderfarm setup with the idea that once it works the way I want, I will scale it up from there. Setup:

All machines are running Blender 2.63a 64-bit

slave:
Dell 745 SFF Core2Duo 2.0GHz 4GB RAM NVIDIA Quadro 600
Running Ubuntu 12.04 64-bit

master:
MacBook Pro 2.4GHz 2GB RAM
Running Ubuntu 12.04 64-bit

client:
MacPro 8core 3.2GHz 18GB RAM
Running OSX 10.6.8

Thanks in advance for any thoughts or suggestions.

bump :smiley:
I also need to know if this is possible.

I have tried something similar (with no success, but I think it was a problem with the EC2 server I tested it on).

With some help from DingTo, we wrote a script to set the GPU for rendering.

http://www.pasteall.org/39314/python

Paste this script into the text editor of the blend file (client) that you want to render.
Name the text datablock something ending in .py (maybe GPU_select.py) and enable the “Register” flag in the text editor’s header.

Now the slaves should render with the CUDA GPU. Take a look at the slaves’ consoles; the rendering device will be printed there, e.g. “CUDA_0”.
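
For anyone who can’t open the pasteall link: below is a rough reconstruction of what such a GPU-select script could look like, based on the 2.6x Python API. The exact code behind the link may differ, and the property names and the 'CUDA_0' device name are assumptions for a single-GPU slave:

```python
# Rough sketch of a GPU_select.py register script (not the exact pasteall code).
# Because the "Register" flag is set, Blender runs this automatically when the
# slave loads the .blend it received from the master.
import bpy

def enable_cuda():
    sys_prefs = bpy.context.user_preferences.system
    # Tell Cycles to use CUDA for compute (2.6x user-preferences API, assumed).
    sys_prefs.compute_device_type = 'CUDA'
    # Pick the first CUDA device; on a single-GPU slave this is 'CUDA_0'.
    sys_prefs.compute_device = 'CUDA_0'
    # Switch every scene in the file from CPU to GPU rendering.
    for scene in bpy.data.scenes:
        scene.cycles.device = 'GPU'
    # This is the line you should see echoed in the slave's console.
    print("Cycles rendering device:", sys_prefs.compute_device)

enable_cuda()
```

If a slave has more than one GPU, the compute_device value would need to be adjusted for that machine.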