First off, I’m a noob when it comes to Blender.
After a lot of playing around I FINALLY got netrender to work, or should I say I finally understand how it works… sort of. I got netrender rendering with Cycles… ok, good.
But what I really want is for the slave machine to use its GPU, not the CPU. I “think” I have everything set up correctly (although the fact that it’s not working is my first clue that I don’t). I have successfully rendered locally with the GPU on the “slave” machine, but when I send a job from a client machine it seems to use the CPU instead: the render takes a long time, and in System Monitor both CPU cores sit at 98%-100% usage. I may be missing something simple, so any input would be greatly appreciated.
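In case it helps pinpoint what I’ve missed, here is my rough understanding of the settings involved, written as the Python equivalent of what I toggle in the UI on the slave. Treat this as a sketch of my assumptions from poking at the 2.63 API (the 'CUDA' / 'CUDA_0' values are what I believe apply to my card), not necessarily what netrender actually does under the hood:

import bpy

# CUDA settings from User Preferences > System, as I understand the 2.63 API
prefs = bpy.context.user_preferences.system
prefs.compute_device_type = 'CUDA'   # enable CUDA compute on this machine
prefs.compute_device = 'CUDA_0'      # the Quadro 600 should be the first (only) CUDA device

# Per-scene Cycles setting that switches rendering from CPU to GPU
bpy.context.scene.cycles.device = 'GPU'

What I can’t tell is whether the scene-level GPU setting travels with the job from the client, or whether the slave falls back to its own (CPU) defaults when it picks the job up.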
I have a simple render farm setup, with the idea that once it works the way I want, I will scale it up from there. Setup:
All machines are running Blender 2.63a 64-bit
slave:
Dell 745 SFF Core2Duo 2.0GHz 4GB RAM NVIDIA Quadro 600
Running Ubuntu 12.04 64-bit
master:
Macbook Pro 2.4GHz 2GB RAM
Running Ubuntu 12.04 64-bit
client:
MacPro 8core 3.2GHz 18GB RAM
Running OSX 10.6.8
Thanks in advance for any thoughts or suggestions.