I’ve been hearing a lot about network rendering, and was wondering: is it just for animations, or can you make it so multiple machines work on one image at the same time?
Also, do the computers all have to be the same? Or can they just be a bunch of random ones?
As far as I know, until Blender offers “parts rendering” on the command line, most distributed rendering options are only useful for animations. The controller cannot divide one image into separate parts to distribute to the agents (the separate computers) in the cluster.
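That said, you can fake “parts rendering” yourself by scripting Blender’s border-render settings, so each agent renders one horizontal strip and the controller stitches the strips together afterwards. Here is a rough sketch of the splitting logic; the `tile_borders` helper is my own, and the commented `--python-expr` invocation assumes a newer Blender build that supports it, so treat it as an idea rather than a ready-made feature:

```python
# Split the frame into n equal horizontal strips, expressed as
# normalized (ymin, ymax) coordinates suitable for Blender's
# border-render settings (render.border_min_y / border_max_y).
def tile_borders(n):
    return [(i / n, (i + 1) / n) for i in range(n)]

# Each agent k out of n would then run something like (untested sketch,
# assumes a Blender build with the --python-expr option):
#
#   blender -b scene.blend --python-expr \
#     "import bpy; r = bpy.context.scene.render; \
#      r.use_border = True; r.use_crop_to_border = True; \
#      r.border_min_y, r.border_max_y = YMIN, YMAX" \
#     -o //strip_k_ -f 1
#
# where (YMIN, YMAX) is the k-th pair returned by tile_borders(n).
```

The controller would hand each agent a different strip and composite the cropped outputs back into one image when they all finish.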
It depends on what technology you are using. I use Xgrid, so only Macs and UNIX/Linux machines can join the cluster.
Other solutions use different technology. Many of the current Blender offerings are Python based, so as long as things are set up correctly they are cross-platform.
I only have Blender distributed rendering experience with Xgrid on Macs and *NIX machines. Maybe someone else can answer about the other platforms…
which Blender do you use? the main tree? what about the instinctive version? as far as i know that one can split an image into smaller parts and let slaves render them!
I am excited to try the instinctive version, but I have been too busy recently and haven’t had a chance. It also offers some other features I have been looking for in the official builds, so hopefully I will get to it soon…
Claas, you’re a Mac user… how is the new instinctive Mac version?
mh, the version works quite fine, which i cannot say for the official Blender version! the only odd thing is that the interface somehow doesn’t draw some elements, so it seems to have drawing problems. but i don’t know if this is a bug or actually just a color theme issue!