Is anyone of you aware of an implementation that supports distributed real-time rendering of a BGE scene? I’d like to be able to render different camera perspectives on different machines in active stereo. This would allow me to use Blender as a virtual-reality system in our CAVE.
No, the Blender BGE does not support distributed rendering.
But my suggestion is to use UDP communication via Python scripts (WSAG, BZoo, etc.) to share the position data between the PCs.
Use one PC as the master. This PC gets all the input, calculates the physics, and runs the UDP send script.
The other PCs are slaves and render the BGE scene. These PCs receive the position data from the master with the UDP receive script.
The stereo part should be done by a single BGE instance. Keeping the left and right eyes in sync is critical.
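As an illustration of the master/slave idea above, position updates can be packed into small binary UDP datagrams with Python's standard `socket` and `struct` modules. This is only a minimal sketch, not WSAG or BZoo code; the object ID, the packet layout, and the port number are my own assumptions, and in a real setup the send/receive calls would run inside the BGE logic loop.

```python
import socket
import struct

PORT = 9999  # hypothetical port, pick any free one

def send_position(sock, addr, obj_id, x, y, z):
    # Pack an object ID and its position into a fixed-size datagram
    # (network byte order: unsigned int + three floats = 16 bytes).
    sock.sendto(struct.pack("!Ifff", obj_id, x, y, z), addr)

def recv_position(sock):
    # Receive one datagram and unpack it back into (id, x, y, z).
    data, _ = sock.recvfrom(16)
    return struct.unpack("!Ifff", data)

# Demo over loopback: in practice the master sends, each slave receives.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", PORT))
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_position(send_sock, ("127.0.0.1", PORT), 1, 1.0, 2.0, 3.0)
obj_id, x, y, z = recv_position(recv_sock)
recv_sock.close()
send_sock.close()
```

On the slave, the received values would typically be applied to the matching object with `obj.worldPosition = (x, y, z)` each frame.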
Hmm. It would have made my work a lot easier if there were such an implementation.
Sure, the alternative is to write a Python script that communicates changes between the Blender instances. Synchronization of the left/right-eye rendering is a big issue, though. Our current rendering system (based on OpenSG) uses a framelock mechanism for this.
Does anyone know of a mechanism in Blender that does that?
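For illustration only: a software approximation of framelock is a barrier where every slave reports that its frame is rendered and then blocks until the master releases all of them at once. This is not a Blender or OpenSG feature, just a hypothetical sketch with Python's standard `socket` and `threading` modules; the message strings and slave count are my own assumptions.

```python
import socket
import threading

NUM_SLAVES = 2  # hypothetical number of render slaves

def master_barrier(srv, num_slaves):
    # Master side: accept one connection per render slave, wait until
    # every slave reports READY, then broadcast SWAP so all displays
    # flip their buffers in the same refresh.
    conns = [srv.accept()[0] for _ in range(num_slaves)]
    for c in conns:
        c.recv(5)           # block until this slave has rendered
    for c in conns:
        c.sendall(b"SWAP")  # release all slaves at once
    for c in conns:
        c.close()
    srv.close()

def slave_sync(port, results, idx):
    # Slave side: after rendering its frame, report READY and block
    # until the master signals the synchronized buffer swap.
    s = socket.socket()
    s.connect(("127.0.0.1", port))
    s.sendall(b"READY")
    results[idx] = s.recv(4)
    s.close()

# Demo over loopback: one master, two slaves, one frame.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))  # OS picks a free port
port = srv.getsockname()[1]
srv.listen(NUM_SLAVES)
results = [None] * NUM_SLAVES
threads = [threading.Thread(target=master_barrier, args=(srv, NUM_SLAVES))]
threads += [threading.Thread(target=slave_sync, args=(port, results, i))
            for i in range(NUM_SLAVES)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Note that a TCP barrier like this only synchronizes at frame granularity; genuine active-stereo framelock (genlock) needs hardware support to align the actual display refreshes.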
The BGE already supports various stereo modes. You can find them in the Render panel, I think.
I do not think that framelock is supported. The stereo modes are usually for a single display only and describe how to produce the stereo image on it.
There was a similar thread some months ago. You might want to search for it.