I have a question about multiple scenes in Blender, specifically whether they can support an idea I’m working on at the moment. The easiest way to ask might be to explain what I’d like to be able to do:
I’d like to make two scenes: in the first I’d put my environment, characters, and so on. In the second scene I’d have a single model that will probably be quite complex (ok, it’s a tree).
So I animate a camera flying around the main scene (the environment), and in the tree scene I have another camera that orbits around the tree to match the rotation of the camera in the main scene.
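To make the matching concrete, here’s a rough sketch of the math I have in mind (pure Python, all names are mine; in Blender this would presumably live in a driver or a frame-change handler on the tree-scene camera):

```python
import math

def tree_cam_position(main_cam, billboard_pos, orbit_radius):
    """Place the tree-scene camera so it views the tree from the same
    direction that the main camera views the billboard.

    main_cam / billboard_pos are (x, y, z) tuples in main-scene world
    space; the tree in the tree scene is assumed to sit at the origin.
    """
    # Direction from the billboard back toward the main camera
    d = tuple(c - b for c, b in zip(main_cam, billboard_pos))
    length = math.sqrt(sum(v * v for v in d))
    unit = tuple(v / length for v in d)
    # Same viewing direction, but at a fixed orbit radius
    # around the tree-scene origin
    return tuple(v * orbit_radius for v in unit)

# Main camera due "east" of the billboard at distance 20:
print(tree_cam_position((20.0, 0.0, 0.0), (0.0, 0.0, 0.0), 5.0))
# -> (5.0, 0.0, 0.0)
```

The tree camera would then just need to point back at the origin (a Track To constraint, say) to stay aimed at the tree.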
The last part of my plan is to set up a rendering process so that, for each frame, the tree scene renders first to a specific PNG file, and then the main scene renders. BUT in my main scene I have duplivert billboards that use the PNG just rendered from the tree scene (for that frame) as their texture.
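The per-frame ordering I’m imagining looks roughly like this (a sketch only: `render_scene` is a stand-in for an actual Blender render call, e.g. via `bpy`; here it just logs what would happen so the sequencing is clear):

```python
# Records the order in which renders would be triggered
render_log = []

def render_scene(scene_name, output_path):
    # Stand-in for a real render call; just logs the request
    render_log.append((scene_name, output_path))

def render_frame(frame):
    # Hypothetical output paths, numbered per frame
    tree_png = f"/tmp/tree_{frame:04d}.png"
    # 1. Render the tree scene to this frame's PNG...
    render_scene("TreeScene", tree_png)
    # 2. ...then render the main scene, whose billboard
    #    material points at that same PNG path.
    render_scene("MainScene", f"/tmp/main_{frame:04d}.png")
    return tree_png

for frame in range(1, 4):
    render_frame(frame)

print(render_log[0])  # -> ('TreeScene', '/tmp/tree_0001.png')
```

The key constraint is that the tree render for frame N must finish (and the billboard texture must reload) before the main scene renders frame N.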
So the effect is: when you fly around the trees, even though they’re only billboards, the billboard texture is re-rendered each frame to match the new camera angle, so the trees appear volumetric.
Does this make sense? (I hope I explained the idea well enough.) Can scenes share object properties like this (i.e. the camera angles)? Can Blender do ‘synchronized rendering’ from multiple scenes like this?