To save post-production time and allow more artistic control, I have decided to render each frame of my animation to multiple files, one file per render pass. So, for example, the color, Z depth, and vector passes would each be rendered to a different file: I render each frame once and get three images per frame. (I know that I can render the scene multiple times to generate each pass, but that defeats the purpose. The idea here is to save out all the data from a single render of each frame.)
Why would I want to do this? So the passes can later be re-imported into Blender as image sequences and recombined using composite nodes. That way I would not have to re-render the scene to test my composite settings; Blender can simply read in the image sequences containing the render-pass data. This could be a huge time-saver!
My workflow would look like this:
- Render out each frame of the animation, with the “Image”, “Z”, and “Speed” outputs of the Render Layer composite node each piped into its own file.
- Re-import the three image sequences using Image composite nodes.
- Use composite nodes to recombine the three channels as desired. Tweak settings and repeat until it looks right.
- Render out the final composition as a movie or sequence of normal images.
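To make the bookkeeping in Steps 1 and 2 concrete, here is a minimal sketch of the “three files per frame” idea in plain Python. The directory layout and file names are purely hypothetical examples of how the per-pass sequences could be organized, not anything Blender itself produces:

```python
# One sub-directory per render pass; each frame writes one file into each.
# The pass names, paths, and frame-number padding are illustrative only.
PASSES = ["color", "z", "speed"]

def pass_paths(frame):
    """Return one output path per render pass for a given frame number."""
    return {p: f"render/{p}/{frame:04d}.exr" for p in PASSES}

# Each rendered frame yields three images, one per pass:
for frame in range(1, 4):
    files = pass_paths(frame)
    # frame 1 -> render/color/0001.exr, render/z/0001.exr, render/speed/0001.exr
```

With a layout like this, Step 2 is just pointing one Image node at each sub-directory and letting the frame numbers line the sequences up.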
Steps 2-4 are easy to do. It is only one part of Step 1 that I do not know how to do: separating the channels is easy, but I do not know how to save out more than one file per render.
I have tried making a second scene that takes a Render Layer from the first scene and uses composite nodes to extract and save the Speed channel, but I have not managed to get both scenes “rendered”/composited when I render the animation.
I tried linking the scenes together to make a set (in the Scene: Output tab of the button bar), but only the current scene’s composite result seems to be computed and saved to file. (Perhaps I am not using the linked-sets feature correctly, though.)
I have described my problem and my attempts at solving it. Has anyone succeeded in doing what I want to do, or can anyone suggest a new direction from which to approach this problem? I think it would benefit everyone in the community to know how to do this, if it is possible.
If it turns out that this is not currently possible, then I suggest a new type of node (or an improvement to the Composite-type node) that would save the data it is being fed as an image in a specified directory.
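To illustrate what I mean, here is a toy sketch of what such a node’s save step might do, written as a plain Python function that dumps an RGB buffer to disk. The function name, path scheme, and file format (PPM, chosen only because it is trivial to write) are purely illustrative; a real node would of course write Blender’s supported image formats:

```python
import os

def save_pass(pixels, width, height, directory, name, frame):
    """Toy stand-in for the proposed node: write whatever buffer it is
    fed to <directory>/<name>.<frame>.ppm. 'pixels' is a flat, row-major
    list of R, G, B byte values. All naming here is hypothetical."""
    os.makedirs(directory, exist_ok=True)
    path = os.path.join(directory, f"{name}.{frame:04d}.ppm")
    with open(path, "wb") as f:
        f.write(f"P6 {width} {height} 255\n".encode())  # binary PPM header
        f.write(bytes(pixels))                          # raw RGB payload
    return path
```

The point is simply that the node would take an image input plus a directory setting, and emit one file per frame with the frame number in the name, so the result can be picked straight back up as an image sequence.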
Maybe this is already in the Blender development pipeline, but I don’t know about it… I hope so, anyway!