Script to render an animation to .jpgs, then encode these to video?

I need to write a script to capture the animation of a camera flyby around a central object into .jpg images (say 1000). Then these need to be encoded into a video file like .avi, .ogg, .mpeg4, etc… What is the best scripting approach to doing this?

The best I can come up with so far is to:

  • Use a script to write rendered images to a directory
  • Use gstreamer + ffmpeg plugin to encode them

Has anyone done this sort of thing before?
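
Here's roughly what I have in mind, though I've swapped gstreamer for a plain ffmpeg command line since that's easier to call from a script. The paths, frame rate, and frame count are placeholders, ffmpeg is assumed to be on the PATH, and the property names are from a 2.6-era Python API:

    import subprocess
    import bpy

    scene = bpy.context.scene
    scene.frame_start = 1
    scene.frame_end = 1000

    # Write each frame of the flyby out as a numbered JPEG.
    # (2.6x name; older 2.5x builds use scene.render.file_format instead.)
    scene.render.image_settings.file_format = 'JPEG'
    scene.render.filepath = "/tmp/flyby/frame_"  # becomes frame_0001.jpg, ...
    bpy.ops.render.render(animation=True)

    # Hand the numbered frames to ffmpeg for encoding.
    subprocess.check_call([
        "ffmpeg", "-y",
        "-r", "25",                         # frame rate
        "-i", "/tmp/flyby/frame_%04d.jpg",  # input pattern
        "/tmp/flyby.avi",
    ])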

I don’t understand why. You have your animation set up in Blender, so why take the detour through the images instead of just rendering the animation?

I want to do this to automate the process of viewing 3D models. With this script, I can load any arbitrary 3D model and make a video of it immediately, instead of setting up the scene each time.
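
To give an idea of the setup I want to automate, here's a rough turntable sketch. It assumes the 2.5x/2.6x Python API (objects link directly into the scene), the OBJ importer is enabled, and the camera is already placed away from the origin looking at the object; the file path is a placeholder:

    import math
    import bpy

    # Import an arbitrary model (OBJ here; the path is a placeholder).
    bpy.ops.import_scene.obj(filepath="/tmp/model.obj")

    scene = bpy.context.scene
    camera = scene.camera

    # Parent the camera to an empty at the origin and spin the empty
    # 360 degrees over the animation for a simple turntable flyby.
    pivot = bpy.data.objects.new("FlybyPivot", None)
    scene.objects.link(pivot)
    camera.parent = pivot

    scene.frame_start = 1
    scene.frame_end = 1000
    pivot.rotation_euler = (0.0, 0.0, 0.0)
    pivot.keyframe_insert(data_path="rotation_euler", frame=1)
    pivot.rotation_euler = (0.0, 0.0, 2.0 * math.pi)
    pivot.keyframe_insert(data_path="rotation_euler", frame=1000)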

Not sure if this will be possible in 2.5x; I know a lot of things are accessible, but I'm not sure to what degree.

Anyway:

  • Render JPEGs to a specific folder
  • When the frame counter reaches 1000 (or whatever the frame count is)
  • Load the images into the sequencer as an image sequence
  • Change the output to a movie codec
  • Check the Sequencer option and render frames 1 - 1000

I’m thinking of the above approach as it should all be possible in a Python script. I say possible because I don’t know if the API(s) give you that kind of access.
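
A rough sketch of what that script might look like, assuming the sequencer is scriptable at all (the property names below are from a 2.6-era API and may well differ in 2.5x, so treat them as guesses):

    import os
    import bpy

    IMG_DIR = "/tmp/flyby"  # folder of numbered JPEGs (placeholder)

    scene = bpy.context.scene
    seq = scene.sequence_editor_create()

    # Load the JPEGs as a single image-sequence strip on channel 1.
    frames = sorted(f for f in os.listdir(IMG_DIR) if f.endswith(".jpg"))
    strip = seq.sequences.new_image(
        name="flyby",
        filepath=os.path.join(IMG_DIR, frames[0]),
        channel=1,
        frame_start=1,
    )
    for name in frames[1:]:
        strip.elements.append(name)

    # Switch the output to a movie codec, enable the sequencer, and render.
    scene.frame_start = 1
    scene.frame_end = len(frames)
    scene.render.use_sequencer = True
    scene.render.image_settings.file_format = 'FFMPEG'
    scene.render.ffmpeg.format = 'MPEG4'
    scene.render.filepath = "/tmp/flyby.mp4"
    bpy.ops.render.render(animation=True)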

It might be worth checking on the developers' IRC channel.

This is already possible with the render viewport button:

http://dl.dropbox.com/u/1742071/Capture.PNG
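
If you need to trigger the same thing from a script, the OpenGL render operator should do it. A minimal sketch; run it inside a normal Blender session with a 3D view open, and treat the output path as a placeholder:

    import bpy

    # Scripted equivalent of the viewport render button: an OpenGL preview
    # render of the whole frame range, written to the render output path.
    bpy.context.scene.render.filepath = "/tmp/preview_"
    bpy.ops.render.opengl(animation=True)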

Why not make a film directly and then, if needed, modify it (cut, resize, etc.) outside of Blender?
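
The cutting and resizing can also be scripted with ffmpeg, for example (times, size, and paths are placeholders):

    import subprocess

    # Cut 30 seconds starting at 0:10 and scale the width to 640 px,
    # keeping the aspect ratio (scale=640:-1).
    subprocess.check_call([
        "ffmpeg", "-y",
        "-i", "/tmp/flyby.avi",
        "-ss", "10", "-t", "30",
        "-vf", "scale=640:-1",
        "/tmp/flyby_cut.avi",
    ])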