I want to make a marker-based facial motion capture script for Blender 2.49 (and will possibly port it to 2.6 when it's out).
I'm going to use the Python Imaging Library to pick a marker color from a recorded video and apply contrast to isolate the approximate marker positions. Then, for each frame, I'll extract the marker positions, create empties for the markers in Blender, apply the motion data to the empties, and smooth the results. Later, bones etc. could inherit the motion from the empties via constraints or parenting.
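To make the color-keying step concrete, here's a minimal sketch of what I have in mind for finding one marker in a frame with PIL. The marker color, tolerance, and the synthetic test frame are just placeholder assumptions; a real version would run this per frame and per marker:

```python
from PIL import Image

def find_marker(img, target=(255, 0, 0), tol=60):
    """Return the centroid (x, y) of pixels close to `target`, or None.

    Sketch of the color-keying idea: every pixel within `tol`
    (per channel) of the marker color counts as part of the marker;
    the centroid approximates the marker position in the frame.
    """
    px = img.load()
    w, h = img.size
    xs, ys = [], []
    for y in range(h):
        for x in range(w):
            r, g, b = px[x, y][:3]
            if (abs(r - target[0]) <= tol and
                    abs(g - target[1]) <= tol and
                    abs(b - target[2]) <= tol):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Synthetic test frame: a 4x4 red "marker" on a gray background.
frame = Image.new("RGB", (64, 64), (128, 128, 128))
for y in range(20, 24):
    for x in range(30, 34):
        frame.putpixel((x, y), (255, 0, 0))

print(find_marker(frame))  # centroid near (31.5, 21.5)
```

The per-frame centroids would then be keyframed onto the empties in Blender (with some smoothing pass over the resulting curves).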
I would also like to make the script work in realtime: record video from a camera, send the frames directly to Blender, show them as a background image in the 3D view (or at least in a separate window), and let users record animations live.
Firstly, I couldn't find any way to grab frames from a camera with PIL — do you guys know of another library that can? Also, is it possible to send video to Blender and view it there in realtime?