Motion tracking and camera zoom

Hey guys. Hope you are all good.

Now to the point: I am using PFTrack and Blender to make a compositing film. Everything is OK so far, but I keep running into two problems that take way too much time to solve, and I am probably solving them the wrong way.

1st: When I have a shot in which I zoom in with the camera, the object tends to drift as if it is flying around in the video, and the only way to keep it in a steady position is to play with the axis values (probably the wrong way). Is there a way to keep the object stable while the camera zooms in? (Of course the tracking, focal-length estimation and motion solve are all done before I import into Blender.)

2nd: How can I make an object move or sit behind objects in the video? For example, say I have a little character walking on the table. Can I make him walk behind a bottle that I have in the shot, rather than in front of it? I couldn't find anything about it, unfortunately.

Hope my questions were understandable enough.

Thanks in advance for your time and effort.

  1. A tracking tool like Voodoo knows how to change Blender’s lens size to match the zoom of the real camera. I am sure PFTrack has a similar feature, but I cannot say what it is, as I do not use PFTrack. I have not tracked a zoom with Icarus (pre-PFTrack), but look for something that sets the lens size per frame (see the lens-keyframe sketch after this list).

  2. Best to shoot the bottle in front of a green screen and then comp it as the front-most layer. Or make and track a mask of the bottle, using it as an alpha mask to mask out your scene of the guy walking (see the node sketch below). The other way is to make your own dynamic difference mask, which I did for the film I am working on, but it’s pretty complicated.
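To make point 1 concrete: the idea is to animate the Blender camera’s focal length frame by frame from whatever the tracker (or your shoot notes) gives you. This is only a rough sketch using the newer 2.5x-style Python API rather than 2.49b, and the zoom_curve data and camera name are made up, so treat it as a direction rather than a paste-and-run script:

```python
import bpy

# Hypothetical per-frame focal lengths in mm, e.g. exported from the
# tracker or copied from a zoom log recorded on set.
zoom_curve = [(1, 35.0), (25, 42.5), (50, 55.0), (75, 70.0)]

# The camera data-block your solved camera uses ("Camera" is a placeholder name).
cam = bpy.data.cameras["Camera"]

for frame, focal_mm in zoom_curve:
    cam.lens = focal_mm
    cam.keyframe_insert(data_path="lens", frame=frame)
```

With the lens animated to follow the real zoom, the object should stay locked to the plate without fiddling with the axis values.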
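And for point 2, the alpha-mask route looks roughly like this in a 2.5x-style compositor tree: render the character with alpha, lay him over the plate, then lay the bottle (cut out of the plate with a roto matte) back on top. The node types are real compositor nodes, but the hookup is just a sketch and you would still have to load your footage and matte into the Image nodes:

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

plate = tree.nodes.new("CompositorNodeImage")   # live-action footage (load your sequence here)
char = tree.nodes.new("CompositorNodeRLayers")  # rendered CGI character, with alpha
matte = tree.nodes.new("CompositorNodeImage")   # roto matte of the bottle: white = bottle

# Character composited over the live plate
char_over = tree.nodes.new("CompositorNodeAlphaOver")
tree.links.new(plate.outputs["Image"], char_over.inputs[1])
tree.links.new(char.outputs["Image"], char_over.inputs[2])

# Cut the bottle back out of the plate, using the matte as its alpha...
bottle_cut = tree.nodes.new("CompositorNodeSetAlpha")
tree.links.new(plate.outputs["Image"], bottle_cut.inputs["Image"])
tree.links.new(matte.outputs["Image"], bottle_cut.inputs["Alpha"])

# ...and lay it back on top, so the character walks behind the bottle
final = tree.nodes.new("CompositorNodeAlphaOver")
tree.links.new(char_over.outputs["Image"], final.inputs[1])
tree.links.new(bottle_cut.outputs["Image"], final.inputs[2])

out = tree.nodes.new("CompositorNodeComposite")
tree.links.new(final.outputs["Image"], out.inputs["Image"])
```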

One note about zooming when matching real-life video to CGI – make sure you get as close as you can to the actual focal length of the video camera used for the live action footage. If the Blender camera and the real camera are mismatched significantly in focal length, the perspective of the live scene and the Blender scene will not match, and for some subjects, this can make a successful and convincing composite much more difficult.

The Blender “Lens” setting in 2.49b (I haven’t yet tested 2.5x since the API isn’t yet stable) is not an accurate match for the focal length (in mm) of a real camera lens. This can lead to perspective mismatches, so I wrote a script to help out: BLenses. It may be of some assistance in getting a good focal length/perspective match after PFTrack does its thing.
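If you want to sanity-check the numbers yourself, what actually has to match between the real camera and the Blender camera is the angle of view, not the raw Lens number. A little sketch of that math, with placeholder values for the real sensor width and for whatever film-back width your Blender build assumes internally:

```python
import math

def horizontal_aov(focal_mm, sensor_width_mm):
    """Horizontal angle of view, in degrees, for a simple pinhole model."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

def matching_blender_lens(real_focal_mm, real_sensor_mm, blender_sensor_mm):
    """Blender lens value that gives the same horizontal angle of view."""
    aov = math.radians(horizontal_aov(real_focal_mm, real_sensor_mm))
    return blender_sensor_mm / (2.0 * math.tan(aov / 2.0))

# Placeholder numbers: a 50 mm lens on a 36 mm-wide sensor, mapped onto an
# assumed 32 mm internal film-back width in Blender (check your version).
print(horizontal_aov(50.0, 36.0))               # ~39.6 degrees
print(matching_blender_lens(50.0, 36.0, 32.0))  # ~44.4
```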

Also note that any camera track/motion track software can only estimate focal length based on data points in the live footage. If this estimate is faulty, it can lead to the “flying” effect you see during zooms, because the live camera and the CGI camera focal lengths do not match properly. Better to record focal lengths/zooms during live shoots when possible, and use this data to correct for what the tracking 'ware churns out. Some cameras record such information as metadata, simplifying that process. The BLenses site mentions this on page 2.
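As a rough illustration of using that recorded data: jot down the focal length at a few key frames of the zoom (or pull it from the metadata), interpolate it into a per-frame curve, and use that to drive or sanity-check what the solver estimated. The numbers below are placeholders:

```python
# Sparse focal lengths noted during the shoot: (frame, focal length in mm).
shot_log = [(1, 35.0), (40, 50.0), (100, 80.0)]

def focal_at(frame, log):
    """Linearly interpolate the logged zoom to get a focal length per frame."""
    if frame <= log[0][0]:
        return log[0][1]
    for (f0, mm0), (f1, mm1) in zip(log, log[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / float(f1 - f0)
            return mm0 + t * (mm1 - mm0)
    return log[-1][1]

# Per-frame curve you can keyframe onto the Blender camera (see the sketch
# above), or compare against the solver's estimate to spot bad frames.
zoom_curve = [(f, focal_at(f, shot_log)) for f in range(1, 101)]
```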

I had some similar issues using Voodoo to do some CGI composite testing, and ended up having to correct for both focal length and camera motion to a significant degree.

PapaSmurf, a dynamic difference mask!? Via mocap track data? I often wonder if you could do that with remapped point-cloud data extracted from a track session. That is, reconstruct the background from an assembled, volumetric version of the scene, then diff-matte the foreground subject out. Not likely, I guess, but I have seen image-stabilising software that reconstructs a camera move and reshoots, so perhaps?
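For what it’s worth, the arithmetic of a difference matte itself is simple; the hard part is getting a clean, aligned background to diff against (which is where the reconstruction idea would come in). A toy sketch assuming you already have such a clean plate, with made-up names and threshold:

```python
import numpy as np

def difference_matte(frame, clean_plate, threshold=0.08):
    """Rough foreground alpha: 1.0 where the frame differs enough from the
    clean background plate, 0.0 elsewhere. Both inputs are float RGB arrays
    in the 0..1 range with the same shape."""
    diff = np.linalg.norm(frame - clean_plate, axis=-1)  # per-pixel colour distance
    return (diff > threshold).astype(np.float32)

# Example with random stand-in images; real use would load aligned frames.
h, w = 480, 640
clean = np.random.rand(h, w, 3).astype(np.float32)
shot = clean.copy()
shot[200:280, 300:360] += 0.5  # pretend a character occupies this region
alpha = difference_matte(np.clip(shot, 0.0, 1.0), clean)
```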