Does motion tracking work with optical zoom?

I’m playing around with Blender’s built-in motion tracker, and so far the tracking has been excellent! (In a tripod scene, I’m getting a camera solve error below 6!)

However, there’s a scene that zooms in on an object. Can the motion tracker handle that zoom, or should I look to other software to do the tracking? (Preferably free!)

Distortion will change, but a zoom is effectively a crop or resize of the original image, with no change in parallax. I don’t know, though, how Blender would handle the moment of the zoom.
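The “no change to parallax” point can be seen from the standard pinhole projection model: changing the focal length scales every projected coordinate by the same factor, so the relative positions of near and far points are unchanged. A minimal sketch (the point coordinates and focal lengths below are made-up illustration values):

```python
# Pinhole projection: a 3D point (X, Y, Z) in camera space projects to
# x = f * X / Z, y = f * Y / Z on the image plane.

def project(point, focal_length):
    X, Y, Z = point
    return (focal_length * X / Z, focal_length * Y / Z)

# Two points at different depths (arbitrary example values).
near = (1.0, 0.5, 2.0)
far = (1.0, 0.5, 8.0)

for f in (35.0, 70.0):  # zooming in doubles the focal length
    x_near, _ = project(near, f)
    x_far, _ = project(far, f)
    # The ratio between the two projected positions depends only on depth,
    # not on focal length: zooming just scales the whole image uniformly.
    print(f, x_near / x_far)
```

Both focal lengths print the same ratio, which is exactly why a pure zoom looks like a resize rather than a camera move.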

Did you mean < 0.6 solve error? Quite frankly, 3 is considered the minimum for a decent solve, 1 is a good solve, and I am never happy unless my solves are in the 0.5 to 0.6 range… if I could turn off OIS on my camera, I could do even better.

To be honest, I haven’t worked with a tripod shot yet, but I was just working on zooming last night, and no, the focal length of the tracking camera cannot be animated (you can’t insert a keyframe… but I think I found a workaround; still testing). The big problem here is: how are you going to find the proper focal lengths to give to Blender? Let’s start with this: what camera are you using, and what focal length and sensor width are you using?

Nope, it seemed fine to me, but yes, it’s a 6. Reading the documentation again… that’s pretty bad! :frowning:
(It was getting 100+ before, so I considered it good.)

The camera is a Canon Vixia camcorder - I will get you an exact model soon.
As for those measurements, I’m not terribly familiar with them - our filming is more of a quick point-and-shoot kind of thing.
I think those measurements will be available once I get the model number, which should point to an exact Canon Vixia spec page.

Focal length change (zooming) is not supported by the camera solver… which is why you are getting a horrible solve. Your best bet is to do two tracks, one before the zoom and one after, align them both to your scene, merge your two cameras together, and interpolate between the start point and end point, manually tracking your scene as refinement.
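The interpolation step can be done as a simple linear blend between the two solved focal lengths; in Blender you would then keyframe the scene camera’s lens value at the matching frames. A sketch of the blend (the frame numbers and focal lengths below are hypothetical, and a real zoom may not be linear, which is why the manual refinement pass is still needed):

```python
def lerp_focal_length(f_start, f_end, frame, frame_start, frame_end):
    """Linearly interpolate a focal length across the zoom range.

    f_start / f_end: solved focal lengths before and after the zoom.
    frame_start / frame_end: frames where the zoom begins and ends.
    """
    if frame <= frame_start:
        return f_start
    if frame >= frame_end:
        return f_end
    t = (frame - frame_start) / (frame_end - frame_start)
    return f_start + t * (f_end - f_start)

# Hypothetical example: a zoom from 37mm to 55mm over frames 100-150.
for frame in (90, 100, 125, 150, 160):
    print(frame, lerp_focal_length(37.0, 55.0, frame, 100, 150))
```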

It is on Sergey’s todo list, though: http://wiki.blender.org/index.php/User:Nazg-gul/TODO

@doublebishop… it sounds like he didn’t even try to enter valid focal length and sensor width values for his camera… that has to be fixed first. As for the zooming, yes, at present that is the way to do it, but he will still need valid focal lengths to accomplish that.

According to the specs:

Focal Length (35mm equivalent):
Photo: 37mm (16:9), 32mm (4:3)
Video: 37mm (16:9)

This means that you should start by setting sw = 36 and fl = 37mm… but I think that Canon uses diagonal 35mm equivalents, so if you refine for focal length, don’t be surprised if it turns out somewhere around 40mm to 42mm, because Blender uses horizontal 35mm equivalents, which can be significantly larger…
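To get a feel for how those two numbers interact: what the solver is really pinning down is the angle of view, which follows from sensor width and focal length via the standard pinhole relation. A small sketch using the starting values above (the 41mm line is just an illustration of what a refined value would imply):

```python
import math

def horizontal_fov_degrees(focal_length_mm, sensor_width_mm):
    """Horizontal angle of view of a pinhole camera, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Starting values suggested above: sensor width 36mm, focal length 37mm.
print(horizontal_fov_degrees(37.0, 36.0))  # roughly 52 degrees
# If refinement lands at, say, 41mm, the implied field of view narrows:
print(horizontal_fov_degrees(41.0, 36.0))  # roughly 47 degrees
```

This is also why entering a diagonal equivalent as if it were a horizontal one throws the solve off: the same millimetre number implies a different angle of view depending on which dimension it was measured against.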

gpaprmh, have you written a detailed overview of Blender’s sensor width versus other conventions somewhere? I may have read it and forgotten, sorry.

No, I haven’t; you have probably just seen snippets here and there… I guess I should start a thread to try to explain the whole convoluted mess…