Camera Tracker Accuracy

Hi,

I’ve seen examples of work done using Blender’s Camera Tracker that look fine, but when I try it out myself I run into problems. The worst is an occasional single-frame slip, where the track seems to skip a frame or to anticipate the camera movement by a frame.

Because I’ve seen those examples of good tracks, I’m fairly sure it’s something I’m doing wrong; I just don’t know what. For example, I recently tracked some test footage and got a solve error of 0.8 (which is supposed to be good). The original footage was shot at 25fps and I output at the same frame rate. I had also taken measurements at the scene to make sure my scale setting was accurate. When I overlaid the output on the footage in After Effects, however, they didn’t match well at all. Some parts were OK but, as I mentioned, there were points where the track seemed to miss a frame and slide across it, or the Blender camera’s movement would seem to anticipate the camera movement in the footage.

Does this sound like the result of an obvious mistake or missed setting? I’d really appreciate some help on this.

Many thanks.

SeanJ.

Check the tracker's Graph editor for single-frame errors (spikes in the graph) and remove them, or find a better feature to track. Also try generating a proxy/timecode for your source video; Blender doesn't handle heavily temporally compressed footage well. Converting the footage to an image sequence makes it even easier to track.
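For the image-sequence conversion, one common approach (outside Blender) is to extract every frame with ffmpeg. This is just a sketch assuming ffmpeg is installed; `clip.mp4` and the `frames/` directory are placeholder names for your own footage and output folder:

```shell
# Make an output folder for the extracted frames
mkdir -p frames

# Extract every frame as a high-quality JPEG, numbered sequentially.
# -qscale:v 2 keeps compression artifacts low so the tracker has
# clean pixels to lock onto; frame_%04d.jpg gives frame_0001.jpg etc.
ffmpeg -i clip.mp4 -qscale:v 2 frames/frame_%04d.jpg
```

Then load the `frames` folder as an Image Sequence in the Movie Clip Editor. Since every frame is stored whole (no inter-frame compression), the tracker can't be confused by decoded frames landing on the wrong frame number, which is one known cause of the single-frame slips you describe.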

Many thanks for your suggestions, 3pointEdit. I’ll give them a try.