Tracking and stabilising. Mutually exclusive?

I’m having a problem with some footage for a music video. It was shot in less than ideal conditions and there’s no chance of a reshoot.

The footage needs both stabilising and tracking. Should I stabilise first then track, or should I track first then stabilise?

Basically, if I try to track stabilised footage, does the tracker take the stabilisation into account, or does it still track the original, unstabilised points?

Can anybody suggest a workflow that I can adopt to make the best of this? As I say, reshooting it or binning it aren’t options.


You must track first if you expect to reconstruct the scene in 3D. Blender needs to work out where the camera is in space, and it does this backwards from how the tracked features in the shot move relative to the camera. The solve treats the footage as a faithful record of the camera's real motion.
If you stabilise the shot first, the centre of the image drifts around the frame, which to the solver looks exactly like the camera drifting around in space. That 2D repositioning was never part of the real camera motion, so the solver's assumption about the camera's location no longer holds and the solve will be wrong.

This also means that tracking footage a phone camera has already stabilised in-body will often fail, for the same reason.

What 3pointEdit said: track first, then stabilize. Once you have the camera motion from the track, you can use it for stabilization, for example by smoothing the rotations and "re-rendering" the image through the smoothed camera, using either an image plane attached to the original camera or the reconstructed 3D scene.
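To make the "smoothing the rotations" step concrete, here is a minimal sketch in plain Python (no Blender API, so it runs anywhere). It applies a windowed moving average to one per-frame rotation channel of a solved camera; the `tilt` values are made-up illustration data, and a real setup would smooth each rotation channel of the solved camera's F-curves before re-rendering through it:

```python
def smooth(channel, radius=2):
    """Moving-average smooth a list of per-frame values.

    Each frame becomes the mean of the frames within `radius`
    of it; the window is clipped at the ends of the clip.
    """
    smoothed = []
    for i in range(len(channel)):
        lo = max(0, i - radius)
        hi = min(len(channel), i + radius + 1)
        window = channel[lo:hi]
        smoothed.append(sum(window) / len(window))
    return smoothed

# A jittery camera tilt in degrees (hypothetical solved values):
tilt = [0.0, 1.8, -0.4, 2.1, 0.3, 2.5, 0.9, 3.0, 1.4, 3.2]
smoothed_tilt = smooth(tilt)
print(smoothed_tilt)
```

The difference between the original and smoothed channels is the jitter you are removing; rendering through the smoothed camera keeps the intentional move but drops the shake.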

Thanks guys, that makes sense. It has just occurred to me that I could then apply a 2D stabilisation node to the render layers and the masks in the compositor too. Sometimes I can't see the wood for the trees!

I think I can regard this thread as solved now. Thanks again.