I need to iron out slight variations of camera position in a sequence of time-lapse images (the camera movement is caused by daily expansion and contraction of the timber building that the DSLR camera is bolted onto). I’ve successfully tracked a pattern in the images, and the tracking marker is solidly locked on to the pattern for the whole sequence. The movement ranges between +5 and -5 pixels in both X and Y; image size is 3k x 2k.
When I run the input images through a 2D Stabilizer node (location only, Location Influence set to 1.0, bicubic interpolation), the stabilizing algorithm seems to introduce some movement of its own, perhaps an attempt to smooth out the camera motion? In any event, if I render the output image sequence, play it back, and zoom in on the tracked pattern, it is much less stable than the marker is on the original images. I had assumed that a Location Influence of 1.0 would offset each image by exactly the number (or fraction) of pixels given by the marker displacement, but apparently not.
Is there a way I can use the compositor to apply the exact offset (i.e. the inverse of the tracking marker's movement) to each frame?
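For reference, here is the math I'm expecting the compositor to apply. This is only a sketch, assuming the marker positions can be read per frame as normalized (0..1) coordinates (as Blender's tracking markers store them) and then fed to a Translate node as pixel offsets; the function name and structure are my own illustration, not an existing API:

```python
# Sketch: compute the per-frame inverse offset from tracked marker positions.
# marker_coords holds one (x, y) normalized position per frame; the first
# frame (or any chosen ref_index) is the reference the others are pinned to.

def inverse_offsets(marker_coords, width, height, ref_index=0):
    """Return (dx, dy) pixel offsets that cancel the marker's drift.

    marker_coords: list of (x, y) normalized marker positions, one per frame.
    width, height: image dimensions in pixels (e.g. 3000 x 2000).
    """
    ref_x, ref_y = marker_coords[ref_index]
    offsets = []
    for x, y in marker_coords:
        # Inverse of the marker displacement, converted to pixels:
        dx = (ref_x - x) * width
        dy = (ref_y - y) * height
        offsets.append((dx, dy))
    return offsets
```

If the stabilizer node can't be made to do this exactly, these offsets could presumably be keyed onto a Translate node's X and Y inputs frame by frame (e.g. via a small script), which would give the literal inverse of the tracked movement with no smoothing.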