Morph images based on tracking data

I am trying to use tracking data from a few tracking points on an image series to morph these images in the compositor, and I am kinda stuck :o. If there were only simple movements, 2D stabilization or a planar track would do the job; however, I'd like to morph the images in more complex ways. For example, I would like to track 10 features throughout my image series and then use this data to morph every picture but the first so that it matches the first one. One application would be to exactly match the three images of an exposure bracketing series so they can be blended.

I did the tracking (on three test images I created in Photoshop, a simple grid with three red dots as features), then added a few “Track Position” nodes and a “Displace” node in the compositor. I think I am looking for a way to convert the offset of a tracking point between two frames (e.g. frame 1 and frame 2) into an image that can be used by the Displace node (which reads the X and Y displacement from the red and green channels of the distortion map). One approach would be to assume that each pixel of this distortion map should contain the distance-weighted sum of all measured offsets. Quite easy to write down, a little bit harder to code, but apparently really hard to implement in nodes.
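In case it helps to see the idea outside the node tree, here is a minimal Python/NumPy sketch of that distance-weighted sum, done as plain inverse-distance weighting. The function name, the `power` parameter and the epsilon are my own choices, not anything Blender provides; the result is a per-pixel (dx, dy) field that could be baked out (e.g. to an EXR) and fed to the Displace node's vector input via its red and green channels:

```python
import numpy as np

def displacement_map(points_src, points_dst, width, height, power=2.0, eps=1e-6):
    """Inverse-distance-weighted displacement map.

    points_src: (N, 2) tracked feature positions in the frame to warp
    points_dst: (N, 2) matching feature positions in the reference frame
    Returns a (height, width, 2) array of per-pixel (dx, dy) offsets,
    i.e. the red/green channels of the distortion map.
    """
    offsets = points_dst - points_src                     # (N, 2) measured offsets
    ys, xs = np.mgrid[0:height, 0:width]
    pix = np.stack([xs, ys], axis=-1).astype(np.float64)  # (H, W, 2) pixel coords

    # distance from every pixel to every tracked feature: (H, W, N)
    dist = np.linalg.norm(pix[:, :, None, :] - points_src[None, None, :, :], axis=-1)
    weights = 1.0 / (dist + eps) ** power                 # closer features count more
    weights /= weights.sum(axis=-1, keepdims=True)        # normalise per pixel

    # per-pixel weighted sum of all feature offsets: (H, W, 2)
    return np.einsum('hwn,nc->hwc', weights, offsets)

# hypothetical usage with three features (pixel coordinates I made up)
src = np.array([[100.0, 80.0], [420.0, 95.0], [260.0, 300.0]])  # frame 2 positions
dst = np.array([[103.0, 78.0], [418.0, 99.0], [262.0, 297.0]])  # frame 1 positions
vectors = displacement_map(src, dst, width=640, height=480)
```

With only three features this collapses to a smooth blend between the three offsets, which is what your grid test should show; with 10 features the code stays the same.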

If anybody has any ideas, I would be really happy if you shared them :eyebrowlift2:. On a side note: if anybody has tips on how to get started developing new nodes for the compositor, that would be great as well.

Before the planar tracking node existed, I used hooks on a mesh driven by track points. Check out my YouTube channel in the sig below and look for “morph”.
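For anyone who wants to try that pre-planar workflow from a script, here is a rough bpy sketch of how I understand it (the clip, object and names are placeholders I made up; the FOLLOW_TRACK constraint and HOOK modifier types are real): one empty per track, each positioned by a Follow Track constraint and hooked to one vertex of a warp mesh.

```python
import bpy

clip = bpy.data.movieclips["my_clip"]     # placeholder: your tracked movie clip
mesh_obj = bpy.data.objects["WarpGrid"]   # placeholder: mesh to deform

for i, track in enumerate(clip.tracking.tracks):
    # one empty per tracked feature, positioned by a Follow Track constraint
    empty = bpy.data.objects.new(f"track_{track.name}", None)
    bpy.context.collection.objects.link(empty)
    con = empty.constraints.new('FOLLOW_TRACK')
    con.clip = clip
    con.track = track.name

    # hook the i-th vertex of the warp mesh to this empty
    hook = mesh_obj.modifiers.new(name=f"hook_{track.name}", type='HOOK')
    hook.object = empty
    hook.vertex_indices_set([i])
```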