Cool Siggraph Image Stabilisation technique

http://vimeo.com/4889710

http://pages.cs.wisc.edu/~fliu/project/3dstab.htm

This is an amazing approach to stabilisation, and one that could one day be integrated into Blender. As I see it, all the prerequisite tools are already there (except for tracking data, of course). The mesh warping looks hard to implement, but the results are stunning.

I wonder what you would need to do to achieve this now?

That was amazing. I’m doubtful it will feature in Blender, though, as Blender hasn’t tackled tracking yet.

When libmv is ready enough for Blender, we’ll be able to do tracking.
And perhaps the libmv developers will implement this paper.

Oh yeah, I forgot about libmv… Re-watching the demo video, my guess is that they reconstruct a 3D point cloud of the scene, map the video onto a subdivided plane, and then stretch that mesh so the track points in the video line up with the corresponding points in 3D space.
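The "stretch a subdivided plane so track points land where they should" step could be sketched as a simple least-squares mesh warp: each track point is written as a bilinear blend of its grid cell's four corners, and the vertices are solved for so the blended points hit their targets while a rigidity term keeps the rest of the grid in place. This is only a rough illustration of the general idea, not the paper's actual content-preserving warp; the grid layout, the point data, and the `rigidity` weight here are all made up for the example.

```python
import numpy as np

def warp_grid(verts, cells, cell_ids, uvs, targets, rigidity=1.0):
    """Warp a 2D grid mesh so tracked points move to target positions.

    verts:    (V, 2) grid vertex positions
    cells:    (C, 4) vertex indices per cell, ordered bl, br, tr, tl
    cell_ids: cell index containing each track point
    uvs:      bilinear (u, v) coordinates of each point within its cell
    targets:  (P, 2) desired screen positions for the points
    rigidity: weight pulling each vertex back to its original position
    """
    V = len(verts)
    rows, rhs = [], []
    # Data term: the bilinear blend of a cell's corners must hit the target.
    for cid, (u, v), t in zip(cell_ids, uvs, targets):
        w = np.zeros(V)
        bl, br, tr, tl = cells[cid]
        w[bl] = (1 - u) * (1 - v)
        w[br] = u * (1 - v)
        w[tr] = u * v
        w[tl] = (1 - u) * v
        rows.append(w)
        rhs.append(t)
    # Regularisation term: every vertex prefers to stay where it was.
    for i in range(V):
        w = np.zeros(V)
        w[i] = rigidity
        rows.append(w)
        rhs.append(rigidity * verts[i])
    A = np.array(rows)
    b = np.array(rhs)
    new_verts, *_ = np.linalg.lstsq(A, b, rcond=None)
    return new_verts

# Tiny demo: a 3x3-vertex grid (2x2 cells) with one track point at the
# centre of the bottom-left cell, asked to move from (0.5, 0.5) to (0.7, 0.6).
verts = np.array([[x, y] for y in range(3) for x in range(3)], float)
cells = np.array([[0, 1, 4, 3], [1, 2, 5, 4], [3, 4, 7, 6], [4, 5, 8, 7]])
new_verts = warp_grid(verts, cells, cell_ids=[0], uvs=[(0.5, 0.5)],
                      targets=np.array([[0.7, 0.6]]), rigidity=0.1)
```

With a low rigidity the data term dominates, so the warped point lands close to its target while vertices in untouched cells barely move; the real method adds similarity-preserving terms per cell so the warp stays visually rigid.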

Does that mean they reconstruct the path taken by the camera (from the video) to emulate its movement in 3D space? I think that’s right?