Curious: how is this done in Blender?

I just saw a video on YouTube and I'd like to know how it was done.
The author says he made it in Blender.

The description says: “Made with Blender 2.56 and PFHoe Pro matchmoving software.”

Simply put, the footage is tracked and solved in PFHoe. The camera information (translation, rotation, etc.) is then exported to Blender, where the creature is animated and composited on top of the filmed material.

Exactly. And you can do the same with the free software Voodoo from Digilab, which exports a Python script that applies the solved real-world camera motion to a Blender camera's animation channels.
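
To make that concrete, the exported script boils down to keyframing a camera from the solved per-frame data. Below is a rough sketch of that idea for a current Blender Python API (2.8+; the 2.5x-era scripts linked objects via scene.objects.link instead). The tracked_frames values are made up for illustration; a real export would also carry lens data such as focal length.

```python
import bpy
from math import radians

# Hypothetical solved camera data: (frame, location XYZ, rotation XYZ in degrees).
# In a real workflow these values come from the matchmove export (Voodoo, PFHoe, ...).
tracked_frames = [
    (1, (0.00, -5.00, 1.60), (88.0, 0.0, 0.0)),
    (2, (0.02, -4.98, 1.61), (88.1, 0.0, 0.3)),
    (3, (0.05, -4.95, 1.62), (88.3, 0.1, 0.6)),
]

# Create a camera object and link it into the scene
cam_data = bpy.data.cameras.new("TrackedCam")
cam_obj = bpy.data.objects.new("TrackedCam", cam_data)
bpy.context.scene.collection.objects.link(cam_obj)

# Key location and rotation on every solved frame
for frame, loc, rot_deg in tracked_frames:
    cam_obj.location = loc
    cam_obj.rotation_euler = [radians(a) for a in rot_deg]
    cam_obj.keyframe_insert(data_path="location", frame=frame)
    cam_obj.keyframe_insert(data_path="rotation_euler", frame=frame)

# Use the tracked camera for rendering
bpy.context.scene.camera = cam_obj
```

Once the CG creature is parented into that scene, rendering through the tracked camera makes it move in sync with the original footage.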

Honestly, the guy didn’t even need motion tracking; some of that shaky-cam look could have been replicated in a compositor.
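
For what it's worth, here's a minimal sketch of that alternative in Blender's own compositor (not what the author did, just the fake-shake idea): translate the footage by small random offsets keyed on every frame. Node names and sockets are from the current Python API; the amount of shake is arbitrary.

```python
import bpy
import random

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

# Minimal node graph: footage -> translate -> composite output
image_node = tree.nodes.new("CompositorNodeImage")      # assign the footage to this node
translate = tree.nodes.new("CompositorNodeTranslate")
composite = tree.nodes.new("CompositorNodeComposite")

tree.links.new(image_node.outputs["Image"], translate.inputs["Image"])
tree.links.new(translate.outputs["Image"], composite.inputs["Image"])

# Keyframe small random X/Y offsets on every frame to imitate handheld shake
random.seed(1)
for frame in range(scene.frame_start, scene.frame_end + 1):
    translate.inputs["X"].default_value = random.uniform(-4.0, 4.0)
    translate.inputs["Y"].default_value = random.uniform(-4.0, 4.0)
    translate.inputs["X"].keyframe_insert("default_value", frame=frame)
    translate.inputs["Y"].keyframe_insert("default_value", frame=frame)
```

Of course, that only fakes the camera wobble; if a 3D object has to sit convincingly in the shot, as in the video, you still need a real camera solve.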