Kinect plugin used for puppeteering in Blender?

I’ve now seen that live point cloud data can be shown in a Blender window (here). So it should be a short jump to setting it up to control an armature in real time, or for recording and refining. I’m sure this is going to happen sooner or later anyway.
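To give an idea of what “controlling an armature” from tracked data involves, here is a minimal sketch of the retargeting step: given two tracked joint positions (e.g. shoulder and elbow), work out which way the matching bone should point. The function names and the two-joint input are my own illustration, not anything from the linked demo — inside Blender you’d feed the resulting angles into a pose bone’s rotation channels via the Python API.

```python
import math

def bone_direction(head, tail):
    """Unit vector pointing from a bone's head joint to its tail joint."""
    dx, dy, dz = (t - h for h, t in zip(head, tail))
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / length, dy / length, dz / length)

def direction_to_euler(direction):
    """Convert a unit direction into yaw/pitch angles (radians) that could
    drive a bone's rotation channels."""
    x, y, z = direction
    yaw = math.atan2(x, y)                      # rotation about the vertical axis
    pitch = math.asin(max(-1.0, min(1.0, z)))   # elevation from the horizontal plane
    return yaw, pitch

# Example: shoulder at the origin, elbow straight out along +Y
shoulder = (0.0, 0.0, 0.0)
elbow = (0.0, 0.5, 0.0)
print(direction_to_euler(bone_direction(shoulder, elbow)))  # (0.0, 0.0)
```

Per frame you’d repeat this for every joint pair the tracker reports, which is the “real time” part of the problem.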

I don’t know a great deal about it, but I think it might be more than a ‘short jump’ unfortunately - unless M$ decide to open source their algorithms for converting a point cloud into skeletal data. According to some documentaries I’ve seen, the process is some advanced computer vision stuff that compares huge numbers of reference poses against the point data to try to determine the user’s ‘pose’ each frame, and generates a skeleton from that comparison.

It would be very nice if they decided to open source it, but without that there is a lot of very complex work to be done to get from a point cloud to an armature. I don’t know if anyone has tried combining OpenCV with a Kinect yet, but that would probably be an interesting starting point.
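As a concrete starting point along those lines: the first step in most depth-based tracking pipelines is segmenting the user from the background by depth thresholding. A sketch with a synthetic depth frame — NumPy stands in here for a real Kinect grab (OpenCV can capture Kinect depth frames through its OpenNI backend), and the depth values are invented:

```python
import numpy as np

# Synthetic 8x8 depth frame in millimetres: a background wall at ~3000 mm
# and a "user" blob at ~1200 mm. A real pipeline would grab this frame
# from the Kinect instead of building it by hand.
depth = np.full((8, 8), 3000, dtype=np.uint16)
depth[2:6, 3:6] = 1200

def segment_user(depth_frame, near=500, far=2000):
    """Boolean mask of pixels within the expected user depth range."""
    return (depth_frame > near) & (depth_frame < far)

mask = segment_user(depth)
print(mask.sum())  # 12 foreground pixels (a 4x3 blob)
```

From a mask like this you could start fitting limbs or feeding blobs into OpenCV’s contour tools, which is roughly where the “interesting starting point” above would begin.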

EDIT:
Actually I’m wrong - it seems it’s not too far off after all:

http://kinecthacks.net/page/2/

This is pretty amazing. I hadn’t thought about this as a kind of motion tracking.
Pretty crazy and awesome if this could work.

Check this out:
http://www.ipisoft.com/gallery.php

I created a project for Blender that uses OpenNI for motion capture. I still need to add some more features and fix some bugs before it is complete. You can get the project here: https://github.com/nonameentername/brokap