Animate Rig Using Azure Kinect Motion Capture Data

I’m looking for someone to create and animate a rig using joint position and orientation data captured with the new Azure Kinect. The joint hierarchy is detailed here: https://docs.microsoft.com/en-us/azure/kinect-dk/body-joints
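
For reference, the parent/child structure on that page looks roughly like this (a partial transcription from my reading of the docs; the full joint list is on the linked page):

```python
# Partial sketch of the Azure Kinect body joint hierarchy (child -> parent),
# transcribed from the Microsoft docs linked above; see the page for the full list.
JOINT_PARENTS = {
    "PELVIS": None,            # root joint
    "SPINE_NAVEL": "PELVIS",
    "SPINE_CHEST": "SPINE_NAVEL",
    "NECK": "SPINE_CHEST",
    "HEAD": "NECK",
    "CLAVICLE_LEFT": "SPINE_CHEST",
    "SHOULDER_LEFT": "CLAVICLE_LEFT",
    "ELBOW_LEFT": "SHOULDER_LEFT",
    "WRIST_LEFT": "ELBOW_LEFT",
    "HIP_LEFT": "PELVIS",
    "KNEE_LEFT": "HIP_LEFT",
    "ANKLE_LEFT": "KNEE_LEFT",
    "FOOT_LEFT": "ANKLE_LEFT",
    # ... the right side mirrors the left; hand and face joints omitted here
}
```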

Unfortunately, the Azure Kinect does not output its data in a standard MoCap file format; it is essentially just a per-frame list of joint/bone positions and orientations, as described in the link above. I have looked a bit into Python scripting, though, so I can work with you on a way to get the data into Blender via Python (a rough sketch of what I have in mind is below).
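
To give an idea of the direction I’m thinking of, here is a rough, untested sketch of how the captured data could be keyframed onto an armature from inside Blender. The CSV layout, file path, object name and bone mapping are all placeholders on my side, and the orientations would still need proper retargeting from the Kinect’s camera frame into each bone’s local space:

```python
# Minimal sketch (untested): keyframe a Blender armature from Azure Kinect
# joint data exported as CSV. File path, column layout and the bone mapping
# below are assumptions, not something the Kinect SDK produces by itself.
import csv
import bpy
from mathutils import Quaternion, Vector

# Hypothetical mapping from Azure Kinect joint names to bones of the target rig.
kinect_to_bone = {
    "PELVIS": "hips",
    "SPINE_CHEST": "chest",
    "NECK": "neck",
    "SHOULDER_LEFT": "upper_arm.L",
    "ELBOW_LEFT": "forearm.L",
    # ... extend with the remaining joints from the Microsoft docs
}

arm = bpy.data.objects["Armature"]  # assumed name of the rig object

# Assumed export format: frame,joint,px,py,pz,qw,qx,qy,qz (one row per joint per frame).
with open("/tmp/kinect_body.csv") as f:
    for row in csv.DictReader(f):
        frame = int(row["frame"])
        bone_name = kinect_to_bone.get(row["joint"])
        if bone_name is None:
            continue
        pbone = arm.pose.bones[bone_name]
        pbone.rotation_mode = 'QUATERNION'
        # NOTE: the Kinect gives orientations in the depth-camera frame; a real
        # retarget has to convert them into each bone's local/rest space first.
        pbone.rotation_quaternion = Quaternion((
            float(row["qw"]), float(row["qx"]), float(row["qy"]), float(row["qz"])))
        pbone.keyframe_insert(data_path="rotation_quaternion", frame=frame)
        if row["joint"] == "PELVIS":
            # Drive the root translation from the pelvis position (Kinect reports millimetres).
            arm.location = Vector((float(row["px"]), float(row["py"]), float(row["pz"]))) / 1000.0
            arm.keyframe_insert(data_path="location", frame=frame)
```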

We could start with a demo sequence, but I may subsequently be interested in more elaborate projects, as well as tutoring so I can learn to do it myself.

Best,
Marcelo.
