I’m relatively new to Blender, and am looking to use it to visualise some 3D motion data I have as animation.
Anywho. I’m trying to get a Python script to import a set of stick lines (start/end point pairs in world space) representing a rough human pose shape (legs, arms, torso, and head) and convert this data into an Armature animation.
I’m currently having trouble getting my head around how the pose bones of an object represent changes for each animation frame.
So far as I understand from my Googling around and reading tutorials, the procedure I need is something along the lines of:
- create armature hierarchy (this part I can do)
- create a keyframe for location and rotation at each frame (no prob here either)
- then set the loc and quat properties of the pose bones appropriately

Now, step 3 is where I’m a bit confused. I’ve got all these points in world space, so I guess my first question is: what coordinate frame do the loc/quat properties live in?
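Whatever frame those properties turn out to live in, I expect the rotation I need per frame is the one taking a bone’s rest direction onto the captured line direction. This is the pure-Python helper I’m planning to use for that (my own function, not a Blender API; quaternion in Blender’s (w, x, y, z) order, and it doesn’t handle the exactly-opposite-vectors case):

```python
import math

def rotation_between(a, b):
    """Quaternion (w, x, y, z) rotating unit vector a onto unit vector b.

    Assumes a and b are normalised and not exactly opposite.
    """
    ax, ay, az = a
    bx, by, bz = b
    # rotation axis is the cross product a x b ...
    cx = ay * bz - az * by
    cy = az * bx - ax * bz
    cz = ax * by - ay * bx
    # ... and the half-angle is encoded by 1 + dot(a, b)
    w = 1.0 + (ax * bx + ay * by + az * bz)
    n = math.sqrt(w * w + cx * cx + cy * cy + cz * cz)
    return (w / n, cx / n, cy / n, cz / n)
```

I believe mathutils’ `Vector.rotation_difference` does the same job inside Blender, but having the maths spelled out helps me see what’s going on.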
Also, as I change the frame number to build up the animation, are the successive values of loc/quat supposed to be deltas between each frame, or an offset relative to the armature’s initial pose?
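To make that concrete, here’s roughly what I’m attempting for steps 2 and 3 (the object name, bone name, and frame_data structure are just placeholders for my real data, and I’m unsure whether the values I feed in should be world-space, parent-relative, or deltas):

```python
import bpy

arm_obj = bpy.data.objects["StickArmature"]   # my armature object (placeholder name)
pb = arm_obj.pose.bones["UpperArm.L"]         # placeholder bone name
pb.rotation_mode = 'QUATERNION'

# frame_data: my per-frame (location, quaternion) pairs -- placeholder structure
for frame, (loc, quat) in enumerate(frame_data, start=1):
    pb.location = loc                          # what space is this in?
    pb.rotation_quaternion = quat              # (w, x, y, z)
    pb.keyframe_insert(data_path="location", frame=frame)
    pb.keyframe_insert(data_path="rotation_quaternion", frame=frame)
```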
Generating the armature hierarchy is actually really easy, since I can just set the editbone.head and editbone.tail properties to the start/end points of each world-space line. However, there doesn’t appear to be a way to do this per frame, as far as I can see; I could well have missed something, though.
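For reference, this is roughly what my armature-building code looks like (the segment names and coordinates are placeholder values standing in for my real line data, and I’ve left out bone parenting for brevity):

```python
import bpy

# (name, head, tail) triples from my world-space stick lines -- placeholder values
segments = [
    ("Spine", (0.0, 0.0, 1.0), (0.0, 0.0, 1.5)),
    ("Head",  (0.0, 0.0, 1.5), (0.0, 0.0, 1.8)),
]

arm_data = bpy.data.armatures.new("StickArmature")
arm_obj = bpy.data.objects.new("StickArmature", arm_data)
bpy.context.collection.objects.link(arm_obj)
bpy.context.view_layer.objects.active = arm_obj

# edit bones only exist in Edit Mode; this sets the rest pose only
bpy.ops.object.mode_set(mode='EDIT')
for name, head, tail in segments:
    eb = arm_data.edit_bones.new(name)
    eb.head = head
    eb.tail = tail
bpy.ops.object.mode_set(mode='OBJECT')
```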
Cheers for any help,