python to animate bones from MOCO data

Hello nice people of Blender,
I am new to Blender and Python, but progress is going fine.
I need to make my own moco software.
I can now track and obtain marker positions in the form
xyzt, xyzt, xyzt… where xyz is the Cartesian position and t is the frame time.
I need to transfer this movement to armatures.

how do i do the following pseudocode?

while not end of position_array
bone1.head=x,y,z
moveframe to t

Basically I want to control the coordinates of the armature bones, move one frame, loop… and so on.
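For reference, that pseudocode maps onto Blender Python roughly like this - a sketch only, where the armature name "Armature" and bone name "bone1" are placeholders, and t is assumed to already be a frame number. Note that you can't animate `bone.head` directly; you set the pose bone's `location` and insert a keyframe per sample:

```python
# Sketch only: "Armature" and "bone1" are assumed names - adapt to your scene.

def parse_xyzt(flat):
    """Group a flat [x, y, z, t, x, y, z, t, ...] list into (x, y, z, t) tuples."""
    return [tuple(flat[i:i + 4]) for i in range(0, len(flat), 4)]

def apply_to_bone(samples, armature_name="Armature", bone_name="bone1"):
    import bpy  # only available when run inside Blender

    arm = bpy.data.objects[armature_name]
    pbone = arm.pose.bones[bone_name]
    for x, y, z, t in samples:
        frame = int(round(t))  # assumes t is already a frame number
        bpy.context.scene.frame_set(frame)
        # Pose-bone location is relative to the bone's rest position,
        # so raw marker coordinates usually need converting first.
        pbone.location = (x, y, z)
        pbone.keyframe_insert(data_path="location", frame=frame)
```

Once the keyframes are inserted, Blender builds the motion curves for you, so the per-frame loop and the "motion curves" approach end up being the same thing.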

Am I using the proper technique here?
Should I be learning how to make motion curves from moco data instead of trying to move the bones programmatically for each frame?

Help and tutorials are most welcome.

My details:

Platform for image capture: MATLAB image capture toolkit
Image tracking: MATLAB image processing toolkit, with custom marker tracking made by myself
Platform for mechanical input puppet: USB1 quad decoder from US Digital, on Visual Basic.
(The USB1 can support 4-axis encoders, 4 decoder boxes per PC, i.e. 16 axes sampling at 30 kHz max per PC on Windows.)

Oops -
I meant MOCAP data, not MOCO data.

If you're just starting out, I'd recommend you import the data to empty objects, get that working, then apply it to bones.

Bones are notoriously tricky to work with since bone data is parent-relative, and you need to take into account the transform space of the parents too - so it's best to do this once you can see the data in Blender and have a basic importer to work from.
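A rough sketch of that empties-first importer - one empty per marker, keyframed in world space, so there's no parent-space headache yet. The marker name, and the assumption that t is a time in seconds at a 30 fps scene rate, are both placeholders to adjust for your data:

```python
# Rough sketch: one empty per marker, keyframed from (x, y, z, t) samples.
# The marker name and the t-to-frame mapping are assumptions.

def frame_of(t, fps=30):
    """Map a time in seconds to a 1-based frame number (assumed 30 fps)."""
    return int(round(t * fps)) + 1

def import_marker(samples, name="Marker.001"):
    import bpy  # only available when run inside Blender

    empty = bpy.data.objects.new(name, None)  # object data None = empty
    bpy.context.scene.collection.objects.link(empty)
    for x, y, z, t in samples:
        empty.location = (x, y, z)
        empty.keyframe_insert(data_path="location", frame=frame_of(t))
    return empty
```

If the empties play back correctly, you know the data and the time mapping are right before any bone math gets involved.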

Right,
so I'll analyse what you said and will reply ASAP.
Thanks

I am interested in your results - please keep posting. :slight_smile:

My MATLAB-2-Blender mocap bridge is in the making, and for the moment I don't have enough info to share.
I am a bit lost given the lack of simple sample Python code for Blender.

In the meantime though, here are my practical discoveries:

A puppet input device retails at around $12,000.

But a quadrature decoder card or module is around $300. Each encoder, with 2500 increments per turn, costs $60, and a decoder module can take something like 4 encoders.
Each encoder can be used on one rotary axis.
Therefore a garage-made 4-axis input puppet should be around $600 ($300 + 4 × $60 = $540 in parts).
(Excluding the cost of the PC and the frame for the monkey.)

Hello ideas man,
so you mean I can use the coordinates of dummy objects (with scripted motion from Python) to deform a mesh, just like bones?
As in making them parents of the target mesh and weight painting?
And then I script the movement of the parents from my own algorithm?
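Roughly, yes - though plain parenting only gives rigid motion. To make empties deform a mesh through weight-painted vertex groups the way bones do, one option is Hook modifiers, one per empty. A sketch, where the mesh name "Puppet" and the convention that each vertex group shares its empty's name are both made up for illustration:

```python
# Sketch: attach one Hook modifier per empty, each driving the vertex
# group of the same name. Object and group names are placeholders.

def hook_empties(mesh_name, empty_names):
    import bpy  # only available when run inside Blender

    mesh_obj = bpy.data.objects[mesh_name]
    for name in empty_names:
        mod = mesh_obj.modifiers.new(name="Hook_" + name, type='HOOK')
        mod.object = bpy.data.objects[name]  # the scripted empty
        mod.vertex_group = name              # weight-painted group it deforms
        # In practice you'd also set the hook's rest matrix (mod.matrix_inverse)
        # so the mesh doesn't jump when the modifier is first applied.
```

Then scripting the empties' motion (as in your algorithm) moves the weighted vertices with them.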