How to attach to a model - using the BVH2ARM Python script for quicker mocap?

The issue is the bone names and rotations in your armature.

BVH motion capture defines a rotation for every bone in its armature on every frame. If you have an armature that completely matches the default armature created by the BVH import script, fine: you are good to do whatever you want with it. (There is, in fact, a pretty good tutorial on using the NLA editor to patch together BVH motion files here: )

But this makes assumptions. If you download BVH files from different sources, you will discover that the default armature can change: some have a hip bone, others don't; some parent everything from the hip, others have a "root", etc. You will not be able to use these actions as they come in.

It makes sense: if an armature has a Bicep.L bone and it rotates 90 degrees between frames, how can Blender resolve that if your armature has an UpperArm_L bone instead?

And what if you created your upperArm bone to face out along the X axis, but the default armature thinks the Bicep bone faces out along the Y axis? How can an action that says "rotate 45 degrees on the Y axis" be resolved?
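To make the name-mismatch half of this concrete, here is a minimal sketch of remapping BVH channel names to your own rig's bone names. The bone names, the `"bone/axis"` channel format, and the mapping table are all hypothetical examples, not the actual output of any importer; and note this handles only names, not the harder axis-orientation problem described above.

```python
# Hypothetical mapping from a BVH file's bone names to your rig's names.
# Real BVH files use wildly different naming schemes, so you would
# build a table like this per source.
BONE_MAP = {
    "Bicep.L": "UpperArm_L",
    "Bicep.R": "UpperArm_R",
    "hip": "Root",
}

def remap_channel_names(channels, bone_map):
    """Rename the bone part of channel names like 'Bicep.L/rotY'.

    Unknown bones pass through unchanged, so a partial map still works.
    """
    remapped = []
    for ch in channels:
        bone, _, axis = ch.partition("/")
        remapped.append(bone_map.get(bone, bone) + "/" + axis)
    return remapped

print(remap_channel_names(["Bicep.L/rotY", "hip/locX"], BONE_MAP))
# -> ['UpperArm_L/rotY', 'Root/locX']
```

Name remapping like this gets the channels pointed at the right bones, but if the rest poses differ in orientation you still need actual retargeting math on top of it.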

That is the kind of thing Atom was referring to in his first post, and why BVH in Blender is kind of broken today.

Unless you want to do a simple “Plug and go” sort of test, BVH will not be a solution for you. You will find that the armature that comes in from the BVH import is VERY primitive. You will have to add bones, create IK constraints…all sorts of things if you want to animate it on your own. Once you do those sorts of things, you won’t be able to just bring in BVH actions and use them.

But, I guess it depends on what your eventual goal is. Maybe you will be fine just using existing BVH files and armatures.

Hi there MarkJ,
I'm seeing how complicated BVH seems to be. I asked the guy about his NLA sequencing; he said he'd put up a tutorial soon about how he matches together each newly imported mocap and makes each new armature follow the original one. It did sound rather simple, the way he does the "stitching".

Well, check out the YouTube link in my previous post. That is what it is about: stitching together BVH actions. But, again, it only works if your armature is exactly the same as the BVH's armature.

Amnity and MarkJoel60, are you still interested in stitching mocap moves? I completely missed this thread and only bumped into it just now… sorry for reviving this topic if there's no need.

Well, I am somewhat disillusioned with Mocap, personally.

But the subject comes up again and again, and I know you have done some really cool things with BVH… so, if you want to jump in with some hints and tips, I’m sure everyone would love to hear it…

The point is that mocap support is actually not that bad in Blender, thanks to PapaSmurf's awesome "bake constraints" script. If you abuse it in a smart way, you can do the stitching quite efficiently. And it would be even more comfortable if he could continue developing it, so that:
a) it's possible to bake a number of identical armatures (duplicated, but with different actions attached) into one single armature (assuming the action keys don't "overlap");
b) the resulting armature has IPO keys only at frames where the source armatures have keys too, so that there are "gaps" wherever none of the source armatures is driven by mocap data.
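The wished-for behaviour in (a) and (b) can be sketched in plain data terms. This is a conceptual toy, not the actual script: an action is modelled as just a `{frame: pose}` dict, and the merge keeps keys only where some source had one, refusing to merge if keys overlap.

```python
def merge_actions(actions):
    """Merge several actions into one, wish-list style.

    Each action is a hypothetical {frame_number: pose} dict. The result
    has keys only at frames where some source action had a key (the
    'gaps' from point b), and overlapping keys are an error (the
    no-overlap assumption from point a).
    """
    merged = {}
    for action in actions:
        for frame, pose in action.items():
            if frame in merged:
                raise ValueError("overlapping keys at frame %d" % frame)
            merged[frame] = pose
    return merged

walk = {0: "step-pose", 10: "mid-pose"}
jump = {20: "crouch-pose", 30: "air-pose"}
print(sorted(merge_actions([walk, jump])))  # -> [0, 10, 20, 30]
```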

Could someone do the honour and convince him to implement that? :wink:
And how can I make a video tutorial in Blender using Ubuntu? Is there any nice tool for that?

cheers, boogi

Yes, I'm definitely still interested in mocap, it's my favourite topic at the moment; any hints, clues, tricks and tips welcome! I'm not quite sure about Ubuntu, maybe something might be helpful, but I used Camtasia Recorder.
Thanks Boogie!

Bahh… I pretty much failed. I finally got some screen recording at a proper frame rate working on my box (it took 3 days; OpenGL is not a friend of screencasting…). Now I see the 10-minute time limit on YouTube is a little challenging for this kind of stuff. I went through the workflow, connecting 3 moves, in around 25 minutes :frowning: I could shrink it by using the Sequencer or some other video editor, but the screen-recorded video seems to be messed up in the preview…
This is kind of frustrating.

Thanks for trying to make it anyway! Sorry I can't help you with that, as I'm still a beginner with Blender. Is it possible to film your screen recording with a digital camera? Don't worry about it, it's not urgent at present, as I'm concentrating on other animations.

If you want, PM me. I can create a spot on my Dropbox account and you can drop the full video there. Then, if you want, I can host it on my site. No limits there :slight_smile:

I planned to cut away all the import/baking time and record all the talking afterwards. The best I made was a 15:45 video connecting 3 moves. I could cut away at most about 1:30, so it wouldn't be anywhere near 10 minutes anyway, so I just gave up.
However, since Dropbox sounded like a good idea, MarkJoel: I synced it with Ubuntu One, so if you are interested in the raw video (all you hear is some out-of-sync keystrokes) you can get it at

cheers, boogi

OK… so I downloaded it, but it didn't have an extension. I gave it an AVI extension and VLC seems to play it OK… but I have no audio. Well, I have audio but no narration… is there an audio track for it?

Nope. I intended to edit it and record some talking, but I won't go for this any more. :frowning:
The extension is actually ".mkv", but if AVI worked for you, that's alright :wink:

If you want to just record a narration audio file (watch your video and talk about what is happening), I can sync the two for you and put the finished video online for everyone else…

There seems to be a lot going on in the video… some narration would be helpful…

True, I don't like it either. I had a couple of misclicks, and the movements don't fit too well at the end. Above all, those 3 moves are straight from the CMU motion database and run at 120 fps, so I had to scale them all down by 0.25 first, which is actually very off-topic for this tutorial and only confuses things. I just don't want to take another bunch of attempts to go through this. My graphics driver setup is also back to normal, so screen recording is down anyway.
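The 0.25 scaling mentioned above is simple arithmetic on keyframe times; a tiny sketch (the frame numbers here are made up, not taken from the CMU files):

```python
def rescale_keyframe_times(frames, factor):
    """Scale keyframe times by a constant factor, e.g. 0.25 to turn
    120 fps capture data into something playable at a 30 fps project
    rate (120 * 0.25 = 30)."""
    return [f * factor for f in frames]

print(rescale_keyframe_times([0, 40, 80, 120], 0.25))
# -> [0.0, 10.0, 20.0, 30.0]
```

In practice you would also have to resample or round the results onto whole frames, which is part of why this step is fiddly.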

Some hints to understand what's happening:
The armatures are all the same, and they hold actions that are similar at certain points (e.g. all of them lift the left/right leg to take the next step), so it should actually be fairly easy to connect them, huh?
And actually it is. Think about the case where 2 imported BVH armatures accidentally cross paths so that their poses match perfectly at one particular frame. In that case, you could go to that frame, select the first armature and replace all the following keyframes with those from the second armature, and it would work out very well.
And this is what I do in the video. To find a matching pose, I shift all the keyframes in the Action Editor back and forth, looking for a pose where both armatures, say, lift their left leg for walking.
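That eyeballing step could, in principle, be automated with a brute-force pose comparison. A minimal sketch, under the simplifying assumption that a pose is just a dict of per-bone Euler rotations and "distance" is the summed absolute angle difference (a real retargeting tool would compare quaternions and weight the bones):

```python
import math

def pose_distance(pose_a, pose_b):
    """Sum of absolute per-bone Euler-angle differences in degrees.
    A pose here is a hypothetical {bone_name: (rx, ry, rz)} dict."""
    total = 0.0
    for bone, rot_a in pose_a.items():
        rot_b = pose_b.get(bone, (0.0, 0.0, 0.0))
        total += sum(abs(a - b) for a, b in zip(rot_a, rot_b))
    return total

def best_stitch_frames(action_a, action_b):
    """Brute-force search over all frame pairs for the closest match,
    i.e. the spot where you would cut and paste the keyframes.
    Returns (frame_in_a, frame_in_b, distance)."""
    best = (None, None, math.inf)
    for fa, pa in enumerate(action_a):
        for fb, pb in enumerate(action_b):
            d = pose_distance(pa, pb)
            if d < best[2]:
                best = (fa, fb, d)
    return best

walk_a = [{"Leg.L": (0.0, 0.0, 0.0)}, {"Leg.L": (45.0, 0.0, 0.0)}]
walk_b = [{"Leg.L": (44.0, 0.0, 0.0)}, {"Leg.L": (90.0, 0.0, 0.0)}]
print(best_stitch_frames(walk_a, walk_b))  # -> (1, 0, 1.0)
```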
Once I've found such a frame, the armatures are still far away from each other and point in different directions. So what to do? Recalculate all the action keys! And as far as I know, the "bake constraints" script is the only thing that does this dirty work for us; all the other approaches I checked try to keep the action keys as they are, and just modify the movement before or after the keys take effect.
Fortunately, you can constrain your armature in a way that lets you easily move and rotate it: just add a new bone and make it the parent of the root bone, via a "Child Of" constraint, not via Ctrl+P.
That's about it. Go into Pose mode, grab the newly added bone and toss your armature around: move it, rotate it, scale it, slap it. Once it fits your needs, run the "bake constraints" script on it and it will do its magic: it gives you a copy of the armature without the constraint, holding a new action that is identical to the former one, but with all action keys relative to the new position. Just what we wanted! Now you can copy and paste the keyframes into one armature.
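What that baking step achieves can be sketched with plain math: every stored location key gets re-expressed with the extra parent bone's transform applied, so the action itself carries the new placement. This is a conceptual 2D toy, not the bake constraints script; the function name and numbers are invented for illustration:

```python
import math

def rebase_location_keys(keys, offset, angle_deg):
    """Re-express 2D location keys after applying an extra parent's
    rotation (angle_deg, counter-clockwise) and translation (offset),
    mimicking what baking a 'Child Of' constraint does to an action."""
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    rebased = []
    for x, y in keys:
        # Rotate first, then translate: the parent's transform.
        rebased.append((x * cos_a - y * sin_a + offset[0],
                        x * sin_a + y * cos_a + offset[1]))
    return rebased

# A key one unit out on X, after a 90-degree turn and a (2, 3) move,
# lands at roughly (2, 4).
print(rebase_location_keys([(1.0, 0.0)], (2.0, 3.0), 90.0))
```

The real script has to do this in 3D for every bone's rotation keys as well, which is exactly the "dirty work" nothing else seemed to handle.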

Hope this got you closer to the point…

Thanks Boogie for the excellent work explaining your stitching method; it's so clear that even a beginner can understand it, and that's really most helpful! I'm going to try it out soon.