GSoC 2011 - Improving Motion Capture workflow - feedback and updates

On the assumption that the retargeting was all written in Python, I moved the files across and was able to map the bones between the source and target rigs. Changed the BVH axis. Converted the keyframes to beziers (it took a while, but I expect that with a Python-only add-on doing numerical processing). Ran the de-noise functionality. Ran the “loop animation” to cut the walk animation down to a single left/right step. All worked fine up until that point.
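For the curious, the bezier-conversion step can be approximated in vanilla bpy along these lines (my own rough sketch, not the addon’s actual implementation):

import bpy

# Rough approximation of the "convert keyframes to beziers" step:
# switch every keyframe on the active object's action to bezier
# interpolation, so sampled/linear keys become editable beziers.
performer = bpy.context.active_object  # the imported BVH armature
for fcurve in performer.animation_data.action.fcurves:
    for kp in fcurve.keyframe_points:
        kp.interpolation = 'BEZIER'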

Then I ran the retarget, which took some time and then crashed, telling me that the stride_bone did not have an animation property (it was a “NoneType”). The resulting animation was completely toast too: my character was twisted and flopping about like a dying fish on a wire. I expect this is because the animation didn’t finish converting.

This was with the latest version, synced with the source repository this morning.

Regardless of whether I set feet or not, the stride_bone issue crops up. Here’s a screenshot of the exception:
http://img708.imageshack.us/img708/5835/screenshot20110810at950.png

Is this the place to get the dev’s attention, or should I send feedback elsewhere?

Normally here would be fine… why not try the SoC mailing list or IRC?

Didn’t know of the GSoC mailing list (off to find it now). My experience with getting hold of someone on IRC has not been good. Australian developers I can get hold of easily enough, but no matter what time I come online, no-one is around (well, never the one I am after).

I’ll try the mailing list; at least peer pressure kicks in when relevant questions go unanswered there :slight_smile:

Here it is, in case you didn’t find it:
http://lists.blender.org/mailman/listinfo/soc-2011-dev

I just wanted to say THANK YOU!!! to Benjy and ThomasL (I LOVE MAKEHUMAN!!!) for all your hard work. Thank you Benjy for doing this. People still throw crap at mocap, but I think this project just might change that.
I created a MakeHuman girl and tried to have a go at retargeting using the new Pepper method, but I couldn’t get it to work. I know that’s due more to my own ineptitude than anything else. I deleted the MakeHuman armature and simply used the BVH armature as her new skeleton. It worked very well, but it is certainly an extremely crude approach, and doing anything other than the BVH motion (just walking, in this case) would mean redoing all the controls from scratch before attempting facial expressions or talking or, well, anything.
So I’m very glad to see you guys plowing ahead with Pepper.
I’m sure I’ll be one of the people hooked on Pepper when it gets out of drydock…
THANK YOU AGAIN!

Well, I want to say thanks too for the great and wonderful work. I tried to retarget a mocap skeleton onto a custom skeleton, and even onto the same mocap skeleton, and I always get errors:

Traceback (most recent call last):
File "F:\c\edicion\creacion 3d\blender\cvs\GSoC-Pepper_64bit-39256\2.58\scripts\startup\ui_mocap.py", line 378, in execute
retarget.totalRetarget(performer_obj, enduser_obj, scene, s_frame, e_frame)
File "F:\c\edicion\creacion 3d\blender\cvs\GSoC-Pepper_64bit-39256\2.58\scripts\modules\retarget.py", line 442, in totalRetarget
stride_bone = copyTranslation(performer_obj, enduser_obj, feetBones, root, s_frame, e_frame, scene, enduser_obj_mat)
File "F:\c\edicion\creacion 3d\blender\cvs\GSoC-Pepper_64bit-39256\2.58\scripts\modules\retarget.py", line 298, in copyTranslation
stride_bone.animation_data.action.name = ("Stride Bone " + action_name)
AttributeError: 'NoneType' object has no attribute 'action'

location:<unknown location>:-1
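
For what it’s worth, the traceback shows that stride_bone has no animation_data when the script tries to rename its action. A guard along these lines would avoid the crash; this is just a sketch, not the fix that actually went into the branch (stride_bone and action_name are the names from retarget.py’s copyTranslation, per the traceback):

import bpy

# Guard against stride_bone having no animation data yet: create it,
# and create or rename the action as appropriate.
if stride_bone.animation_data is None:
    stride_bone.animation_data_create()
if stride_bone.animation_data.action is None:
    stride_bone.animation_data.action = bpy.data.actions.new("Stride Bone " + action_name)
else:
    stride_bone.animation_data.action.name = "Stride Bone " + action_name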

Sorry for not being here; I see the posts have picked up recently.

AdamEtheredge - I’ll have a look at retargeting to the MakeHuman rig. The changes I made today with the “advanced retargeting” method will probably work if the usual way doesn’t.

wicked208 - That was fixed earlier today. :slight_smile:

BTolputt - replied to your email, sorry again for the delay and thanks for your enthusiasm.

marcolorenz - I was aware of that issue, but thought it was fixed :frowning: I’m on it!

Quacky - this project deals with the data files that motion capture sessions/hardware create, such as BVH files. However, Sergey’s Tomato branch (3D tracking) might be able to create the required data, if there is support for multiple-camera solving, which I’m not sure about. Also, there are free programs that use Microsoft’s Kinect to create BVH files.

Forseti2 - The actual tools accept any Blender armature as “input”. So long as there is an import script in Blender for your format, you can use my tools. You could probably even use (human) keyframed animation! If there isn’t an import script for your format, contact me or Campbell (ideasman_42) who is in charge of import/export scripts, maybe we can write a script. Note that there is another GSoC project, also in Pepper, which is improving Collada import significantly, and that’s becoming a popular format for animation pipelines.

On a general note, I want to thank you and the other users/devs for your feedback; GSoC is really about giving back to the Blender community. I only ask you to be patient for another week or so. Coding ends in 4 days, on August 15th, and the final deadline is on the 22nd. During that week, I’ll be heavily commenting the code, putting up resources in the form of a “user manual” and a more technical overview for people who dabble in Blender Python, and of course, making more video tutorials! (If you can stand my New York accent :wink: )

On a side note, it looks like the final deliverable will be an official Blender add-on, with me as the maintainer of course. I do not know when it will be available in trunk (the “main” Blender version), but I’ll make sure zip files or builds are available.

No problems with the New York accent, mate. Can’t flick the cable on over here without there being at least one channel showing a Law & Order serial :stuck_out_tongue: Maybe you could work in the “dun-dun” noise just for giggles! :smiley:

“On a side note, it looks like the final deliverable will be an official Blender add-on, with me as the maintainer of course. I do not know when it will be available in trunk (the ‘main’ Blender version), but I’ll make sure zip files or builds are available.” -Benjy

I was wondering about that, given the code seems to live entirely in four(?) Python files. An add-on like Rigify would be a good thing in my mind, especially if it becomes an official one (i.e. we can complain when Campbell changes a Python API and breaks it :p)

“I’ll have a look at retargeting to the MakeHuman rig. Changes I made today, with the “advanced retargeting” method will probably work, if the usual way doesn’t.” -Benjy

WOW! Hey, Benjy, thanks for the quick response! This is neat! I feel like I just got a personal note from Alan Shepard or something! :smiley:
Great, okay. Well, thanks for having a look at MakeHuman-into-Pepper retargeting. You get that going well, and you’ll have armies of people downloading Blender for the first time! MakeHuman is a natural for this sort of thing, because of the Sims-like way it lets you design characters quickly, and Pepper retargeting is a great approach to simplifying something normally very complicated.
Let’s see, the main problem I had with the MakeHuman armature was figuring out what to do with all the extra handles when it came to the Pepper retargeting method. All those handles should turn out to be very useful, but to me it was like having 8 different handles on a car door during a hurricane. The original problem I had with retargeting was that, after I had assigned matches between the (hopefully) analogous bones of the BVH and MakeHuman armatures, I would hit Retarget and the BVH would just go walking off, leaving my MakeHuman and her armature standing there.
Like I said, I’m sure all this was due to my own ineptitude. This was the very first time I had ever experimented with Mocap.
But I really wanted to see if I could do it, because there is a lot of character animation in my upcoming projects, and a lot of it is action stuff. Perfect for mocap.
But, like I said, I ended up just pulling out the Makehuman armature and sticking the BVH in her, and got this… (WARNING: NAKED CARTOON CHICK ON THE PROWL…)


Yes, I am a total mocap nOOb. However, it seems like there ought to be a way to make retargeting work, even for nOObs. To me, it seems like there should be some super-simple interface, like a 3D-ish template that pops up on the screen in the general outline of a human. Maybe it would tell you to “Insert your MakeHuman here…” and you would slide your MakeHuman into the little template. And maybe then slide the BVH into the template, hit the button, and it would find the nearest matching bones inside each chunk of the template and superglue them together. I don’t know. Something like that.
I know it’s probably asking a lot. I guess it sounds like a cross between a transporter platform and AI.
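
As a toy illustration of that nearest-bone idea (purely hypothetical; nothing like this exists in the addon), the matching could start as simply as:

def guess_bone_map(performer_arm, enduser_arm):
    """Pair each performer bone with the nearest end-user bone by
    rest-pose head position (assumes both rigs are roughly aligned)."""
    mapping = {}
    for p in performer_arm.data.bones:
        closest = min(enduser_arm.data.bones,
                      key=lambda e: (e.head_local - p.head_local).length)
        mapping[p.name] = closest.name
    return mapping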

At any rate, I think guys like you are HEROIC! I’m sorry for the heavy demands we make on you to make it all even more user-friendly for people who have never heard “RTFM!”
Personally, I pray every day that they’ll make you the doorman in Blenderheaven.
Keep up the brilliant work, and let me know if you ever want a glowing recommendation letter for any request for an early PhD… (I can do a helluva Walt Disney forgery!) :smiley:

Hello Benjy,
You’re doing a great job on mocap!!! I hope you can finish soon :smiley: because I really need it for work, even if it’s not perfect yet :wink:

BenjyCook - Thanks, but I currently don’t need any import script. It was just a question.

Hello everyone,

With GSoC “pencils down” happening today, I’ve spent the past few days writing docs, polishing code and (everyone’s favorite) creating more video tutorials!

Right now, the best way to get the project is as a Python Add-on.
The add-on is available as a diff patch in the Blender Extensions tracker here

All you need to do is download the latest trunk (r39603 or higher), apply the diff to the addons folder, and voila!

Manual, links to videos and more can be seen on my blender wiki page here

Thanks for all your support during this project. I will be actively maintaining and improving this add-on for the near future, so feedback is welcome as always. Hopefully, it will be included as an official add-on in an upcoming release.

Benjy

Quick update: all my work (as well as the rest of the Pepper branch!) has been committed to trunk.
So just download an updated build from GraphicAll or build your own from SVN.

Has anyone gotten the Motion Capture Tools add-on to actually enable in Blender? I’ve tried two GraphicAll builds and the official 2.6 RC2; in all cases I can see the add-on in the list but can’t enable it. In my experience this usually means an API version discrepancy.

Any ideas?
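
One way to surface the real error, rather than a silent failure, is to try enabling the add-on from Blender’s Python console; the module name below is only a guess:

import addon_utils
# If registration fails (e.g. an API mismatch), addon_utils prints the
# traceback to the console instead of failing silently. "mocap" is a
# guessed module name for the Motion Capture Tools add-on:
addon_utils.enable("mocap")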

Seems to work with r40991. Retargeted some motion from CMU (the MotionBuilder-friendly conversion) to the Rigify metarig. No errors, but some weird angles, perhaps due to differences in rest pose.

Hey Benjy!

I was checking out your tutorials on Vimeo… Am I mistaken, or is Tutorial Number 4 missing?

Also, I was wondering if you made the little rigged character you are using available anywhere? I like to use the same resources when following along with a tutorial, before branching out and trying my own.

The stuff looks real good, thanks for your hard work!

Replied to your PM.

Hello!

First things first: thanks for your work, it helps me a lot!

But I’ve run into some issues, and I’d like some advice on the right way to resolve them:
I have a character with the Human Meta-Rig (slightly edited to fit a T-pose), and I’m trying to retarget an animation imported from a BVH file. When I do so, I mainly have two issues:

  • the shoulders and the arms are badly twisted;
  • while playing the animation, the thumbs somehow go… flying away (and that’s not the intention of the animation :p).

For the first issue, I think the problem lies with the default roll of my bones in the rest pose, but if I change it, I’ll have to redo the skinning of my character, and it seems to me that we can’t afford that: what if another mocap source gives me a BVH with different bone rolls? And is it a good workflow to adapt your model (maybe with a lot of work, on complex projects) to an external animation resource?
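
For reference, rolls can be batch-adjusted from a script without deleting vertex groups, so a quick experiment may be cheaper than a full re-skin. A minimal sketch, with a placeholder bone-name filter:

import bpy

# Throwaway experiment: nudge rest-pose rolls in Edit Mode. Adjusting
# roll does not delete existing vertex groups. The name filter below is
# a placeholder for whichever bones come out twisted:
arm = bpy.data.objects["Armature"]
bpy.context.scene.objects.active = arm
bpy.ops.object.mode_set(mode='EDIT')
for eb in arm.data.edit_bones:
    if "shoulder" in eb.name or "arm" in eb.name:
        eb.roll = 0.0  # or whatever matches the BVH source
bpy.ops.object.mode_set(mode='OBJECT')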

For the second issue: I’ve absolutely no idea.

You can find a simple blend file with my situation here. The mesh of the character (smartly named “Character”) is in the first layer, the character’s armature (“Armature”) is in the second layer, and the imported armature (“AnimationTest”) is in the third layer. I’ve prepared the hierarchy mapping but haven’t retargeted yet, in case you want to check some details before the operation.

Maybe I’m doing things totally the wrong way, and I’d be glad to be enlightened, but I don’t know what else to do.