OpenPose AI Facial Motion Capture to Blender Tutorial and Script

In these videos, we will see a super-easy way of doing facial motion capture with the open-source AI body-tracking software OpenPose. In the first video, I show you how to track a video of your face and generate data files, then import that data into Blender onto a rig. In the second video, we take a deep dive into Python scripting for Blender so you can modify the process to work on your own rig.
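For reference, when OpenPose is run with its --write_json and --face flags, it writes one JSON file per video frame, each holding the 70 face landmarks as flat x, y, confidence triplets. A minimal sketch of reading those files back in Python (the function name and directory layout here are illustrative, not part of the script itself):

```python
import json
from pathlib import Path

def load_face_keypoints(json_dir):
    """Read OpenPose's per-frame *_keypoints.json files into a list,
    one entry per frame: [(x, y, confidence), ...] for the 70 face
    landmarks, or None when no face was detected in that frame."""
    frames = []
    for path in sorted(Path(json_dir).glob("*_keypoints.json")):
        with open(path) as f:
            data = json.load(f)
        if not data["people"]:
            frames.append(None)  # OpenPose found no one in this frame
            continue
        flat = data["people"][0]["face_keypoints_2d"]  # x0, y0, c0, x1, ...
        frames.append([tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)])
    return frames
```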

Results:

Part 1: Facial Mocap

Part 2: Customize the Process to Your Rig with Python!

All tools and methods are freely available. The Python script I wrote for Blender to import the files can be downloaded here:

To get OpenPose, go here:


Just added a new version that removes the glitches:


Hi NRK,

Just thought I’d drop a quick email to say thanks for the entertaining and informative videos about OpenPose. (Before seeing them, who would have thought such a thing existed!)

I’m now following your instructions: I downloaded OpenPose and got it to run on images and a short video clip. All good fun…

The next stage is to download your Python script and try out the facial capture…

Before that, a couple of questions…

  1. I saw that your script uses the 2D data extraction for the facial rig rather than the 3D option. Is there a specific reason for that?
    (I did have a quick shot at extracting the 3D data from my video, but it just failed; I’ve not had time to see why yet…)

  2. Once I’ve mastered the facial system, I’m keen to try full-body pose capture/animation. Are you planning to do any work in that area yourself, or are you more interested in the face at the moment?

Regards

Richard

Hi NRK,

Just a short follow-up to my previous email: you can ignore my second question. Having now spent some time reading the OpenPose documentation, links, references, etc., I can see I was naive to think 3D pose estimation from a still image was possible… as yet…

There seems to be a lot of other work being done to achieve it, e.g. deep learning, neural networks, and stuff that just blows my mind… but it’s just not there yet.

Still, your script seems to be a good help with facial animation, which is a great start.

Thanks again for sharing your work

Regards

Thanks for researching, Richard.

I used the 2D points to create the 3D motion. There is a LOT of information in the 2D data that can be mapped, and it wouldn’t be hard to take the 2D armature data and apply it to a 3D rig: I can calculate all of the bone angles by projecting each bone onto a flat plane and calculating the quaternion necessary to reproduce the 2D data. Of course, each calculation would produce two results, one angled out of the page and one angled in, but some simple rule checking would fix almost all of it; for instance, your forearm can’t bend backward. I was considering starting on this, but I’m taking a break for the moment. Thanks for your interest…
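In case it helps to picture the two-solution ambiguity: for a single bone, the out-of-page angle can be recovered from how foreshortened its 2D projection is, but the 2D data alone can’t tell you the sign. A rough sketch of the idea (this isn’t code from the script, and the lengths in the example are made up):

```python
import math

def out_of_plane_angle(rest_length, projected_length):
    """Recover how far a bone tilts out of the image plane from the
    foreshortening of its 2D projection. Returns both possible
    solutions; the 2D data alone cannot distinguish them."""
    ratio = min(projected_length / rest_length, 1.0)  # clamp tracking noise
    theta = math.acos(ratio)
    return +theta, -theta  # out of the page vs. into the page

# Example: a 30 cm forearm whose projection measures 21 cm on screen.
toward, away = out_of_plane_angle(0.30, 0.21)
# A joint-limit rule (e.g. an elbow cannot hyperextend backward)
# then discards whichever solution is anatomically impossible.
```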

This is super awesome! I got stuck trying to compile all the deps for OpenPose, but with luck I’ll get through that and get to try it out. Thanks for doing this!

Very nice job, friend. I tried this but it doesn’t work as expected: I see movement, but it doesn’t match my input video. I think I’ve matched the bones properly (the rig is made with Rigify). The only thing I didn’t do, and I think that’s the problem, is the part in video 2 where you say to reset the bones’ X and Z axes to 0 and also the roll to 0. If I do that, the rig structure gets ruined; I can’t just set all those face bones’ X and Z to 0. I think there is something I don’t understand. Any help please? Thank you.

Totally understand. I am working on a version for the Auto-Rig Pro rig; this takes some doing. Since I got it working on the rig I had, I assumed it would be pretty easy on ‘auto’ rigs. The rig I had was a simple export from Daz3D of a Genesis 8 character: it had several bones pointing out of the face, and a simple mapping of the x and y was all that was needed. With these ‘advanced’ auto rigs, literally everything they did to make the rig ‘better’ makes it harder to handle in the script. For instance, there is a chin bone that modifies the entire lower half of the face and messes up the careful positioning of the individual lips. I am scratching my head over this. I converted the chin to a simple xy displacement instead of a jaw-bone rotational calculation like in the script, and now I’m puzzling through how to do this.

It’s a problem I have to solve for our production pipeline, though, because they want to use Auto-Rig Pro rigs, not simple bones sticking out of the face that are parented to the head, which is what the script likes to use. One trick is to make a bone move not with respect to the head, but with respect to another part of the face it depends on (see the sketch below). I have added the chin as a new reference for the version I’m working on, and I’ll probably have to make the corners of the mouth references as well. The script mainly uses the nose and eyes as references for other points currently.

Sorry it isn’t easier; if I had spent six months on the scripting, it might have been easier on all these rigs. My fallback will be to blow away the facial rig and create my own that’s just bones parented to the head sticking out of the face, but the animators may not like it.
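To illustrate the reference-point trick mentioned above: instead of driving a lip bone from its absolute position under the head, drive it from its offset to the chin landmark. A plain-Python sketch (the landmark indices follow the standard 68-point face layout OpenPose uses, but treat the exact numbers as illustrative):

```python
def relative_offset(frame, point_idx, ref_idx):
    """Offset of one face landmark from a reference landmark, rather
    than an absolute image position. Driving a lip bone from its
    offset to the chin keeps the jaw's motion from being counted
    twice once a jaw bone is already moving the lower face."""
    px, py, _ = frame[point_idx]
    rx, ry, _ = frame[ref_idx]
    return px - rx, py - ry

# Illustrative indices (standard 68-point layout): 8 = chin tip,
# 57 = bottom centre of the lower lip.
# lip_dx, lip_dy = relative_offset(frames[0], 57, 8)
```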

The script is now a full add-on. I just uploaded the latest version.