Facial motion capture (object tracking) demonstration: video to cartoon character

As the comment below the video indicates, this is my first attempt at facial motion capture in Blender. It is a test to get to grips with the concept and the workflow, and is therefore quite rough. I specifically wanted to test transformations, i.e. not duplicating the original character, but projecting the capture onto a face with different proportions.

There is definitely a lot of room for improvement, but I am satisfied with the test results for now. I have learnt a lot and will next work on improving the quality. At least I now know how to do it.

Please don’t comment on the quality, as that was not the intention yet :slight_smile:

(PS: I originally started this thread in the WIP forum, but I think that was incorrect. The test is complete anyway. When I bring out another one, it will be for the sake of improving and optimising the pipeline.)

Really impressive!!! Very cool!

Can you tell us a little bit about the process?

Thanks and good job!


Great job, I like it +1 :slight_smile:
I would love to follow a tutorial on this, since I've played a lot with Blender's tracking feature.

Interesting! I was thinking about tinkering with it first. But then I was hoping that the method could first be recorded and then reused again and again.

It’s a good test, but I agree with some of the comments calling for a tutorial. I played with camera tracking recently, trying to use it to map to a character. It was just putting a mustache on someone, but it came out very badly because the tracking dots on the person wound up INSIDE the camera, which made my test a lot more trouble than it was worth. I still don’t know what I did wrong.
A tutorial for exactly this would be extremely helpful to us, and having to explain the entire process might even help you, as you would have to fill in gaps and explain things in plain language, which can cause an instructor to learn things he didn’t know he didn’t know.
At any rate, it’s a cool idea, and would be extremely helpful for more realistic character animation using motion tracking.

Hey, great work here. The capture is very fluid; did you do any cleanup on the curves?

I’ve got a technical question. I have a workflow down for doing this with a facial rig on a character, but I’m constraining the markers to the head of the rig, rather than having the markers control the movement of the head. Have you tried to do this at all? May I ask what’s constrained to what in your setup?

The reason I ask is that I haven’t been able to figure out a way to constrain the markers to the head (at least while using a rig like Rigify, or anything more complex than a head bone with a child bone) without getting cyclic dependencies. It seems like you’re not having any such issues, though?

Thanks for the feedback.

I’ll think about the tutorial, but I’m not sure whether I will have the time for it in the short term, as I want to smooth out the pipeline.

In the meantime, I suggest you have a look at Blender’s “Track, Match, Blend” DVD, like I did.

Great Great Great.

Very, very cool! Lovin it!

Wow! Cool!

This is fantastic!
I am going to try a setup like this for my face animation, too!


The “Link Empty to Track” button (under Reconstruction → Geometry) seems to be the answer for getting the F-Curve face mocap data into usable Empties in 3D space.

Then all you have to do is parent all these empties to a master empty, and then constrain that master empty to the location and rotation of the character’s head bone.
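The parenting-and-constraint step above can be sketched as a Blender Python script. This is only a rough sketch of my understanding of that setup, not the poster's actual file: the object names (`"Armature"`, the `"Track"` prefix on the empties, the `"head"` bone) are assumptions you would replace with the names in your own scene, and it has to be run from Blender's own Python console or Text Editor, since `bpy` only exists inside Blender.

```python
import bpy

# Collect the empties created by "Link Empty to Track".
# ASSUMPTION: they share a "Track" name prefix -- adjust to your scene.
track_empties = [ob for ob in bpy.data.objects
                 if ob.type == 'EMPTY' and ob.name.startswith('Track')]

# Create a master empty to carry the whole marker cloud.
master = bpy.data.objects.new('MasterEmpty', None)
bpy.context.scene.collection.objects.link(master)

# Parent each track empty to the master, keeping world transforms.
for emp in track_empties:
    emp.parent = master
    emp.matrix_parent_inverse = master.matrix_world.inverted()

# Constrain the master empty to the character's head bone.
# ASSUMPTION: rig object is named "Armature" with a bone named "head".
rig = bpy.data.objects['Armature']
for ctype in ('COPY_LOCATION', 'COPY_ROTATION'):
    con = master.constraints.new(type=ctype)
    con.target = rig
    con.subtarget = 'head'
```

Because the constraints sit on the master empty rather than on the rig, the track empties follow the head without feeding anything back into the armature, which should sidestep the cyclic-dependency problem mentioned earlier in the thread.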

I hope to test this out tomorrow with lip-sync and face animation for my current animation project!