Capture the Head! or: From empties to a smile. Face tracking.

Hey guys,

I’m pretty sure that at least one of you already had the same idea. I’m about to test the following idea next weekend:

Imagine you have a real actor. You film his head face-on with a video camera while he has reflective marker points stuck to his face - some around the mouth, eyes, nose, chin, forehead, you name it. Now he starts acting with his face while keeping his head still, and you record the face with the marker points.
Next, you import the video into Blender and start tracking each and every marker point. I assume that tomato can do this as a 2D tracking task.

Now my questions:

  1. Do I need some markers to stabilize the video in order to compensate for unwanted head movement? (I bet I do need them, don’t I?)
  2. How can I translate the tracked markers to my face controllers of my 3d character?
  3. Will the tracking process give me empties for each tracked marker point?
  4. Could I parent my face controller bones to these empties?

You get the idea… :slight_smile:

Any suggestions/ideas/comments on this? Maybe someone already has a pipeline for this kind of face tracking? Any thoughts are highly appreciated! :slight_smile:

Best regards,

There's a script already that does this, and as far as I know it's being ported to 2.6 right now.
Unfortunately I dropped the bookmark somewhere.

I know that script, but my point is: can it be done without scripts, just by hand, using the tracker system of tomato?

I saw Sebastian König tracking a rectangular object in 2D and then matching a 3D object to the tracker points. He shows this on his Vimeo page. Hence, my idea is to take that a step further and track facial points. Hopefully Sebastian reads this and can elaborate on it a bit further. :slight_smile:


Okay, I did some testing and this is how the setup works:

The marker points are tracked by Blender's tracking system, as you can see in this video I uploaded:

The empties are tracked and then they drive shapekeys on the mesh.
Doing this for the whole face looks like this:

So, in general this pipeline works.
The bad thing is that controlling the drivers with the empties is still un-sexy:
the movement of the empties does not translate one-to-one to the drivers. The
empties travel much farther than the corresponding mesh part.
I played with the polynomial values of the drivers and could tweak it to some
point - but it does not work too well yet.
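The polynomial tweaking above boils down to simple linear math. Here's a plain-Python sketch (not Blender code - the function and variable names are made up for illustration) of how the slope of a first-order driver polynomial maps the empty's long travel onto the shapekey's short 0..1 range:

```python
# The empty travels farther than the shapekey range, so the driver's
# linear generator f(x) = c0 + c1*x needs a slope that compresses the
# empty's full travel into 0..1. All names here are hypothetical.

def driver_slope(empty_travel, shapekey_range=1.0):
    """Slope c1 that maps the empty's full travel onto the shapekey range."""
    return shapekey_range / empty_travel

def shapekey_value(x, c0, c1):
    """A first-order polynomial generator, clamped like a shapekey slider."""
    return max(0.0, min(1.0, c0 + c1 * x))

# An empty that moves 0.25 Blender units should sweep the shapekey 0..1:
slope = driver_slope(0.25)
print(slope)                                # -> 4.0
print(shapekey_value(0.125, 0.0, slope))    # halfway -> 0.5
```

So instead of trial-and-error on the polynomial values, you can measure how far the empty actually travels and compute the slope directly.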

any further ideas on this??


all right,
things are working now as desired. the tracked empties now control the bones, which then drive the shape keys. there is one empty at the very top which does the head tracking. see this video:

everything is up and running and the workflow is pretty straight forward.

there is only one detail left to solve: when I move the camera in on the global y-axis, the empties sort of scale perspectively. I'm not sure how to fix that atm, but I think I will get there eventually.
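One possible fix for that perspective scaling (just an idea, not tested in Blender - all names are made up): pick two markers that stay rigid relative to each other, e.g. a forehead pair, and divide every marker offset by their current apparent distance. That makes the data scale-invariant:

```python
# Sketch: undo the apparent scaling caused by camera distance by
# normalizing against a rigid reference pair of markers.
import math

def scale_factor(ref_a, ref_b, rest_distance):
    """How much the face has apparently grown/shrunk in this frame."""
    return math.dist(ref_a, ref_b) / rest_distance

def normalize_offset(offset, factor):
    """Divide a marker offset by the scale factor of the frame."""
    return tuple(c / factor for c in offset)

# Face appears 25% bigger because the camera moved in on y:
f = scale_factor((0.0, 0.0), (1.25, 0.0), rest_distance=1.0)
print(normalize_offset((0.5, 0.25), f))   # -> (0.4, 0.2)
```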


p.s.: feel free to ask how the setup works in detail.

improved some minor things. basically I re-parented the mouth controllers to the head bone - not to the jaw bone as in the previous video:

here is the latest test version, rendered with Cycles:

Very nice proof of concept!

thanks :slight_smile:

jawbone and cheeks are now working.
need some work on the lips/eyes though:

Hi, I thought you would like to know that I made a helmet for face tracking myself…
I still have some issues… for example with a test, but the test image works.
I used a very cheap ski helmet,
some screws and stuff from the hardware store,
some parts of a “Metallbaukasten” (metal construction kit),
at the moment I use a very cheap 808 car-key micro camera,
and finally a micro SD card.

After all it was something like 40 € if you have to buy everything new.

hi dracio,
looks weird :wink: but I guess it'll do the trick once you do some testing. However, I think the camera is not really good enough in terms of resolution. That might lead to some problems when it comes to actually tracking the marker points. Maybe better, uniform lighting will help here.
Keep me informed. Looks interesting though.


Wow, I attempted to do facial tracking but it failed! Can you please post a tutorial when this is all done?


well, I can try:

I loaded the video into Blender's movie clip editor. Then I positioned a marker on each tracker point and tracked them one by one, individually. That took a bit of tweaking, since the lighting of the video was not that good and some of the markers lost their track from time to time. A further problem is that the lower-lip marker and the chin marker cannot be tracked cleanly when the mouth opens fast - then the tracker cannot distinguish between the two points, so I ended up tracking them by hand, frame by frame.

Once I had all the markers following the tracker points, I applied “Link Empty to Track” from the Reconstruction menu. Then I had an empty for each tracker point in the 3D view, plus a camera object. I think I also used the “Setup Tracking Scene” function from the tool shelf on the left-hand side of the movie clip editor (I'm not 100% sure about this step).

So, now I had my empties, which moved exactly like the tracker points. From here I connected the empties to the face controllers like this:
The head mesh is deformed by shape keys. The shape keys are controlled by controller bones via drivers. So, when I move a controller bone, the corresponding driver changes the shape-key value and thus the mesh is deformed. This setup is pretty much standard in face rigs and has nothing to do with the empties that are controlled by the marker tracks.

Now I had to connect the empties to the controller bones, which is pretty easy: I set up a bone constraint for each bone. I used the Copy Location constraint to connect the empties to the bones: each bone simply copies the location of its corresponding empty.
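Reduced to its core, the constraint setup above is just a per-frame copy. Here's a plain-Python sketch (the bone and empty names are hypothetical, and this is the logic, not the bpy API):

```python
# What a Copy Location constraint effectively does every frame:
# bone.location = empty.location, for each bone/empty pair.

empties = {                      # tracked empty locations for one frame
    "Track.jaw":  (0.0, 0.0, -0.12),
    "Track.brow": (0.0, 0.0,  0.03),
}
bone_to_empty = {                # which controller bone follows which empty
    "jaw_ctrl":  "Track.jaw",
    "brow_ctrl": "Track.brow",
}

def apply_copy_location(bones, empties, mapping):
    """Copy each mapped empty's location onto its controller bone."""
    for bone, empty in mapping.items():
        bones[bone] = empties[empty]

bones = {}
apply_copy_location(bones, empties, bone_to_empty)
print(bones["jaw_ctrl"])   # -> (0.0, 0.0, -0.12)
```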

That’s pretty much it.

There is only one detail left for the head controller, which rotates the head as a whole: here I used a simple IK constraint that targets the empty on the forehead.
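To give an intuition for what that forehead-IK trick achieves (a rough sketch with made-up numbers, not the actual solver): the head bone is aimed at the forehead empty, so a sideways or vertical offset of that empty becomes head rotation.

```python
# Aiming the head bone at the forehead empty, in plain math:
# small offsets of the empty turn into rotation angles via atan2.
import math

def head_yaw_pitch(forehead_offset, bone_length):
    """Approximate aim angles toward the forehead empty, relative to rest.
    x = sideways offset, z = vertical offset, in Blender units."""
    x, _, z = forehead_offset
    yaw = math.atan2(x, bone_length)
    pitch = math.atan2(z, bone_length)
    return yaw, pitch

# Forehead empty drifts 0.1 BU sideways on a 1 BU head bone:
yaw, pitch = head_yaw_pitch((0.1, 0.0, 0.0), 1.0)
print(math.degrees(yaw))   # roughly 5.7 degrees of head turn
```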

Here is a screenshot showing some settings for the head-tracking solution:

More questions? Ask! :slight_smile:

Best regards,

great! I’ll check it out when I have some free time, most likely tonight.

My current project doesn't need facial tracking, but I would love to do a successful test so I have the method committed to memory - and also a file to refer to if I ever forget anything in future projects.

Actually, I just had a quick break and I read it over right now!

Now, after reading it: what I attempted was really similar to what you did. The only difference was that the bones I parented to the empties were actually influencing the mesh, not influencing shape keys. I had heard that it was possible to use bones to influence shape keys, but I kind of forgot about it until you mentioned it. And the problem was that I moved my head, which moved the tracks out of alignment, and I didn't know how to fix it. I read in an earlier post that you worked around this somehow?

And could you enlighten me on the process of setting up bones to influence shape keys?

and one more thing:

“There is only one detail for the head controller which rotates the head as a whole: Here I used a simple IK constraint which corresponds to the empty on the forehead.”

You said that at the end of your tut. I don't follow - what do you mean?

hi X3SB,

  1. I simply did not move the head - neither the real actor's head nor the 3D head.

  2. shapekeys are quite simple to set up. just create a shapekey and then deform your mesh in edit mode. going back to object mode, you can blend between the base mesh and the deformed mesh with the shapekey slider.
    after you have set up all your shapekeys, you add drivers to them by clicking RMB on the shapekey and hitting “add driver”. after that you should take a look at my last screenshot, at the top-left area.
    you can adjust the driver in the generator menu by altering the slope of the first-order polynomial (which is the factor of x in the function expression). [you might find a tutorial on bone-driven shapekeys. david ward did some enlightening tuts on that particular topic!]

  3. the thing with the IK constraint is pretty straightforward: in order to control the overall head movement (which is basically rotation around the global z- and y-axes) I assigned an IK constraint to the head controller bone, with the forehead tracker as the IK target. works pretty well.
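the shapekey blending described in point 2 is just linear interpolation per vertex. a tiny plain-Python sketch of the math (not Blender code, vertex values made up):

```python
# Shapekey blending in a nutshell: each vertex is interpolated between
# its base-mesh position and its deformed position by the slider value.

def blend(base, target, value):
    """value = 0 gives the base mesh, value = 1 the full shapekey."""
    return tuple(b + value * (t - b) for b, t in zip(base, target))

base_vert   = (0.0, 0.0, 0.0)
target_vert = (0.0, 0.0, 0.4)   # "mouth open" position of this vertex
print(blend(base_vert, target_vert, 0.5))   # -> (0.0, 0.0, 0.2)
```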


Hi, some news…

Not my tracking helmet, but a test :smiley:

what fun! Christoph it is scary how much you are beginning to resemble your hedgehog :slight_smile: