Motion Tracking problems (facial motion tracking)


I have recently tried out Blender’s Motion Tracking tools, specifically for facial capture. While my results are quite good and useful, I was wondering if anyone could help me improve them. OK, so I will start with what I have done.

  1. Recorded a short clip where I have tracking markers on my face.
  2. No head mount, but the camera was stationary and there was NO HEAD movement beyond facial expressions.
  3. Tracked all the points and did a camera solve. I wanted this attached to a character’s face, and since “Setup Tracking Scene” did not give the best results, I used “LINK EMPTY TO TRACK” instead. This gave me facial movement on 2 axes only, but that was perfect for what I wanted to do.
  4. Created bones for every point relevant to a TRACKING MARKER, attached them to the track-marker empties using BONE CONSTRAINTS, then bound the bones to the geometry with automatic weights.
  5. Played the animation and it looks great, except for the fact that the character’s proportions are larger than mine (cartoon-style character). The result is that some expressions, even when exaggerated, come out small and not quite in line with the character.
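Since the mismatch in step 5 comes from the character’s face being larger than the actor’s, one common fix is to multiply each empty’s offset from its rest position by a scale factor before the bone picks it up. A minimal sketch of that idea in plain Python (the names `rest_pos`, `current_pos`, and `scale` are illustrative only, not Blender API):

```python
def scaled_offset(rest_pos, current_pos, scale):
    """Amplify a tracker empty's displacement from its rest position.

    rest_pos / current_pos are (x, z) tuples in whatever units the
    solve produced; scale could be something like
    character_face_width / actor_face_width.
    """
    return tuple((c - r) * scale for r, c in zip(rest_pos, current_pos))

# A marker that moved 0.02 units on one axis, amplified 2.5x
# for a proportionally larger cartoon face:
offset = scaled_offset((0.10, 0.30), (0.10, 0.32), 2.5)
```

In Blender this multiplication could live in a driver’s scripted expression rather than a standalone script, so the amplification updates live as the empties move.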

Now, I have set up SHAPE KEYS on this character before, but I was unable to get them to express properly with the motion tracking. I used the BONES as DRIVERS, and while it works, the results are also not great. The reason is that the TRACKING MARKERS move around by different amounts at different points, resulting in inconsistent driver values. I tried some driver variables but just could not get it to work. This results in some horrible headaches, and at that point I might as well just do everything by hand.

Am I missing something? This was really my first time using motion tracking, so if there are any other steps I can follow, please help me improve.

Though this sounds crazy even as I type it, what I believe I need is something like a SHAPE KEY calculator that would find the highest driving point for each motion tracker and automatically make that the 1, and make the lowest point the 0. If there is such a thing, please help me find it.
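For what it’s worth, the “calculator” described here is essentially min/max normalization: sample the driver value over the whole clip, then remap so the lowest sample becomes 0 and the highest becomes 1. A sketch of just the remapping math in plain Python (how you sample the values out of Blender is a separate step; nothing here is Blender API):

```python
def normalize_samples(samples):
    """Remap a list of per-frame driver values so min -> 0.0, max -> 1.0."""
    lo, hi = min(samples), max(samples)
    if hi == lo:  # marker never moved; avoid division by zero
        return [0.0 for _ in samples]
    return [(v - lo) / (hi - lo) for v in samples]

# Raw Z positions of one tracker across a few frames:
normalized = normalize_samples([0.31, 0.35, 0.33, 0.39])
# lowest frame maps to 0.0, highest to 1.0
```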


Maybe there is a way I haven’t tried to increase the BONES’ movement distance? BONE CONSTRAINT INFLUENCE does not go past 1.

So thank you if you can help. Below are 3 pictures: 1 is the base pose, the others are some expressions.


I’m not an expert on motion tracking, but if you’re using bones to change the location of vertices, wouldn’t this involve weight painting? It would be good to get a screenshot of the model with the armature visible in weight paint mode, so we can see the distribution.

Thank you for the reply. We ended up building our own head-mounted camera rig for stability with the facial capture, and we are going to do a separate recording for head and body movement.

We will be using shape keys to set up the character’s facial animation and then drive them with the markers captured using our head mount. The programmer on our side is going to try to create a script that records your face at the default pose (no movement), then calculates each marker’s movement on the Z and X axes, creates a value between 0 and 1 for each movement, and creates the drivers for the shape keys. This could be done without a script, but then you would have to redo your shape key influence each time you record a new capture, which we want to avoid.
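As a rough sketch of what that script would compute (plain Python, no Blender API; the names and the choice of measuring distance from the rest pose are assumptions for illustration): record each marker’s position at the default pose, then for every frame take its X/Z delta from rest and divide by the largest delta seen in the clip, which yields the 0-to-1 value the shape key drivers would read.

```python
def driver_values(rest, frames):
    """Per-frame 0..1 driver values for one marker.

    rest   -- (x, z) marker position in the neutral/default pose
    frames -- list of (x, z) marker positions, one per frame
    Uses the distance from the rest pose, normalized by the largest
    distance seen anywhere in the clip.
    """
    deltas = [((x - rest[0]) ** 2 + (z - rest[1]) ** 2) ** 0.5
              for x, z in frames]
    peak = max(deltas)
    if peak == 0:  # marker never left its rest position
        return [0.0 for _ in deltas]
    return [d / peak for d in deltas]

# One marker rising on the Z axis over three frames:
vals = driver_values((0.5, 0.5), [(0.5, 0.5), (0.5, 0.55), (0.5, 0.6)])
```

Because the peak is recomputed from each new capture, the drivers stay in the 0–1 range no matter how far the markers actually travel in a given take, which is what avoids redoing the shape key influence per recording.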

All in all, this would save a lot of time and help speed up animating our characters.

I tried buying a head mount, but they are really expensive. So we bought a cricket helmet, 4 dowel sticks, and a PS3 Eye camera. The results are much better than expected, and it was really cheap if you don’t mind some DIY.