This is my first post and I must confess that I'm an absolute beginner, willing to learn.

My goal is to do lip sync.

To this end I generated a model with MakeHuman, and then used the Mycap Studio sample Prometheus BVH capture.

I managed to make the MH model move some facial bones, but it's still far from realistic.

The way I did it:

- Imported the MH model (.mhx file).
- Imported the BVH file.
- Moved the model as close as possible to the BVH markers.
- Scaled the BVH down.
- Selected each facial bone (e.g. the lower-lip mid bone) and added a Copy Location constraint targeting the corresponding BVH marker (lower lip mid).

But now the model looks strange: the lower lip is stretched too much, it looks weird…
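I suspect the stretching happens because Copy Location snaps the bone to the capture actor's geometry, and my model's face has different proportions. A fix I've read about is to transfer the marker's *offset from its own rest position* instead of its absolute location. Here is a rough sketch of that idea in plain Python (made-up coordinates, not Blender's actual constraint system):

```python
# Plain-Python sketch (hypothetical coordinates, not the bpy API) of
# moving a bone by a marker's displacement from its rest pose,
# instead of copying the marker's absolute location.

def retarget_offset(marker_rest, marker_now, bone_rest, scale=1.0):
    """Shift the bone by the marker's displacement from its own
    rest position, optionally scaled, rather than snapping the
    bone to the marker's absolute position."""
    delta = tuple(n - r for n, r in zip(marker_now, marker_rest))
    return tuple(b + scale * d for b, d in zip(bone_rest, delta))

# The capture's lower-lip marker moved 0.02 units down (mouth opens):
marker_rest = (0.0, 0.0, 1.50)
marker_now  = (0.0, 0.0, 1.48)

# The model's lower-lip bone rests somewhere else entirely:
bone_rest = (0.0, 0.1, 1.62)

new_pos = retarget_offset(marker_rest, marker_now, bone_rest)
# The bone opens by the same 0.02 units from *its* rest position,
# instead of being dragged to the marker's absolute location.
```

Is this the right direction, or is there a built-in way to get this offset behavior from the constraint settings?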

I would appreciate any advice on what I'm doing wrong.
I've uploaded the Blender file I'm working on here

Thank you


I wonder how this works with a BVH captured from a real person, who has their own particular distances between the chin and the mouth, or the mouth and the top of the head, when the target model does not have exactly the same proportions.

Even though I scaled the whole captured BVH to match the size of the model as closely as possible, do I need to measure these distances between the BVH markers and try to match the bones to them?
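To make the question concrete, here is what I mean by matching the distances, sketched in plain Python with hypothetical landmark positions: estimate a scale factor from one reference distance measured on both the capture and the model (chin to mouth, say), and multiply the marker offsets by it before applying them.

```python
# Sketch (hypothetical numbers) of deriving a scale factor from one
# reference distance on each side: chin-to-mouth in the capture vs
# the same landmarks on the model.
import math

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical marker positions from the BVH (capture space):
bvh_chin  = (0.0, 0.0, 1.40)
bvh_mouth = (0.0, 0.0, 1.48)

# The same two landmarks on the MakeHuman model:
model_chin  = (0.0, 0.1, 1.55)
model_mouth = (0.0, 0.1, 1.61)

scale = distance(model_chin, model_mouth) / distance(bvh_chin, bvh_mouth)
print(round(scale, 3))  # 0.06 / 0.08 = 0.75
```

Is one overall scale like this enough, or does each facial region (mouth, brows, jaw) need its own factor?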

Thank you