Facial animation with Faceshift and Blender

I’d like to share my test results with the motion-capture software Faceshift, which is currently in open beta. This facial performance was captured with a single Microsoft Kinect sensor, without markers, and exported to Blender via BVH.
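For anyone curious what the Faceshift BVH export looks like before it reaches Blender: it is a plain-text file with a HIERARCHY section listing the joints and a MOTION section with per-frame channel values. A minimal sketch of pulling the joint names out of the hierarchy (the sample joint names below are made up for illustration, not Faceshift's actual naming):

```python
def bvh_joint_names(bvh_text):
    """Collect joint names from the HIERARCHY section of a BVH file."""
    names = []
    for line in bvh_text.splitlines():
        line = line.strip()
        # ROOT introduces the root joint, JOINT each child joint
        if line.startswith(("ROOT", "JOINT")):
            names.append(line.split()[1])
    return names

sample = """HIERARCHY
ROOT head
{
  JOINT jaw
  {
    JOINT lower_lip
    {
    }
  }
}
MOTION
Frames: 1
"""
print(bvh_joint_names(sample))  # ['head', 'jaw', 'lower_lip']
```

Listing the joints this way is a quick sanity check that the export actually contains the facial bones you expect before you import it.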

The render was made with Blender Cycles; the hair and fur with Blender Internal.
Adobe Photoshop, Adobe After Effects, and Adobe Premiere Pro CS6 were also used.
Critique please!

https://vimeo.com/52138236

Critique what? There isn’t a link or embedded presentation, unless I’m missing something.

Extremely Uncanny Valley. Definitely needs polish and finesse to remove the mechanical look and modulate the weird stuff like strings of eye blinks.

Although I am intrigued, I can’t see where your facial performance is showing up. I don’t see a link or attachment. Is it hidden somehow, so that my Firefox browser isn’t picking it up?

I was thinking the exact same thing. Uncanny Valley.

Very promising. Can you share with us your workflow for this? :slight_smile:

For those of you who couldn’t see a link:
on youtube - http://youtu.be/fjQxzaBDosE;
on vimeo - https://vimeo.com/52138236
Thank you.

Not bad at all… but I noticed right away that the tongue does not move. Will you have to animate that by hand?

Very impressive for simply using a Kinect. Minimal equipment, but good results for the mouth movements. Could be an awesome time saver.

Yes, impressive for such a simple setup.

But: “uncanny valley” was my first thought as well.

How did you rig the face?
If it were body mocap, I’d say you used the BVH importer’s skeleton directly (which you should not!), because there seems to be no mechanism in your rig that prevents impossible deformation or movement. This is especially noticeable in the mouth/jaw area: a mouth shouting “Aoooo” that way, opened much too wide with the jaw completely dislocated, really looks creepy, like Imhotep exhaling swarms of flies in “The Mummy”.

I actually used mainly the BVH importer skeleton. I just added some additional bones (the orange bones in the image below) and a single corrective shape key for the opened mouth. I’m not a big fan of the retargeting system from Benjy Cook’s Motion Capture Tools, because I often have problems combining body and facial BVH; that’s why I prefer to keep my rigs as simple as possible. I agree with you that her mouth opens too wide and that the jaw is dislocated. Thank you, I’ll try to solve this problem.
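Since the dislocated jaw comes straight from the raw capture values, one pragmatic fix is to clamp the jaw-open rotation channel before (or after) importing the BVH. A minimal sketch in plain Python, assuming the jaw channel has already been extracted as a list of per-frame Euler angles in degrees (the limit values here are hypothetical — tune them to your rig):

```python
def clamp_jaw_rotation(frames, min_deg=-2.0, max_deg=25.0):
    """Clamp a jaw-open rotation channel (one angle per frame, in degrees)
    so the mouth cannot open past an anatomically plausible limit."""
    return [min(max(angle, min_deg), max_deg) for angle in frames]

# Example: raw jaw angles with two over-rotated frames
raw = [0.0, 8.5, 31.7, 44.2, 12.0]
clamped = clamp_jaw_rotation(raw)
print(clamped)  # frames above 25.0 degrees are capped at 25.0
```

In Blender you could get the same effect non-destructively with a Limit Rotation bone constraint on the jaw bone, which is usually preferable because it leaves the original F-curves untouched.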

Attachments


I like it :slight_smile:

I am new to Blender and animation, and am currently exploring two different animation methods: FaceFX vs. Faceshift. You mentioned also using Adobe products in your workflow. If you don’t mind sharing, I would like to learn how to do this.

I currently have a Project Pinocchio character with a high-res skeleton and high-res poly mesh. I would like to learn how to do realistic skin, hair, cloth, and animation in Blender. For facial animation, I’m thinking Faceshift, exported to BVH for use in Blender, plus body mocap from truebones.com. Ultimately, I’m trying to get an animated rig, body and face, that looks realistic into Unity Pro. I have a lot to learn, and if you recommend a related training program, I’m all for it. Thanks!