Face capture test

Helmet test:
Hello again, guys. I am working on a helmet with a camera for face motion capture. Here are the results of a test with the Blender tracking algorithm. I hope you enjoy it.


Final test
It is an eleven-second animation with the Sintel character that applies face capture techniques and animation with the face rig prototype v2.1.

I was wondering whether or not to produce a “Making of” video. Please comment if you would like one.

This is the Making of video; it shows how the magic works.

Hello,
I’m testing a rig interface I came up with for facial motion capture (with Sintel from the Durian Project).

I designed a special rig for Sintel’s face that works with sliders, drivers, and constraints. For motion capture I used the Blender 2.65 internal tracker. If you are interested, I could share the .blend with a video sample and the face rig interface.
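To give a feel for how a driver can connect a tracked marker to a slider, here is a minimal plain-Python sketch (the function name and the numbers are hypothetical, not taken from the actual .blend): the marker's offset from a calibrated rest position is normalized by the expected travel and clamped into the slider's range.

```python
def track_to_slider(track_y, rest_y, travel, lo=0.0, hi=1.0):
    """Map a tracked marker coordinate to a slider value, the way a
    driver expression would: normalize the offset from the calibrated
    rest position by the expected travel, then clamp into the
    slider's range."""
    t = (track_y - rest_y) / travel
    return max(lo, min(hi, t))

# Hypothetical jaw-open slider: the chin marker sits at y=0.5 at rest
# and can drop by 0.5; a reading of 0.25 means the jaw is half open.
print(track_to_slider(0.25, 0.5, -0.5))  # 0.5
```

In Blender itself the same mapping would live in a driver expression on the slider control, with the rest position and travel measured during calibration.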

I should prepare a tutorial because the interface requires accurate calibration and motion capture to give interesting results.

http://s18.postimage.org/ojjtobk05/Capture01.jpg
http://s15.postimage.org/9k8vti1s7/Capture02.jpg

I’d love to hear opinions or suggestions about the test.

See you around!

Who asked for a TUTORIAL? :stuck_out_tongue:

As I promised, I'm giving some of my time to explain a little bit how this whole thing works. Please let me know your opinions and suggestions about the tutorial.
Enjoy!

SINTEL facerig v2.1

Face compensation test:

The final test is coming! :slight_smile:

An 11-second animation with the Sintel Facerig Prototype.

Download audio here. (It belongs to 11second.com)

Comments please!


I don’t know how you did that, but it looks like you’re a genius.

Very cool! I was toying with this idea in my head, because it seems like a logical thing to do with a feature tracker integrated in Blender :slight_smile: Awesome to see you implemented this so well!

That is very cool - I would love to see a tutorial.

Thanks guys, I am glad you like it. It took me about 4 days to build the whole thing. I think I could prepare a tutorial, but what would it be about? (Calibration, an explanation of how it works, face animation?) This would be my first tutorial.

I forgot to share the .blend file with the interface prototype.
Have Fun! :slight_smile:

https://www.dropbox.com/s/al39tgeppa0jcv6/BlenderGuru_Sintel_FaceInterfase_v1.5.blend?m

No video included.

All of it? :smiley:

Hahaha, a short list of topics would be much appreciated :smiley:

Are there retargeting issues going from a male face to a female face? I notice that the real face is quite expressive and seems easy to capture. Can the lips perform realistic speaking shapes, or is it better at creating general expressions?

Great work.

Thanks 3points, you seem to understand the complexity of human face animation. To answer your questions: it does have retargeting; the face rig can reset the input points' locations (constraints), and you can also set the range of movement of each control (drivers) in order to adapt the track of the real face to the control's current shape. I thought it would be more impressive to go from a male to a female face animation.

Finally, I think it is better to stick to general expressions for now; the lip-sync test results are too noisy and confusing. I’m working on that this week, and on the eye-look tracking issue. After that I should prepare the tutorial.
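The retargeting scheme described above (reset the rest position of each input point, then scale the range of movement per control) boils down to a remap between two calibrated ranges. A rough sketch, with made-up numbers rather than values from the rig:

```python
def retarget(value, src_rest, src_range, dst_rest, dst_range, gain=1.0):
    """Remap a tracked offset from the actor's face into a rig
    control's space: normalize against the source rest pose and range
    of movement, apply a per-control multiplier (gain), then scale
    into the destination control's range."""
    t = (value - src_rest) / src_range * gain
    return dst_rest + t * dst_range

# A male mouth corner travels 0.08 units; Sintel's control only 0.2.
# A reading of 0.04 (halfway) lands halfway through her range too.
print(retarget(0.04, 0.0, 0.08, 0.0, 0.2))  # 0.1
```

One such remap per control, each with its own rest position and gain, is what lets a male performance drive a female face.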

Thanks for the opinions and suggestions, guys :slight_smile:

I would be interested in the things that stopped you and made you think. Or was it just think and do, and everything worked the way you expected?

Can you change the status of the video on Vimeo to make it downloadable?

Well, actually I had a lot of problems building the controls for the jaw and the eyelids, because the drivers affect rotation. The most difficult thing is compensating for head movement in the tracking stage; doing that in Blender requires a Python solution, I think, because drivers are too slow for that. Another current problem is that it is virtually impossible to keyframe the face rig with the “Bake Action” tool; I still can't figure out how to get that working.
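One way to do that head compensation in a Python pass rather than in drivers is to subtract a head-anchor track (say, a marker on the helmet or the nose bridge) from every facial feature track, so only motion relative to the skull remains. A minimal sketch with hypothetical data:

```python
def compensate(feature_track, head_track):
    """Subtract the head-anchor track from a facial feature track,
    frame by frame, so only motion relative to the skull is left.
    Running this once over the whole clip is far cheaper than
    evaluating the subtraction in a driver on every frame."""
    return [(fx - hx, fy - hy)
            for (fx, fy), (hx, hy) in zip(feature_track, head_track)]

# The head dips by 0.125 on frame 2 while the brow stays fixed to the
# skull: after compensation the brow track is constant.
brow = [(0.5, 0.5), (0.5, 0.375)]
head = [(0.0, 0.0), (0.0, -0.125)]
print(compensate(brow, head))  # [(0.5, 0.5), (0.5, 0.5)]
```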

I set the video as downloadable. What would you like to do with it, Bao2?

UPDATE: face rig v1.9
https://www.dropbox.com/s/y5rv0blkuhshjy7/BlenderGuru_Sintel_FaceInterfase_v1.9.blend?m

That is cool… maybe you can share tutorials about it (how to make that rig + how to feed the tracking into the rig) :slight_smile:

Thanks in advance!

It's easier to watch it without having to wait for it to download, and also to store it in a folder with a link to this page and your .blend.

I tried something like this earlier as well, though your system looks much more refined and complete. The way I did it was a bit hacky.

COOL! :smiley:

That was actually the kind of test that made me start all this. The problem is the input data: there is no interpretation or calibration, so the movement data goes directly from track points and empties to the armature itself. What I did was develop an interface control rig for the shape keys of Sintel's face that can interpret/interpolate the input over 2D controls, and it even uses multipliers to calibrate the resulting gesture.

The most difficult thing up to now is getting a proper video file with the head movement compensated. That could be done in Blender too, but I still can't figure out how to do it with accurate results.
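To illustrate the idea of 2D controls feeding shape keys (the corner meanings here are invented for the example, not the actual interface): a control-pad position can be turned into four shape-key weights by bilinear interpolation, with a multiplier to calibrate the strength of the resulting gesture.

```python
def control_to_weights(u, v, gain=1.0):
    """Turn a 2D control-pad position (u, v in [0, 1]) into four
    shape-key weights by bilinear interpolation; the gain multiplier
    calibrates how strongly the tracked input drives the gesture."""
    return [(1 - u) * (1 - v) * gain,   # e.g. frown + closed
            u * (1 - v) * gain,         # e.g. smile + closed
            (1 - u) * v * gain,         # e.g. frown + open
            u * v * gain]               # e.g. smile + open

# The centre of the pad blends all four keys equally.
print(control_to_weights(0.5, 0.5))  # [0.25, 0.25, 0.25, 0.25]
```

In the rig, a driver on each shape key would evaluate one of these weight terms from the control's location, so the tracked input never reaches the mesh directly.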

I promised I'd prepare a tutorial, guys. I'll try to do something tonight.

Great Tutorials. Thanks

Is it possible to download it?
Come on, pretty please, Emilio.

Haha, of course, there you have it.