Re-Face! v1.2 - Facial Mocap Retargeting Tools Addon

I like how you guys are all answering each others’ questions before I have a chance to, lol! I’ve been monitoring this thread today, but haven’t had a chance to reply until now, so I’m glad these questions are getting answered before too late in the day :slight_smile:

K Horseman’s suggested jaw setup was what I used for the sample file that comes bundled with the addon. I have plans to create a mocap-friendly facial rig generator, so that this becomes something of a one-stop solution for facial mocap rigging and retargeting.

FWIW - maybe not so much in other applications, but in Blender, getting the facial rig constrained to tracking markers and constraining both of those systems to a head controller is a little tricky, and can easily result in cyclic dependencies that break the rig. The only way I’ve found to get around this is to either a) create a separate armature for the facial rig, and constrain the facial armature AND the markers to the head controller, or b) simply copy the animation directly from the tracking markers to the facial rig bones, which in that case can be part of the body armature.

This is why I’ve included the option to copy animation rather than use constraints when retargeting. It’s also why I’m planning to create a facial rig generator :slight_smile:
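The "copy animation" option boils down to baking the marker's motion into keyframes on the bone instead of keeping a live constraint, which is what sidesteps the cyclic dependency. A minimal sketch of that idea in plain Python (hypothetical names, not the addon's actual code):

```python
def marker_to_bone_keys(marker_world, bone_rest_world):
    """Turn a marker's per-frame world positions into per-frame bone
    offsets, i.e. baked keyframes instead of a live constraint.
    Copying keys like this avoids the cycle that arises when the
    markers and the face rig are both constrained to the head."""
    rx, ry, rz = bone_rest_world
    return [(x - rx, y - ry, z - rz) for x, y, z in marker_world]
```

Each returned tuple is the bone's offset from rest for that frame, ready to be keyed.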

Just bought the app and I love it dude! I used your track & rig on a face model I have. It works great! This could be a long night. Going to track my own video and see how well this goes (have only tracked a few, gotta get back on the bicycle).

I was amazed at how well it worked. Great job!

Awesome! I’m glad you’re having good results so quickly.

News:
I’ve just finished adding a text search and replace for creating the markers / bones list. If a marker is named similarly to a bone, but the facial rig and markers use different naming conventions (such as different prefixes, like "TRK_[character name]_brow.R" and "DEF-brow.R"), it’s now possible to automatically select the correct bone by simply telling the addon to replace "TRK_[character name]_" with "DEF-" when searching for a match.

Since I plan to use a consistent naming convention when I implement a facial rig generator, this kind of search and replace will make it easier to find matching bones for markers.
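The matching logic described above is essentially a prefix substitution before the name lookup. A minimal sketch (a hypothetical helper, not the addon's actual code):

```python
def match_marker_to_bone(marker_name, bone_names, search, replace):
    """Map a tracking-marker name to a rig bone name by swapping the
    marker prefix for the bone prefix, e.g. "TRK_" -> "DEF-".
    Returns None when no bone matches the substituted name."""
    candidate = marker_name.replace(search, replace)
    return candidate if candidate in bone_names else None
```

For example, `match_marker_to_bone("TRK_brow.R", {"DEF-brow.R"}, "TRK_", "DEF-")` returns `"DEF-brow.R"`.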

That will sure come in handy, renaming bones is the bane of my existence XD. I wish Blender had a built-in prefix/suffix add/edit tool. But since it doesn’t, your update is very needed. Thanks :slight_smile:

Lol yeah I always do batch renaming in the Python console for that reason.
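The kind of batch rename meant here is just a string substitution over a collection of names; the same few lines work in Blender's Python console on objects or bones. A sketch (the names and prefixes are illustrative):

```python
def add_prefix(names, prefix):
    """Prepend a prefix to each name, skipping names that already carry it."""
    return [n if n.startswith(prefix) else prefix + n for n in names]

# The same idea in Blender's Python console, applied to selected objects:
#   for ob in bpy.context.selected_objects:
#       ob.name = ob.name.replace("Plane", "wall")
```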

Actually, the search and replace doesn’t change the name of anything, it just allows you to have the bones and markers not named exactly the same, just similarly, when auto-filling the marker / bone list :slight_smile:

I also plan to add a bone renaming utility like the one that already exists for markers ( or just make it work for bones as well).

Today I’m working on extracting facial mocap from head motion.

Awesome! Keep up the great work! This is pretty sweet. As far as workflow goes, are you naming the tracked markers as you place them? After you’ve tracked them? Does it really matter?

Nah, it doesn’t REALLY matter, but I’m usually a stickler about naming things. It bothers me when my scenes have objects with names like “Plane.002”, “Plane.003”, etc. Save yourself the headache and name them as you create them while tracking, that’s my advice :wink:

I’ve made a lot of progress toward extracting stable facial motion from mocap data with head movement, even substantial head movement (in the sample data I’m using, the performer moved their whole upper body as well as their head).

So the next version of Re-Face! will include support for facial mocap stabilization! I will also be adding the face rig generator in the near future, so this will likely be in the next version as well.
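The core of stabilization is expressing each marker relative to the head instead of the world. A deliberately simplified sketch that removes only head translation (a real solve would also fit and invert head rotation, e.g. from a few rigid reference markers); all names are hypothetical:

```python
def stabilize(marker_frames, head_frames):
    """Remove head translation from a marker's motion, frame by frame.
    marker_frames and head_frames are parallel lists of (x, y, z)
    world positions; the result is the marker's motion local to the
    head.  Head rotation is ignored here for brevity."""
    return [tuple(m - h for m, h in zip(mf, hf))
            for mf, hf in zip(marker_frames, head_frames)]
```

A marker that only moves because the head moves comes out perfectly still in the stabilized result.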

Cool! I’d think when one does a performance using facial expressions, it’d be quite unnatural to sit still and not move your head :slight_smile:

Would there be an option to not generate a face rig at all, or to generate either a game-engine-ready face rig (no “.” in the names, with several complexity options like minimal bone count, medium, and HD) or an animation face rig for offline rendering?

Awesome work! I’m excited to see where this goes, because it addresses something I asked around the community a few months back and got no answers to. I’m interested in multi camera implementations of this for getting accurate depth movement of tracking as well… If not through the kinect, through 3 or 4 source videos. Would the head have to be isolated in movement for an accurate track or do you have a way to track out rotation on the neck as well and determine source input?

Sounds impressive!

A face rig generator would be very welcome; all my skills revolve around rigging character bodies and animating full-body movement. I’ve barely dabbled with facial rigs, so this would be very needed.

Yep, the face rig generator will be a separate utility altogether; the idea will be to use the addon first to create a mocap-friendly facial rig, if your character doesn’t already have one, which you can then use to retarget your mocap data using the normal Re-Face! workflow. I’ll plan to provide a few template faces of varying complexity.

At the moment, I’m only working on stabilizing facial motion from a single source. I plan to support multiple sources, but I’m not entirely sure at the moment what the plan of action is. I have ideas, but I’m open to suggestions, since I want it to be a generally comfortable workflow :slight_smile:

Excellent. Have you considered contacting ACCAD’s motion capture department up here in Columbus? It’s a ways out, but people there might have some ideas, or be interested in experimenting with this and getting it running. I’m going to meet up with someone from over there tomorrow anyway, so I’ll see if they’ve got ideas.

I hadn’t! Frankly, I didn’t even know that existed :wink: I’m still fairly new to the area. Very interesting, please do let me know if it comes up in your conversation. :slight_smile:

I’ve been working on a video project for the last few weeks, but I’ve gotten some time today to go ahead and test out facial data stabilization in Re-Face! and it’s working pretty well! Using some free .c3d samples I found on the internet that include some drastic head movements, I’m able to extract very usable local facial motion.

I’ll attempt to make a teaser video tonight, showing my progress. Next up: face rig generator!

Looking forward to it :slight_smile:

I have a question about how it handles eye movement (if at all), and what happens when the subject blinks. Can blinks be translated? And if not, what can be done on the video recording end to help?

(edit: I’m the only one who hit the like button on your first video??? I think you should enable comments, disabling comments might be what people aren’t liking. Many people use comments to help each other troubleshoot, and share experiences with the addon.)

I haven’t actually thought much about eye tracking yet; I haven’t found any data that includes eye data (other than for eyelids, which is really easy to track). So at the moment there’s nothing specific to address eye movement in the addon. Of course, I imagine if one tracked their eyes similarly to the rest of their face, it may be as simple as moving the markers to the tip of the eye bones rather than the base, and constraining the bone with a Damped Track constraint (?). So I’ll look into that, but first I’ll need to find some data that includes eye movement :slight_smile:
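For reference, the Damped Track setup suggested above just aims the bone at the target each frame; the direction it solves for is the normalized vector from the bone head to the marker. A sketch of that geometry (plain Python, illustrative only):

```python
import math

def damped_track_direction(bone_head, target):
    """Unit vector a Damped Track constraint would aim the bone along:
    the normalized offset from the bone head to the target."""
    d = [t - b for t, b in zip(target, bone_head)]
    n = math.sqrt(sum(c * c for c in d))
    return tuple(c / n for c in d)

# In Blender, the equivalent live setup would be roughly:
#   con = pose_bone.constraints.new('DAMPED_TRACK')
#   con.target = eye_marker_empty
```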

(edit: I’m the only one who hit the like button on your first video??? I think you should enable comments, disabling comments might be what people aren’t liking. Many people use comments to help each other troubleshoot, and share experiences with the addon.)

Yeaaaahh, I suppose I should enable comments. I really have a distaste for the comment culture on YouTube in general, it can be such a cesspool of spite; that’s the main reason I had them disabled by default. I also didn’t want anybody to think that I’d be able to offer any kind of product support there. But I’ll go ahead and enable them.

As long as you leave a note in the description (or an annotation too) that tells people where to go to ask for support, then it should be fine.

I’ve found Blender-related videos to be one of the few places on YouTube where discussions are positive and helpful; I hardly see any arguments or flame wars when Blender (or an addon for it) is the topic. But as far as every other video goes, yeah, trolls galore :lol:

Sorry for the delay in posting this! After some tests with jittery facial motion capture samples, I realized it would be beneficial to spend a few days researching and implementing an f-curve smoothing tool!
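One common way to smooth jittery f-curves is a centered moving average over the keyframe values; a minimal sketch of that idea (not necessarily the filter Re-Face! uses):

```python
def smooth(values, window=3):
    """Centered moving average over a list of keyframe values.
    Near the ends, the window shrinks so every output stays an
    average of real samples."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# In Blender this would be applied to fcurve.keyframe_points[i].co.y
# for each keyframe, followed by fcurve.update().
```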

So here’s a teaser for v1.1, which will be on its way to the Blender Market soon: