Re-Face! v1.2 - Facial Mocap Retargeting Tools Addon

Looking good so far! :slight_smile:

About the plans for a face rig generator: a facial rig I like is PitchiPoy’s version of Rigify.

Have you used this modified version, and would it be compatible with Re-Face!?

" We invite you to explore our site to learn more about who we are, what we do and how we can assist you and your loved ones. You will find a wealth of valuable information here, including videos and blog posts about topics that, over the years, many of our clients have asked about at our initial meetings. You can also read testimonials from satisfied clients, learn about upcoming events and request a copy of our informative and fun newsletter.

Of course, the best way to get to know us and how we can be of assistance to you is by scheduling a consultation. We welcome the opportunity to meet with you in person to discuss your particular concerns and goals. Let’s talk soon."

probate estate planning california
estate planning attorney orange county

I’ve heard of the Pitchipoy fork but never used it - I think I will from now on! :slight_smile: I’m actually not sure how well it’d work without some minor modifications. That rig is constructed to automate secondary deformations with few controls; so if one only tracks a few markers, it’d be a good solution. But if you track a lot of points, it seems to me that the automation isn’t as necessary because you get the secondary deformations for free just by having bones constrained to the extra markers.
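To give a rough idea of what I mean by having bones constrained to the extra markers, here’s a minimal bpy sketch. The object and bone names are made up, and it assumes one empty per track already exists (e.g. created with the Clip Editor’s “Link Empty to Track”) and that the rig’s bones are named to match:

```python
import bpy

# Hypothetical names - assumes one empty per tracked marker already exists
# (e.g. via the Clip Editor's "Link Empty to Track"), and that the face rig
# has bones named after those marker empties.
rig = bpy.data.objects["FaceRig"]
marker_names = ["cheek.L", "cheek.R", "brow.L", "brow.R"]

for name in marker_names:
    pbone = rig.pose.bones.get(name)
    empty = bpy.data.objects.get(name)
    if pbone is None or empty is None:
        continue
    con = pbone.constraints.new('COPY_LOCATION')
    con.target = empty
    con.use_offset = True  # layer the marker motion on top of the rest pose
```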

I’ve used khuuyj’s Face Bone Tool with mocap and it works fairly well. It looks a little similar, so this face rig could probably work just fine with mocap as well.

But I’m still experimenting :slight_smile: I think I’ll include at least two rigs: one for only a few markers, with more automated deformations, and one for a larger set of markers.

That rig looks confusing, to say the least. Why make the rig look like the facial structure when the head of the bone (or is it the tail?) doesn’t do any deformation? It’s confusing for animators.

Some (maybe most, I’m guessing? I haven’t checked it out) of those are probably stretchy bones, in which case it would make sense to have them extend from one bone to another. :slight_smile:

I see what you mean. In that case such a rig would be incompatible with some game engines.

Re-Face! version 1.1 is available on the Blender Market!

New features include:
-Facial stabilization tools - this is the biggest update to the system (a rough sketch of the general idea follows after this list)
-F-curve smoothing operator to remove jitter
-Lots of little option additions for various tools
-Bug fixes / UI clean up
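
For anyone curious what the stabilization tools are doing conceptually, here’s a rough sketch of one common approach (an illustration with made-up clip and track names, not the addon’s actual code): pick a few tracks on rigid parts of the face, average them per frame, and subtract that head motion from every other track so only the facial deformation is left.

```python
import bpy
from mathutils import Vector

# Hypothetical clip and track names - illustration only, not the addon's code.
clip = bpy.data.movieclips["face_take_01.mov"]
tracks = clip.tracking.tracks
ref_names = ["nose_bridge", "temple.L", "temple.R"]  # markers on rigid areas

start = clip.frame_start
end = start + clip.frame_duration - 1

def ref_center(frame):
    """Average position of the rigid reference markers at a given frame."""
    found = [tracks[n].markers.find_frame(frame) for n in ref_names]
    found = [m for m in found if m is not None]
    if not found:
        return None
    return sum((Vector(m.co) for m in found), Vector((0.0, 0.0))) / len(found)

rest_center = ref_center(start)  # assumes the reference tracks exist on frame 1

for frame in range(start, end + 1):
    center = ref_center(frame)
    if center is None:
        continue
    offset = center - rest_center  # head motion relative to the first frame
    for track in tracks:
        if track.name in ref_names:
            continue
        marker = track.markers.find_frame(frame)
        if marker is not None:
            # Subtract the head motion so only facial deformation remains
            # (this edits the track data in place - sketch only).
            marker.co = Vector(marker.co) - offset
```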

Version 1.2 will include the facial rig generator. At that point, I plan to increase the “purchase” price (with all the heated debate about the Blender Market, I’ll simply call it the “requested donation” amount) to $19.95, where it will remain for good.

@ohsnapitsjoel Very nice work!
My suggestion is to try adding control values. I have some experience working with motion capture data, and I think retargeting markers one-to-one isn’t the best idea. It would be nice to have controls for each part of the face (brows, eyes, mouth and cheeks).

If you’re looking for inspiration, check out Judd Simantov’s work (his face rig), and FACS may also help you.

Good luck!!

I’m not sure I understood your post. Can you elaborate?

@ohsnapitsjoel My English isn’t perfect, but I’ll try to explain how I see the retargeting process from mocap data to facial animation, and what makes a facial rig work well for me. I think some examples can help you. In my pipeline, copying mocap data to bones isn’t enough to end up with a natural, realistic animation. The rig needs tools for more control after the retargeting process is done.

Thank you for chiming in on the discussion. :slight_smile: No worries, I don’t think I was having trouble understanding your English. :slight_smile: It was something else; I believe I understand what you’re saying, but I’m still not clear on what you’re suggesting as a solution to this problem.

Especially in the case of a track made with Blender’s tracking tools and a camera close to the actor, there’s virtually no depth data. In addition, the markers are often placed a little differently than the bones of the facial rig, or the character’s head may have completely different proportions than the real human performing the action, so there are small differences between how much the markers are moving and how much the bones are supposed to be moving. Are these the things you’re referring to?
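
(To illustrate the proportion issue for anyone following along: the simplest fix is to scale each marker’s displacement by the ratio of the character’s face size to the performer’s before it drives a bone - something like this little sketch, with made-up parameter names; it isn’t necessarily what Re-Face! does internally.)

```python
from mathutils import Vector

def scale_marker_delta(marker_co, rest_co, actor_face_width, character_face_width):
    """Scale a marker displacement measured on the performer so it fits a
    character with different facial proportions. Illustrative sketch only."""
    factor = character_face_width / actor_face_width
    return (Vector(marker_co) - Vector(rest_co)) * factor

# e.g. a character face half the width of the actor's halves the motion:
# scale_marker_delta((0.52, 0.40), (0.50, 0.40), 0.18, 0.09) -> Vector((0.01, 0.0))
```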

Or are you referring to automating secondary motion? I’ve studied facial rigging quite a bit, and I could make a robust facial rig intended for animating by hand, including automated secondary motion. But I haven’t found those techniques to be especially helpful when creating a rig for motion capture; have you found traditional facial rigging techniques to be useful for rigs designed for mocap?

I hope I haven’t been misleading with the last video I posted. The intention was only to show the workflow for stabilizing facial motion, not to present an unpolished animation as a finished product :wink:

OK, I will try to explain what I mean. I’m sorry, but I haven’t tested your addon, so I can only talk about your video.

  1. To me, the best CG facial animation comes from morphs (blend shapes), especially for the micro-expressions.
  2. A bone rig can also be really good, and my favorite example is Judd Simantov’s. He has shown, with many good arguments, that bones can be better than morphs.
    Did you try transferring the mocap data to the rig’s control sliders instead of to the bones? That could solve your problem with 2D data from a single camera.
    I have some experience with mocap data and keyframe animation using the Faceware rig, the iAnimate rig, and my own rigs in a mocap studio (Maya, MotionBuilder).
    I have studied FACS (Facial Action Coding System) a lot, and it helped me understand how micro-expressions work, which is very important for natural facial movement.
    One big problem with mocap data is editing: there are too many keys, and it would be good to have tools for editing them.
    You’ve done great work, and I don’t want to criticize, only to offer some ideas and inspiration.

If you want, I can share my experience. Send me a private message with your email address and I’ll give you more feedback.
Regards,

Hey, everyone who’s already acquired Re-Face! I would love some feedback on your experiences with it. It’s been very (actually, extremely) quiet with regard to user feedback, so I’m not sure if I’m going in the right direction with developments, or UI changes, etc. Of course, I like the changes I’m making and the features I’m adding, but if no one else does, then why spend the time on those changes / features? :slight_smile:

Currently in development are:
-Some graph editor tools to create drivers for facial shape keys very quickly (there’s a small example sketch after this list)
-The facial rig generator (I could sure use a beta tester or two for this one :))
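
In case it helps to picture what the driver tools are automating, a single shape key driver set up by hand in bpy looks roughly like this (the object, bone, and key names here are hypothetical examples):

```python
import bpy

# Hypothetical names throughout - the tools would automate this per channel;
# this is just the manual equivalent for one shape key.
mesh_obj = bpy.data.objects["FaceMesh"]
key_block = mesh_obj.data.shape_keys.key_blocks["jaw_open"]

drv = key_block.driver_add("value").driver
drv.type = 'SCRIPTED'

var = drv.variables.new()
var.name = "jaw_y"
var.type = 'TRANSFORMS'
tgt = var.targets[0]
tgt.id = bpy.data.objects["FaceRig"]  # armature holding the retargeted bones
tgt.bone_target = "jaw"               # bone following the chin marker
tgt.transform_type = 'LOC_Y'
tgt.transform_space = 'LOCAL_SPACE'

# Clamp the bone's local Y translation into the 0-1 shape key range;
# the 20.0 multiplier is an arbitrary value you'd tune per character.
drv.expression = "max(0.0, min(1.0, jaw_y * 20.0))"
```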

I got swamped with my day job and haven’t had much time lately :frowning:

By the way, can you recommend a tutorial on tracking?

Also, what’s the best way to mark my face so that the marks wash off easily? :slight_smile:

Sebastian Koenig’s “Track, Match, Blend!” has a lot of helpful tips about tracking. The video about one-point tracking is especially useful here. A quick YouTube search turned this one up: https://www.youtube.com/watch?v=Tv7E0pBt8FU. You’ll want to solve the camera motion as “Tripod”.
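
(For reference, if you end up scripting that part of the setup, the Tripod option lives on the clip’s tracking settings - the clip name here is just an example:)

```python
import bpy

clip = bpy.data.movieclips["face_take_01.mov"]   # hypothetical clip name
clip.tracking.settings.use_tripod_solver = True  # solve camera motion as "Tripod"
```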

I had been using a makeup pen for my initial tests, but I’ve been looking into DIY hemispherical markers with skin adhesives. There are some cost-effective options for DIY-ers. It doesn’t make much sense for me to invest in professional mocap markers when I’m only using mocap for personal projects :slight_smile:

Hey Joel, quick update: I just wanted to let you know I went to OSU’s ACCAD mocap lab the other day, talked to several people, and got to play with a pretty sophisticated setup over there. Unfortunately it seems they mostly work with MotionBuilder and pre-existing tools and hardware now that the industry has developed plenty of options and OSU has a seemingly bottomless budget… So no one really seemed to have much of an idea about multi-video tracking setups or averaging the feeds to remove head movement (╯°Д°)╯︵/(.□ . ). I’ve wondered about a dual or triple GoPro setup mounted to the head at front, 3/4, and side angles to remove any extraneous movement and keep stabilization minimal. I’ll let you know if I run into anyone else who might have some ideas. Otherwise, I’ve been playing with the tool, and it’s awesome! If I get some free time soon, I’ll try to post some results.

There is of course an f-curve smoothing add-on in Blender already; is your algorithm different?

The algorithm probably isn’t that different; it’s just a moving average filter. But mine offers more control over the amount of smoothing. The native smoothing tool’s effects were too drastic for me :slight_smile:
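
For the curious, a moving average over an f-curve boils down to something like this sketch (not the addon’s exact operator - the window size and pass count are what give you control over the smoothing amount):

```python
import bpy

def smooth_fcurve(fcurve, window=2, passes=1):
    """Moving-average filter over an f-curve's keyframe values.

    `window` is the half-width of the averaging window and `passes` repeats
    the filter; both give coarse control over how aggressive the smoothing is.
    Illustrative sketch only.
    """
    for _ in range(passes):
        values = [kp.co.y for kp in fcurve.keyframe_points]
        smoothed = []
        for i in range(len(values)):
            lo = max(0, i - window)
            hi = min(len(values), i + window + 1)
            smoothed.append(sum(values[lo:hi]) / (hi - lo))
        for kp, y in zip(fcurve.keyframe_points, smoothed):
            kp.co.y = y
    fcurve.update()

# Example: smooth every channel of the active object's current action.
obj = bpy.context.object
if obj and obj.animation_data and obj.animation_data.action:
    for fc in obj.animation_data.action.fcurves:
        smooth_fcurve(fc, window=2, passes=1)
```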

Quick update: I’m working on some tutorials to demonstrate Re-Face!'s facial rigging tools, shape key driver setup tools, and working with Re-Face! in general. They should be done in the next few days!

Hello all! I’ll be dropping version 1.2 in the next few days, so here are a few tutorials and teasers about using the new tools in version 1.2. I haven’t had much community feedback regarding the workflows and such, so I’m kinda flying by the seat of my pants here :). I’m very much open to community input regarding the tools and workflows, so let me know what your experiences are and how you feel I can make it more feature-complete.

Rigging With Re-Face!

Retargeting With Re-Face!

Driver Creation With Re-Face!