Maya tutorials in Blender

I am using the book Stop Staring: Facial Modeling and Animation Done Right by Jason Osipa. The theory and logic are general purpose, but the tutorials are all done in Maya. I am not familiar enough with the Blender rigging system or keyframing to be able to easily duplicate these tutorials.

Has anyone else worked through this book? Or does anyone know a good reference that will help me take Maya tutorials and use them in Blender?

There are some similarities. It’s not exact though. In general…
Maya: Blend Shapes -> Blender: Shape Keys
Maya: Set Driven Keys -> Blender: IPO Drivers
Maya: Channel Box -> Blender: ??? :frowning:

That’d be the N-key properties panel in Blender. It’s not as powerful as Maya’s (Blender also doesn’t keep a construction history like Maya does; it takes a more 3ds Max-like approach with a modifier stack), but it has the essential transform properties, and it does expand with context (e.g. when going into Sculpt mode).

I’m not sure of the other one, though “Stop Staring” is very applicable to Blender.

Thanks for the info.

There is also a question I have about the first little tutorial, the mouth animation. I have built the mouth and figured out the shape key part, but now there is a section where locators are used. I am assuming that a Blender empty is the equivalent I am looking for. But there is a section where he is attaching the locator to the mouth and using what look like constraints for translation along the X and Y axes.
Once it is in place, I should be able to move the empty and affect the shape of the mouth based on the shape keys and the constraints.
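If it helps to see that relationship written out: the whole setup just remaps the empty’s constrained position into 0..1 shape key influences. A minimal Python sketch of the idea (the shape names and the -1..1 travel range are my assumptions, not from the book):

```python
def clamp(v, lo=0.0, hi=1.0):
    """Keep a value inside [lo, hi]."""
    return max(lo, min(hi, v))

def mouth_influences(empty_x, empty_y):
    """Hypothetical driver: the empty's local X/Y position (each limited
    to -1..1 by constraints) becomes shape key influences in 0..1."""
    return {
        "wide":   clamp(empty_x),    # drag right -> wide shape fades in
        "narrow": clamp(-empty_x),   # drag left  -> narrow shape fades in
        "open":   clamp(empty_y),    # drag up    -> open shape fades in
    }
```

Moving the empty halfway right, for example, gives the "wide" key an influence of 0.5 while "narrow" stays off.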

Do I have that right? And if so, where can I find a good tutorial on how to set that up properly?

Got a chapter / page number?

Yeah, chapter 1, pages 15-19. The sections are:
Creating a Sync Tool 1: Shapes (which I believe I have figured out, with the shape keys made)
For that section I have the flattened shape as the basis shape key, and the other two are named and created like in the book. I used a circle and then added a Subsurf to get the roundness of the NURBS.
Creating a Sync Tool 2: Setup (which is really stumping me)
This is where a locator is used to animate the mouth. It has me stumped; I don’t know where to start to be most effective.


Ah yes… in Maya, when you make blendshapes (what Blender calls shape keys) you get this really nice automated box of slider bars, one for every blendshape. Basically, where Blender has you sculpting different expressions onto one “basis” head, in Maya you make lots of different heads (hidden in different layers), select all the heads (with the basis one chosen last), and when you run the “make blendshapes” script… voila, you have slider bars, correctly named, to morph your basis head into any combination of the other expressions.

…this is kind of similar to the way the Action editor in Blender (when in “shapekey” mode) shows a slider for each expression - they are the same deal.

Where you are going to hit a little snag, though, is with the very cool “expression editor” Maya has. It’s very easy to make a slider bar from objects in 3D (e.g. a cube on a line) and use the Maya expression editor to make the movement of your slider control (the cube) along the Z axis result in a proportionate amount of any given blendshape.
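In code terms that expression trick is just a linear remap with clamping. A sketch in Python (the 0..4 travel distance is an arbitrary assumption for illustration):

```python
def blendshape_influence(slider_z, travel=4.0):
    """Proportional driver: the cube at z=0 gives influence 0, at
    z=travel gives influence 1, and overshooting either end is clamped
    so dragging past the line's ends does nothing extra."""
    return max(0.0, min(1.0, slider_z / travel))
```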

I know that’s a mouthful so here it is again in plainspeak…

Those instructions are kind of step-by-step Maya specific. To learn the expression editor trick in Blender, it’s called “driven shapekeys” and can be found here:

righteous man, thanks

Hope you’re still around…

The concept of those “driven shape keys” is probably most easily understood after you see a live example. The attached file has two shapekeys; smile & frown, which are controlled by the model slider I have next to them… just drag the circle up and down.

This is typical of the kind of facial rig setup used in “Stop Staring”; basically different expressions are used additively to suit any given situation, including phoneme mouth shapes for lip syncing.
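Under the hood, that additive mixing is each key’s delta from the basis scaled by its influence and summed. A sketch with made-up 1D vertex positions (real shape keys do this per-vertex in 3D):

```python
def mix_shapes(basis, targets, influences):
    """Additive shape key mixing: result = basis + sum(w * (target - basis)).
    basis and each target are equal-length lists of vertex positions
    (1D here for simplicity); influences maps shape name -> weight 0..1."""
    result = list(basis)
    for name, weight in influences.items():
        for i, (b, t) in enumerate(zip(basis, targets[name])):
            result[i] += weight * (t - b)
    return result
```

Because each shape only contributes its own delta, a half smile and a full phoneme shape can be active at the same time without fighting each other.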

Animation can be fairly daunting, even more so when you’re following step-by-step instructions for a different application. Keep persisting though, because Blender can animate well.

Good luck.


smilefrown_ctrl.blend (222 KB)

Thanks for that file. I’m not sure I can duplicate it, as far as the smile/frown box is concerned. It does not look like I can select it to edit it, so maybe it is just an image. I see that you used the IPO editor in Object mode and not in Pose mode as the tutorial link suggested. I think I can get a handle on that. And I am assuming that I can use an armature as the object in question, or any Blender object for that matter, including empties.

Now the question I have is about the presence of only an X and Y axis in the IPO curve. Is that relative to the IPO curve window, or are they the only axes affected by the driven-shape-key method? That makes sense, I hope. Assuming that I need to rotate my model, say a man tumbling through the sky, no matter his orientation the driven shape keys are going to work, yes?

Also, I am thinking that I should be able to use that one slider to control both the Open/Close movement and the Wide/Narrow movement, by having the slider’s Loc Z drive one shape and its Loc X drive the other. Is that going to produce the same result as in the book, one slider that controls both movement types? Or am I really going to have to make two sliders and then make them children of another object, or something like that?

I know that I can experiment and find some of this out on my own, but I don’t want to learn any bad habits.

Thanks again for the help.

The control slider is not an image… I disabled mouse selection of the shapes in the Outliner so you would not accidentally move them around. Open an Outliner window and follow head => big_control => (etc.) to re-enable mouse selection (click the mouse arrows on the right side of the Outliner window).

Object mode vs Pose mode: they would have used Pose mode because they were using armature bones, which need Pose mode to move, but anything whose position can be measured can become the “driver” for other things.

The X and Y axes in the IPO window control the influence of the slider against each blend shape. Select the head in the 3D view, then choose either “smile” or “frown” on the right of the IPO window, then hit the HOME key (with the mouse over the IPO window) to zoom in for a good look. The “basis” shape is always 0, which goes up to 1 when a blend shape is fully on. So, if a blend shape is half way, it would be 0.5. The Y axis (left side) is this blend shape influence, and the X axis across the bottom reflects, in my example, the Z axis of the slider bar (how far up and down it has been moved).

So… in the example I made you, the slider bar position is “rested” at 0, but can move as high as +4 or as far down as -4. If you look at the “smile” IPO line, it ranges from Z=0 with blendshape=0/off through to Z=4 with blendshape=1/on. You see? Move the slider control from 0 up to 4 and the “smile” blendshape changes its influence from 0 to 1. On the “frown” curve, the Z position of the slider goes down to -4.

To be honest, it’s fairly tricky at times, and I had a couple of guesses as to which way was X and which was Y as I was making you this example. In Maya the “expression editor” is much easier: you can just say “blendshape = slider.posZ / 4 (but only when slider.posZ > 0)”. This is also possible in Blender (Python expressions), but Maya’s is simpler to use and can handle much better expressions.
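For what it’s worth, the whole two-curve setup above fits in a couple of lines of Python; this sketch is just the ±4 mapping from the example file restated as code:

```python
def smile_frown(slider_z):
    """Slider Z runs from -4 to +4. Positive Z fades the smile in,
    negative Z fades the frown in; the opposite shape stays at 0
    because its side of the mapping clamps to zero."""
    smile = max(0.0, min(1.0, slider_z / 4.0))
    frown = max(0.0, min(1.0, -slider_z / 4.0))
    return smile, frown
```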

Have separate sliders: one for smile/frown, another for wide/narrow mouth, another for open/close jaw (or just rig the jaw to a rotating jawbone). This way, you can mix the sliders… have it smiling with a narrow mouth as well as frowning with a narrow mouth.

The slider relationship is local, by the way, meaning you can make the head do cartwheels and it won’t throw the slider off. Use the middle mouse wheel to rotate the view for better angles to look at the head and slider.

Don’t worry about asking questions. You’re attempting something fairly complex here. Hopefully I’m not babbling (a few too many late nights).

Ok, I think I was able to fudge it. At least it is doing what I want. I thought about your explanation of using three different controls (one for open/close, one for wide/narrow, one for smile/frown), and that may be the way I go in the end, but at this point I am trying to stay as close to what Jason Osipa is trying to convey as possible. So here is what I have.

The cube is the control. It has been locked on all three axes with a constraint: Y is locked to 0, X is locked to a range, and Z is locked to a range. Moving on the X axis controls the mouth Wide/Narrow, while moving on the Z axis controls Open/Close. So moving the cube within the constrained box (X/Z axes) will open/close and widen/narrow the mouth.
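Written out as driver logic, a 2D control box like that might look like the sketch below (the ±1 travel ranges and shape names are my guesses, not taken from your file):

```python
def clamp01(v):
    """Clamp a value into the 0..1 influence range."""
    return max(0.0, min(1.0, v))

def mouth_from_cube(x, z):
    """One cube, two axes: X drives Wide/Narrow, Z drives Open/Close.
    Because the two axes are independent, any point in the box is a
    valid mix, e.g. wide AND open at (+1, +1)."""
    return {
        "wide":   clamp01(x),
        "narrow": clamp01(-x),
        "open":   clamp01(z),
        "closed": clamp01(-z),
    }
```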

Anyhow, it seems to work. The only thing I see that is strange is that when the setup is in the rest position, the mouth turns upside down.

So we will see what you think… I am sure that I can add a smile/frown control to this one and get the same effect as using three controls.

Thanks again for the help…


basicmouth_control.blend (130 KB)

Looking good.

Combining the controls into one just like you have is the way to go. For some reason I misread your explanation (tired because of a string of late nights) and thought you meant one control using the X axis for both actions, meaning you would not be able to combine them, as your control would range from X=0 narrow & closed through to X=1 wide & open. Osipa is working up to something big here (look at p.278-279, “the complex setup”) and you are definitely on track.

When I was looking at your IPO curves, I thought at first that you had four points making up the diagonal line, and was going to say this was unnecessary. But closer observation shows there are only two points; you just have them as smooth (bezier) curves, which means they have “handles”. For this simple movement, I would use the IPO window menus and follow Point >> Interpolation Mode >> Linear so these curve handles disappear and won’t later be mistaken for points, causing wonky lines if they are ever moved around.

The mouth flipping is caused because you have a blendshape (shape key) of the bottom lip moving downwards linked to the cube movement going upwards. Osipa has the relationship as “MouthShapes.OpenClosed = -MouthControl.translateY”, where the right-hand side is negated. There are a couple of ways to fix this… basically you need the diagonal line in the IPO editor to go downhill, and you will then need to change the “Limit Location” constraint to fit in with the new rules.
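The sign flip is easier to see as an expression. A sketch of Osipa’s negated relationship (the travel distance is assumed):

```python
def open_influence(control_y, travel=1.0):
    """Osipa: MouthShapes.OpenClosed = -MouthControl.translateY.
    Dragging the control DOWN (negative Y) opens the mouth, so the
    driver curve runs downhill: influence rises as Y falls."""
    return max(0.0, min(1.0, -control_y / travel))
```

With the un-negated version, resting at Y=0 while the shape key was sculpted “mouth open downwards” is exactly what produces the upside-down look.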

As the next step, I usually make a rectangle around the controller as a separate object, showing the limits it can move within (kind of like a visual window of the area). If I then parent the controller to this window object (CTRL+P), its movement becomes relative to the location, size, etc. of the parent border. This means I can resize the border and place it in a much more comfortable position for the animator to use. Oh… and the “Limit Location” constraint on the controller cube should be set to “local space”, not “world space”, for the final repositionable border; otherwise it doesn’t inherit the parent border’s positioning limits but takes them from global “real world” space.
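The local-space vs world-space difference is worth spelling out: a world-space limit nails the allowed box to the world origin, while a local-space limit makes the box travel with the parent border. A 1D sketch of the two behaviours (the ±1 limits are illustrative):

```python
def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def world_space_limit(parent_x, child_offset, lo=-1.0, hi=1.0):
    # Limit applied to the FINAL world position: moving the parent
    # border pins the child against a box fixed at the world origin.
    return clamp(parent_x + child_offset, lo, hi)

def local_space_limit(parent_x, child_offset, lo=-1.0, hi=1.0):
    # Limit applied to the child's OWN offset: the allowed box moves
    # along with the parent border, which is what a repositionable
    # slider needs.
    return parent_x + clamp(child_offset, lo, hi)
```

Move the border to x=5 and the world-space version wrongly pins the controller back at the edge of the old box, while the local-space version keeps it centred in the relocated border.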

You’ve obviously got the principle down pretty well, especially in that you have been actively translating the Maya-centric instructions to Blender as you go. :slight_smile: Huge effort on your part.


basicmouth_control02.blend (35.8 KB)