Facial expressions with Drivers (big picture)

These are some screenshots of Blender while making expressions with the drivers. If you haven’t tried them out yet, you have to! Download the .blend file I was working on here --> http://orange.blender.org/wp-content/themes/orange/images/blog/controller2.blend

example pic:

all the pics (8 in total) can be found here --> http://photobucket.com/albums/f47/maccam912/

lol that file was the bomb.

We had a hard time focusing on work when we tested that new tool,
because we got instant poses to laugh about. I had so many
tears in my eyes I couldn't even see anymore.

That addition is, in my opinion, a huge boost for Blender's animation system.

Damn that is cool! :o

Yeah, this is cool. I haven't really played with it yet, but I'd like to know something: how different is it from the Shape Key setup?
I did a facial expression test on one of my models with shape keys, but if it's a lot easier with widgets, can you explain how?

btw, thanks for the file.

Man I love you guys. Very top notch stuff.

Wow, that is really cool. Can anyone point me to how to use the drivers?

Thanks maccam.

I would also be interested if anyone knows of any tutorials on how to set that sort of thing up. Looks ridiculously cool!

OK, I found a thread on the subject, and a tutorial.

I experimented with both the script and the one compiled with 2.40, which I think is the same one. Anyway, I followed all the instructions in the tut, but when I hit FINISH the script crashes. Well, the funny thing is I find the WidgetShapes in the 3D window and it worked perfectly.
Maybe I'm doing something wrong, but I'll keep working with it…
Awesome script nonetheless…


This is really nice; it will help us understand how to use drivers. However, a tut would be a helpful complement to this great sample.

I feel I need to point out that this isn't maccam912's work.

This file was available on Orange's blog. It was done (to my knowledge) by Bassam and just shows the current features of Blender. I'm going to move this to News & Chat since it's better suited for that section.

Hehe, yes, I did this file. It's intended not just as a feature demo, but as a free posable character for people to practice animation with. I kinda flubbed publicizing that fact, but in case you're not aware, feel free to use that blend/character for whatever tickles your fancy.
His name is “ManCandy”, btw, named after the movie “The Manchurian Candidate”.
I've got several (animation) blend files sitting on my HD with this character. I was planning to release a little animation to introduce him, titled “dance of the mancandy”, but my tardiness followed by Project Orange prevented me from finishing it.
Oh yeah, and the drivers are awesome. Right now we're using an “on face” setup where the controllers are bones that sit right on the face. It's more manual in a way, but much more immediate and fun to control.
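The core of such a setup can be sketched in plain Python. This is only an illustration of the mapping a driver performs, not Blender's actual API; the bone ranges here are made-up values:

```python
def drive_shape_key(bone_x, bone_min=0.0, bone_max=1.0):
    """Map a controller bone's local X location into a shape key value.

    The bone's position is remapped from [bone_min, bone_max] and
    clamped into the shape key's usual 0.0-1.0 influence range.
    """
    if bone_max == bone_min:
        return 0.0  # degenerate range: no influence
    t = (bone_x - bone_min) / (bone_max - bone_min)
    return max(0.0, min(1.0, t))
```

So sliding the bone halfway through its range gives the shape half influence, and moving it past either end just pins the shape at 0 or 1, which is what makes on-face bone controllers feel so direct.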

One thing… I was thinking of buying that book on expressions and lip-sync called Stop Staring. From reviews, it sounds like the book eventually steers you toward 40 blend shapes with a control rig (in Maya). In the FAQ, he answers a question about why to use “those circles and squares” instead of the “blendshape editor”, and he responds that “With my setups you can control, in some cases, up to 6 shapes with one control, almost like being able to dynamically control 6 sliders at a time in the blendshape editor”.

For anyone that has the book, am I correct in thinking that this new feature of blender is similar to the rig in the book? That you could theoretically entirely duplicate his method using Blender instead of Maya?

I was thinking of buying the book since it is so highly recommended, and it'd certainly be cool if it was totally compatible with Blender (obviously with extra time taken to translate the Maya methods over to Blender).
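The "several shapes from one control" idea in that quote can be sketched with a toy function: one 2D controller position fans out into several shape weights at once. The shape names and axis assignments below are invented for illustration and aren't from the book:

```python
def control_to_weights(x, y):
    """Fan one 2D controller (x, y each in [-1, 1]) out to four shape weights.

    Opposite shapes share an axis, so one control drives all four:
    push up for smile, down for frown, right for wide, left for narrow.
    """
    clamp = lambda v: max(0.0, min(1.0, v))
    return {
        "smile":  clamp(y),
        "frown":  clamp(-y),
        "wide":   clamp(x),
        "narrow": clamp(-x),
    }
```

Because each axis direction feeds a different shape, dragging the control diagonally blends two shapes at once, which is the appeal over moving individual sliders.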


Does anyone know of any projects to link the drivers to a MIDI controller? I am pretty sure I heard/saw something about other 3D apps (or maybe it's a plugin for one of those apps… or even a separate app) being able to set up drivers and then control their values with MIDI sliders. That'd be very cool if doable. I wish my coding skills were up to that kind of snuff, but I haven't written a line of code in two years, and even then I never coded anything particularly sophisticated.

MIDI control in Blender seems to be one of these features that has a hard time gaining momentum. I am very interested, but my Python programming skill is zero on a scale of 10 (read “it sucks”). I have studied this one lately:

With almost no documentation to help, I am lost.

Your idea would be a good application.

I haven't looked at that script because, honestly, my Blender/Python knowledge is nowhere near sufficient. From the description it sounds like it controls animation through a sound or MIDI file (I may have misunderstood). What I was thinking of was a MIDI controller with, say, a number of sliders, where the input from those sliders controls the value of a driver. So if you move a real-world physical slider on the controller you could, say, make an eyebrow move up or down. That would make animating a bit more physical and immediate, I would think.
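The mapping itself would be the easy part. This is just an illustrative sketch (actually reading the hardware's MIDI stream would need a MIDI library, which is not shown here): a 7-bit MIDI control change value of 0-127 is rescaled into a driver's range.

```python
def midi_cc_to_driver(cc_value, lo=0.0, hi=1.0):
    """Rescale a 7-bit MIDI CC value (0-127) into a driver range [lo, hi].

    Out-of-range input is clamped first, since MIDI data bytes
    are only ever 0-127 anyway.
    """
    cc = max(0, min(127, cc_value))
    return lo + (cc / 127.0) * (hi - lo)
```

With one of these per hardware slider, each slider would map directly onto one driver value, e.g. an eyebrow shape.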

Shawn Fumo, yes, this technique now possible in Blender is very similar to the approach of Jason Osipa, who wrote the Stop Staring book. It's probably even easier and more elegant in Blender now; the coders have done a very good job.

I made a blend file of Suzanne with shape keys for all the phonemes, plus drivers: http://blender.sixmonkeys.geek.nz/albums/album33/suzanne_speaksexpressions.blend