Ludwig: Fully Rigged Character + Walk Cycle Tutorial

If I can ask for one small thing:

A way to make his lips whistle, or make it look like he is blowing. Like if you were saying the word “too”.

Sketchy, LGM… I am using Ludwig to learn the basics of animation. Thanks for making him so cool. I ordered a book called “Animator’s Survival Guide”; I am waiting for the book store to get it in. I already made a crappy animation with Ludwig. Made me laugh, then I deleted it. Hehe. I hope to be able to get some simple background characters rigged for the coffee shop.

I have been using Ludwig to learn about animation and rigging, and I made an animation, but it isn’t rendered yet. It’s “When Ludwigs Go Bad”: he has a knife and attacks the cameraman, lol.

Hello,

Great model! Thank you!

For lipsynchronisation: http://blenderlipsynchro.blogspot.com/

The blog is not out of date; I am simply waiting for 2.42 to deliver my new version of the tool!

Cheers,

Dienben

Well, I tried this and it didn’t work :(. I think the next best way of automating lipsync would be to extend the Papagayo script so it can assemble different poses (stored as actions) in the NLA editor. Dienben, what do you think of this idea? Is it possible?
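For anyone curious, the first half of that idea is just text processing. Here is a minimal sketch in plain Python, assuming Papagayo’s Moho-style .dat export (a header line, then “frame phoneme” pairs); the actual NLA assembly would need Blender’s Python API on top of this.

```python
# Hedged sketch: parse a Papagayo/Moho-style .dat export and collapse
# consecutive frames into (phoneme, start, end) segments that a Blender
# script could then map onto pose actions in the NLA editor.
# Assumed format: one header line, then "frame phoneme" per line.

def parse_papagayo(lines):
    """Return a list of (frame, phoneme) tuples from a Moho-style export."""
    cues = []
    for line in lines[1:]:  # skip the "MohoSwitch1" header line
        parts = line.split()
        if len(parts) == 2:
            cues.append((int(parts[0]), parts[1]))
    return cues

def to_segments(cues):
    """Collapse per-frame cues into (phoneme, start_frame, end_frame) spans."""
    segments = []
    for frame, phoneme in cues:
        if segments and segments[-1][0] == phoneme:
            # Same phoneme continues: extend the current segment.
            segments[-1] = (phoneme, segments[-1][1], frame)
        else:
            segments.append((phoneme, frame, frame))
    return segments

if __name__ == "__main__":
    export = ["MohoSwitch1", "1 rest", "5 MBP", "6 MBP", "9 AI", "12 rest"]
    print(to_segments(parse_papagayo(export)))
```

Each resulting segment would then become one NLA strip referencing the action named after the phoneme.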

BrianH: Thanks for the feedback. I see we share interests in more areas than just fur :D. Regarding the floor constraint, I agree with LGM. It is not necessary, and to be honest I don’t really like using it. I know a lot of people do, so I’m glad that Blender has it, but you probably won’t see it on one of my rigs (of course, as LGM will tell you, I’m prone to change my mind about features very often).

The toe suggestion is a good one, though. Actually, I didn’t think that kind of setup was possible. But I thought about it, and it actually requires only one constraint to be added per foot. I’m going to play with this for a while and see if I actually like it better, but I think that the next version will most likely have auto heel lifting added to it. But like LGM said, you already have all of the tools you need; it just requires animating one extra control.

LGM: Agree with you on all points (I should hope by now that we agree on all functionality issues ;)). Do continue work on your walk-cycle tutorial! I think that would help a lot of people get started. By the way, check out the new topic for the animation challenge. I think you’ll like it.

Enriq766: You can already make an “oo” shape. Look at the control named “MouthPucker”. Unfortunately, to do whistling you need to be able to inflate and deflate the cheeks. You are really going to enjoy the Animator’s Survival Guide. It is possibly the best book ever. :smiley:

anthony: Great! Can’t wait to see the animation.

dienben: Great script! Thanks for making papagayo a viable tool for blender. I hope that a solution can be found to make your script work with bones as well as shapes. Shapes alone can be limiting.

I think this .blend should be a sticky post on the forum like your static fur library. I also think BrianH’s fur should be on that previous thread.

I want to thank you, sketchy, as well.
Already had loads of fun playing with the fellah. ^^

Though it’s kinda OT: does anyone know which hardware affects the speed of playback/realtime feedback the most in Blender? My graphics card got toasted a while ago and I had to fall back to an old ATI. (>_<’)
I noticed an annoying drop in responsiveness while modeling.
Would this also cause the slowness in animation when the mesh layer is visible (no subdivision)? Dunno… deforming and stuff sounds more like a CPU job… hence my asking.

anyway, again nice job on the rig. :slight_smile:

Hello,

I think the next best way of automating lipsync would be to extend the Papagayo script so it can assemble different poses (stored as actions) in the NLA editor. Dienben, what do you think of this idea? Is it possible?

I think it’s a very good idea, and I will try to do it!

The next release of BlenderLipSynchro will include support for custom phonemes. Basically, you could write your own phoneme set and map it to the Papagayo export (if you use the same phoneme set in Papagayo). I will experiment with your idea too.
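To illustrate the custom phoneme-set idea, here is a minimal plain-Python sketch. The mapping table is invented for illustration; it is not BlenderLipSynchro’s actual set.

```python
# Hedged sketch of a "custom phoneme set": remap the phonemes coming out
# of Papagayo onto a user-defined set before keying shapes or bones.
# CUSTOM_MAP below is a made-up example table, not a real default.

CUSTOM_MAP = {
    "AI": "open",
    "E": "wide",
    "O": "round",
    "U": "pucker",
    "MBP": "closed",
    "rest": "rest",
}

def remap(cues, table, fallback="rest"):
    """Translate (frame, phoneme) cues; unknown phonemes fall back safely."""
    return [(frame, table.get(ph, fallback)) for frame, ph in cues]

# Unknown phonemes (here "XYZ") fall back to "rest" instead of erroring.
print(remap([(1, "MBP"), (4, "AI"), (7, "XYZ")], CUSTOM_MAP))
```

The fallback keeps the import from failing when the two phoneme sets don’t match exactly, which seems to be the main risk of letting users define their own.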

Does anyone know if the Orange team used the script to do the lipsync for Elephants Dream?

Another question and idea: Papagayo is open source and was written in Python. If we need more functionality, we could perhaps modify it?

To finish, just a little word about automation. I think the purpose of a script like BlenderLipSynchro is to help the animator. Once all the phonemes are imported, the animator must still modify the animation to give it life.

Again, thanks for your support of BlenderLipSynchro, and congratulations on your character!

Cheers,

Dienben

Sketchy,

I haven’t gotten to the lip-sync thing yet … off on another tangent :wink:

Is there something in this rig that would prevent it from executing actions in the game engine?

I just tried it, and nothing happens when I try to trigger an action.

As a test, I created a separate armature in the same file/scene (just two bones), and its action does work.

Mike

:o Wow… I wonder why it isn’t a sticky?! Or is it, and I just didn’t notice? %| Oh yeah, and also: sketchy, thanks a lot! :wink:

halley: :smiley: actually I think that my fur thread is way overdue for becoming unstickied. But until then I should add a link to Brian’s work in that thread.

the_nr: Glad you are enjoying the rig. As for your slow playback problems, it could be either the graphics card or the processor, or a combination of the two. But since you noticed the slowdown when you put in the ATI card, the graphics card is probably to blame. Have you seen the low-poly Ludwig maquette on layer 3? Maybe you can get realtime playback with it.

dienben: Sounds great. Let me know if you need any help with testing or if you need any changes made to the rig to accommodate your script.

mstram: You seem to have a knack for finding things that my rig won’t do :D. So, a couple of problems here: First of all, constraints don’t work in the game engine. This is not a big deal, because you can bake them. I haven’t tried it, but in theory you should be able to animate and bake body actions for the game engine. The face is a different story. It uses IPO drivers rather than constraints. I don’t know if these work in the game engine, or if they are bakeable. My gut instinct tells me “probably not”.
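For anyone unfamiliar with the term, “baking” just means sampling the evaluated result once per frame and storing it as plain keyframes, so nothing needs to evaluate constraints or drivers at playback time. A toy sketch in plain Python; the `evaluate` callback here stands in for Blender’s constraint/driver evaluation, which is an assumption for illustration only.

```python
# Hedged sketch of baking: sample a constrained/driven channel once per
# frame and store plain {frame: value} keyframes. The game engine (or any
# player) can then replay the values without the constraint system.

def bake(evaluate, frame_start, frame_end):
    """Return {frame: value} sampled from an evaluated channel."""
    return {f: evaluate(f) for f in range(frame_start, frame_end + 1)}

# Toy driven channel: a stand-in "driver" whose value is twice the frame
# number. A real bake would query the posed bone instead.
def toy_driver(frame):
    return frame * 2

baked = bake(toy_driver, 1, 5)
print(baked)  # plain keyframes; no driver needed at playback time
```

Whether IPO drivers on shapes can be sampled this way in the 2.4x game engine is exactly the open question above; the sketch only shows the general technique.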

Felix_Kütt: Thanks! I don’t really think this is sticky material. Hopefully blenderartists.org will have a place to collect resources such as this.

How’s he skinned, btw? The body mesh has no weight paints or envelopes on it. There is an underlying structure that has some weight painting, but it’s selective, and not in the finger or toe areas.

Actually the body does use envelopes. You can’t see them because the skeleton is in “stick” drawing mode by default. The face uses weight painting.

Looks like he’s using envelopes.
The rig is really nice, sketchy. I played with it just now, and I’m amazed at how many different solutions there are to what I normally do (and how well it works) :slight_smile: I’ve never done a rig so cartoony; really fun stuff.
Thanks for sharing :slight_smile:,
I really like how you solved the “stretch” IK scaling in all directions… of course, it should be fixed in the constraint, but the workaround is quite fun :slight_smile: I never used the stretch IK feature because of the uniform scaling, but this is a nice tip. Using IK (with rot) nicely solves twist propagation in the spine too…
Someday, when I feel like animating again, I’ve got to do something with this rig.

Especially if you nag him about it a lot. :wink:

I finished the walk tutorial, and I gave it to Enriq to try out. If he can understand it . . . no wait, I already made that joke to him. If he can understand it then I’ll post it. Do you want it here so that all ludwig’s resources are in one spot?

LGM

Doh! Why didn’t I think of that? >< Works like a charm, thx ^^

Btw, if you drag the eye behind the head, it creates a weird zombie eye effect. The eyes twitch for some reason, and are all white. How did you rig the eyes? I noticed the fingers are action keyed.

They get driven by the cube in the middle of the face, but also by the eye target. There is an armature for the face on layer 11, btw.

Awesome work, sketchy.

I quote you in agreement. The stretch option for IK is nearly useless in its present state. I think I read somewhere that there will be a Copy Size constraint in the next version. I think that will also offer some interesting workarounds. But like you said, the best solution would be for IK stretch to really stretch and not scale.

I’m glad you like the rig. I learned all of the driver stuff from looking at your mancandy rig. Also, at one point Ludwig’s eyelid setup was identical to Mancandy’s. I eventually changed it so that LGM would quit his whining about not having an aim target for the eyes ;).

LGM: You can post it here, but I also want to put it on the official Ludwig webpage.

womball: The eyes are pretty complex. They use a tracking constraint, several copy rotations, and a couple of IPO drivers. You don’t really need this much complexity, but I wanted to be able to animate the eyes using the face sliders, and LGM wanted to use a tracking target. So I compromised. You can actually use both methods at the same time, though I don’t recommend it.
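As a rough illustration of what the tracking half of that setup computes for each eye, here is the generic aim math in plain Python. This is not Ludwig’s actual driver expressions, just the idea behind a track-to constraint.

```python
# Hedged sketch: compute the yaw and pitch needed for an eye at `eye` to
# aim at `target`. Blender's Track To constraint does this (and more)
# internally; this only shows the underlying trigonometry for one eye.

import math

def aim_angles(eye, target):
    """Return (yaw, pitch) in degrees for the vector from eye to target."""
    dx, dy, dz = (t - e for t, e in zip(target, eye))
    yaw = math.degrees(math.atan2(dx, dy))       # rotation about the Z axis
    dist = math.hypot(dx, dy)                    # horizontal distance
    pitch = math.degrees(math.atan2(dz, dist))   # rotation about the X axis
    return yaw, pitch

# Eye at the origin, target straight ahead (+Y) and slightly up (+Z):
print(aim_angles((0.0, 0.0, 0.0), (0.0, 1.0, 1.0)))
```

The slider-driven method bypasses this entirely by keying the rotations directly, which is why the two methods can fight each other if used at the same time.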

Gabio: Oops! I missed your post. I guess you wrote it while I was replying. Anyway, thanks! You are of course correct in your analysis of how the face works.

Keep in mind, sketchy, I’m in school for animation. I do like complex things, if they make animation simpler. My eye setup is getting kind of old; it’s a real pain to append separate objects into a scene rather than one mesh. Also, using weight paints on the eyes can cause the object centers to act up. I was fighting this yesterday, but I managed to get past it. I will have to check out layer 11.