Animated normal maps for face wrinkles?

Don’t know much about this and was wondering if it’s possible with the BGE. Anybody want to take a try?

Here’s my quickly done feeble attempt using animated transparency:

Edit: Btw the head model in my .blend is from makehuman, which is an awesome program.


Wrinkles.blend (1.4 MB)

That already looks pretty nifty,

but I think that a slick animation of faces baked out from the system into an atlas could be faster.

Though each head would need an atlas.

Bpr - How would you make it so the animation plays when he lifts just one eyebrow? I could see doing it all with logic and transparency animations, but that would be a lot of logic. It would be nice to have it happen automatically after a shape key transformation, so it would know which area of the face to stop the animation at, but I guess that would take a node setup or something else I don’t understand.

Anybody else have a simple solution to this?

OK, so one thought: you would need a node that made a spot on the face that got blacker the higher a value was, and use that for blending maps. But then you would need 6+ of these property-based mixing nodes.

I don’t know nodes that well.

I think ace dragon or carlo may be your best bet.

I don’t really understand what is happening in the example blend, because the wrinkle texture isn’t a normal map, is it? Anyway, it’s perfectly possible with actions on normal maps. It would just mean you need a lot of texture channels to be able to slide through facial expressions.

Look at the logic bricks: I have a sensor on the head telling the forehead texture to run its action actuator at the same time the head runs its own. Both objects were once one object; I made the shape keys on that object and then split off the part right by the forehead, so that I could add another animation to it, namely the transparency one.

So how would I get the shape keys to activate the blending of the normal maps at the right moments in logic or python?
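One way to wire this up in Python (a rough sketch, not tested in-engine): read the frame of the shape-key action the head is playing and map it to the alpha of the wrinkle mesh’s object color. The object name, animation layer, and frame range below are assumptions; `getActionFrame()` and `color` are standard `KX_GameObject` attributes in BGE 2.7x.

```python
# Sketch: drive wrinkle-overlay alpha from the frame of a playing shape action.
def wrinkle_alpha(frame, start=0.0, end=20.0):
    """Map an action frame to a 0..1 alpha (0 = no wrinkles, 1 = full wrinkles)."""
    t = (frame - start) / (end - start)
    return max(0.0, min(1.0, t))

# Inside a BGE Python controller (hypothetical names, untested):
# import bge
# own = bge.logic.getCurrentController().owner   # the head playing the shape action
# forehead = own.scene.objects["Forehead"]       # the separate wrinkle-overlay mesh
# f = own.getActionFrame(0)                      # frame of the shape action on layer 0
# forehead.color = [1.0, 1.0, 1.0, wrinkle_alpha(f)]
```

This keeps the logic-brick count down to one always-on controller instead of one transparency action per expression.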

I would just like to know if what’s in the video can be replicated in blender, it doesn’t matter how it’s done, all that matters is that I get the same effect, hopefully on the cheap.

I’m sorry for not being so clear. What I meant was playing a material action. But unfortunately this doesn’t seem to work in the Game Engine. I always assumed it would, but now that I’ve tried playing a material action, it doesn’t seem to update. I don’t know if this is a bug or if it’s simply not implemented yet. Either way, I think material actions should be redesigned, because you can’t even give proper names to them…


MaterialAction_bug.blend (84.6 KB)

Yeah, I don’t think those ever worked for GLSL. I think Moguri said something about fixing it after he takes out multitexture.

In that case the answer is that you can’t. At least, not in a proper way. You could use animated textures (UV scroll), but it would be insane to go for that approach.

OK, so you can’t animate the normal levels themselves, but you can animate the transparency of the normals on an object using object color, the same way I animated the transparency of the diffuse in the first .blend. The normals show up fine, but the material the normal texture is on has to match the head’s material, so it’s not perfect.


Wrinkles.blend (1.42 MB)

If you want to animate the texture alpha transparency, you can take a look at this thread, but I don’t know if it is what you want. Another interesting thing:

@ Geometricity: This could be a solution, but it’s limited. It would mean using a separate mesh for every single expression. I think that’s a bit tedious.

Raco - Yes and the logic bricks would add up but I suppose I could get it to something manageable. Thanks for helping.

Youle, I’m not really a code guy, just a simple artist. But thank you for sharing, maybe someone else can find a solution with it.

I will try to update this thread when I make some progress. If anyone else has something to add, please do.

That looks like a nice feature.

Why not use nodes? Use nodes with some value for the intensity of the normal map. Nodes can’t get access to game properties, but they can get information from lamps and other things. This way you can keep the head as a single object.

What I’m thinking is to use nodes on the head, set up to use the intensity of a lamp as the intensity of the normal map, then use Python to change the lamp’s intensity depending on the frame of the action.

All the lamps would have their diffuse and specular options disabled, so they wouldn’t cost any performance.

EDIT: from what I just saw, the Lamp Data node can’t get a lamp’s intensity, but it can get the color, so you can use that as the normal map intensity and animate the lamp’s color from black to white.
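A minimal sketch of that lamp trick, assuming a lamp named "WrinkleLamp" (diffuse/specular disabled) whose color a Lamp Data node reads in the head material. All names and the frame range are made up, and the in-engine part is untested:

```python
# Sketch: animate a "driver" lamp from black (no wrinkles) to white (full wrinkles).
def lamp_gray(frame, start=0.0, end=20.0):
    """Return a grayscale lamp color for a given action frame."""
    v = max(0.0, min(1.0, (frame - start) / (end - start)))
    return [v, v, v]

# In a BGE Python controller (hypothetical names):
# import bge
# scene = bge.logic.getCurrentScene()
# head = scene.objects["Head"]
# lamp = scene.objects["WrinkleLamp"]   # read by a Lamp Data node in the material
# lamp.color = lamp_gray(head.getActionFrame(0))
```

The lamp only acts as a data channel into the node tree, which is why disabling its diffuse and specular contributions matters.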

Have a look here:

Thanks, iPLEOMAX, this makes it possible to fade in/out up to three normal maps per mesh, using the Separate RGB Converter Node.
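With the Separate RGB approach, the three fade weights can be packed into the R, G, and B channels of a single color and split apart in the node tree. A sketch of the packing side; the channel-to-map assignment here is an assumption:

```python
# Sketch: pack three 0..1 wrinkle weights into one RGBA color.
# In the node tree: feed the color into a Separate RGB node and use each
# channel as the mix factor for one normal map.
def pack_wrinkle_channels(brow, squint, frown):
    """R drives e.g. the brow map, G the squint map, B the frown map."""
    clamp = lambda x: max(0.0, min(1.0, x))
    return [clamp(brow), clamp(squint), clamp(frown), 1.0]

# In-engine (hypothetical): own.color = pack_wrinkle_channels(0.8, 0.0, 0.2)
```

This is what limits the trick to three maps per mesh: one map per color channel.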

You can make a whole normal map (face wrinkles, clothes wrinkles on joints and so on) and mix it in with Dynamic Paint; this works in Cycles and EEVEE. Then make the painting object approach the region where you want the wrinkles to show up.

The only problem I can think of is if the arm fold comes close to the face, for example; maybe it works if the painting mesh stays “inside” the mesh and paints as it comes closer to the surface.

Good luck!

Now that we are in EEVEE, we can feed shader uniforms via Python very easily.

We can use the distance between two bones (etc.) to drive anything in a shader graph.

Before, we only really had object color. I think Python now gives you hooks into the nodes. I will get actual code and post in a minute; my daughter is home sick on my PC.
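In the meantime, a rough sketch of the bone-distance idea as it might look in UPBGE/EEVEE. The material, node, and bone names are all made up, and I haven’t verified this exact setup in-engine; the node access uses the standard `bpy` `node_tree` API:

```python
# Sketch: map a bone-to-bone distance to a 0..1 shader weight.
def distance_to_weight(dist, near=0.1, far=0.5):
    """1.0 when the bones are at `near` or closer, 0.0 at `far` or beyond."""
    t = (far - dist) / (far - near)
    return max(0.0, min(1.0, t))

# In a UPBGE script (hypothetical names, untested):
# import bpy
# a = armature.channels["brow.L"].pose_head
# b = armature.channels["brow.R"].pose_head
# node = bpy.data.materials["Head"].node_tree.nodes["WrinkleDrive"]  # a Value node
# node.outputs[0].default_value = distance_to_weight((a - b).length)
```

The Value node’s output then feeds the strength of the wrinkle normal map mix, so no extra mesh, lamp, or object-color channel is needed.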


Oof. I’m sure OP has either solved this or long given up by now.