I have a character that I have created a walk cycle for. I want to make it so that when the walk cycle plays, a mesh animation follows along with it. An example of this would be having the character's hair or a piece of clothing move in response to the armature's "Action" walk cycle. I made my mesh an Actor and linked a keyboard sensor to an IPO actuator, but my mesh animation doesn't play. What am I doing wrong? All input is appreciated.
It is possible to do this with Python, since Python lets you move each vertex individually. Ideally you would want a Python script that applies morph targets to a mesh, possibly using the same .morph file format that MakeHuman uses.
I was thinking of writing a script to do this when my own project needed it, but as far as I know there aren’t any scripts for doing it at the moment.
Morph targets aren’t all that complex though; maybe someone else could have a stab at it…
Basically you need to store the XYZ coordinates of the vertices in a file using a non-realtime script. Then you move the mesh and store the coordinates again using the same script, but in another file.
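The dump step could look something like this sketch. Note that `dump_vertices` and the plain coordinate list are my own illustration; in a real non-realtime script you would pull the vertex list from the mesh through Blender's Python API instead.

```python
# Sketch: write one "x y z" line per vertex, in index order, so the two
# dumps (rest pose and deformed pose) can be compared line by line later.
# The vertex list here is hard-coded for illustration; in Blender you
# would fetch it from the mesh via the Python API.

def dump_vertices(vertices, path):
    """Store the XYZ coordinates of each vertex, one per line."""
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write("%.3f %.3f %.3f\n" % (x, y, z))

# Example with made-up coordinates standing in for the real mesh:
base = [(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)]
dump_vertices(base, "base.txt")
```

Run it once on the rest mesh, move the mesh, then run it again writing to a second file.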
Next you need another non-realtime script to compare the two files and create the .morph file, which lists the vertices that changed and gives their XYZ displacements.
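The comparison step might look like the sketch below. The output layout (vertex index followed by dx dy dz) is my assumption for illustration, not necessarily the exact MakeHuman .morph format.

```python
# Sketch: diff two coordinate dumps and write only the vertices that
# actually moved, with their XYZ displacements. The "index dx dy dz"
# layout is an assumed format, not the verified MakeHuman one.

def load_vertices(path):
    """Read 'x y z' lines back into a list of coordinate tuples."""
    with open(path) as f:
        return [tuple(float(v) for v in line.split()) for line in f]

def make_morph(base_path, target_path, morph_path, eps=1e-6):
    base = load_vertices(base_path)
    target = load_vertices(target_path)
    with open(morph_path, "w") as f:
        for i, (b, t) in enumerate(zip(base, target)):
            d = tuple(tv - bv for bv, tv in zip(b, t))
            if any(abs(c) > eps for c in d):
                f.write("%d %.3f %.3f %.3f\n" % ((i,) + d))

# Tiny example: vertex 0 is unchanged, vertex 1 moved by (20, 10, 50).
open("base.txt", "w").write("0.000 0.000 0.000\n1.000 1.000 1.000\n")
open("moved.txt", "w").write("0.000 0.000 0.000\n21.000 11.000 51.000\n")
make_morph("base.txt", "moved.txt", "out.morph")
```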
Now you need the realtime script. Each time it is called, it will move the vertices a certain percentage of the required displacement, according to how fast you want the displacement to occur. For example if the complete displacement for one vertex is [20.000, 10.000, 50.000], and you want the transformation to happen in 10 frames, then each frame the script will move the vertex [2.000, 1.000, 5.000].
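The per-frame maths could be sketched like this. `step_morph` just updates a plain list so the arithmetic is easy to check; in the actual realtime script you would write the new positions back through the game engine's mesh API each frame instead.

```python
# Sketch: each call moves every listed vertex 1/frames of its total
# displacement, matching the worked example above. Writing the result
# back to the game-engine mesh is left out; a plain list stands in.

def step_morph(vertices, displacements, frames):
    """Advance each displaced vertex by one frame's worth of movement."""
    for index, (dx, dy, dz) in displacements:
        x, y, z = vertices[index]
        vertices[index] = (x + dx / frames, y + dy / frames, z + dz / frames)

verts = [(0.0, 0.0, 0.0)]
disp = [(0, (20.0, 10.0, 50.0))]  # total displacement from the .morph file
step_morph(verts, disp, 10)       # one frame of a 10-frame transition
# after one frame the vertex has moved (2.0, 1.0, 5.0)
```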
I hope that helps. It’s quite a task, but it’s doable. :-?