Hi everyone,
I have been experimenting with various approaches to using the GE for Machinima for a while now. Ultimately what I would like to develop is a reasonably straightforward “plug and play” approach to doing Machinima in the GE. One of my ideas is to save physics as IPO data so that any “animation” done in the GE in real time can be tweaked frame by frame in the NLA and rendered out at a better resolution.
What I would like to be able to do is have an action triggered in the GE correspond to a specific shapekey or “keypose” in the NLA when the physics from a game is saved as IPO data. For example, triggering an armature bone to open a character’s mouth in GE would trigger a specific shapekey in the NLA. Or a simple texture in the GE is replaced with a much more complex one for animation.
From what I can tell all of this can be done manually fairly easily, but it’s a pretty tedious process. I’d like to figure out how to automate it. Is there a fairly straightforward way to do something like this? Or will it take some fancy Python scripting?
Any thoughts or suggestions would be most appreciated!
Have you had a look to see if IPO Drivers could help out?
I think the “Bake GE to IPO” currently in Blender only records Loc and Rot, so I’m not sure it can record GE triggers. It’d be cool if it could though. It’d be like interactive recording of the action; then fix the lighting, materials and cameras in post!
Hey - Isn’t that how they did Polar Express??
Just another thought before I hit submit: How about you link armature drivers to an object location then control the object in real time to manipulate the armature. Afterwards, move the object to an invisible layer and Bob’s your auntie you don’t invite to parties any more.
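Conceptually the driver side of that idea is just a mapping from the control object’s location to an influence value. Here’s a plain-Python sketch of it (not actual Blender driver code; the function name and the 0–2 unit range are invented for illustration):

```python
# Sketch: a driver-style mapping from a control object's Z location
# to a shape key influence in [0, 1]. In Blender this would be an
# IPO driver on the shape key, reading the object you puppeteer in
# real time; the object then gets parked on an invisible layer.

def drive_shape_key(ctrl_z, z_min=0.0, z_max=2.0):
    """Map the controller's Z position onto a 0-1 shape key influence."""
    t = (ctrl_z - z_min) / (z_max - z_min)
    return max(0.0, min(1.0, t))  # clamp to the key's valid range
```

So moving the control object from the bottom to the top of its (hypothetical) 2-unit travel would sweep the shape key from fully off to fully on.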
Yup, the idea is to do something very similar to Polar Express except you “puppeteer” the character using joysticks or some other kind of input device rather than having to do motion capture which isn’t practical for most people.
Once I get this problem solved, my next step is to work out colour tracking for controlling characters’ heads (there are demos of people doing that with Unreal Tournament 2004 here if anyone is interested), but for now I am mainly focused on figuring out the IPO baking.
I hadn’t thought of IPO drivers…so basically you are suggesting having shape keys driven by armature bones in the GE? I didn’t realize that was possible, but it sounds like it would do the trick. Thanks!
Not sure how possible it is - May need to follow the yellow-brick road to Python City!
Does anyone else know if you can control shape keys in the Game Engine?
After experimenting a lot today, I am thinking that for now it really is better to focus on using the Game Engine to control a style of animation that is easy to do in real-time. I don’t like the look of most Machinima, but I am thinking something like Pocoyo would be fairly easy to do - http://www.youtube.com/watch?v=1XqqXOA4irI
If you watch the characters in that clip you’ll notice that they are animated basically using one animation cycle after the other. I don’t see why that can’t be done in real-time with the different animation cycles triggered on the fly by a puppeteer.
I hope to have some tests for all of this up in the next day or two.
It can be done easily enough by placing all the cycles in an IPO and playing the appropriate range of frames at each trigger.
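The lookup itself is trivial. Here’s a plain-Python sketch (not BGE actuator code; the cycle names and frame ranges are made-up examples, not from any actual file):

```python
# Sketch: all the animation cycles live back to back in one long IPO.
# Each puppeteering trigger selects a frame range and the playback
# loops within it until the next trigger fires.

CYCLES = {
    "idle": (1, 40),    # hypothetical frame ranges
    "walk": (41, 80),
    "wave": (81, 120),
}

def frames_for(trigger, ticks):
    """Return the frame to display `ticks` frames after `trigger`
    fired, looping within that cycle's range."""
    start, end = CYCLES[trigger]
    length = end - start + 1
    return start + (ticks % length)
```

In the GE you’d get the same effect with an Action/IPO actuator set to the right start and end frames per trigger.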
Recording the sucker: Now you’re asking something!
Almost better to make a rough soundtrack for timing, then use the action editor to position actions manually.
Unfortunately, in all things, there’s easy and there’s versatile. There ain’t both!
By “recording” do you mean recording the game physics to IPO? I have heard that can chew up a lot of memory with a long, complex scene, but I haven’t actually run into that problem yet.
Thanks for all your thoughts and insights Ammusionist, I really appreciate it.