If I run that from the game engine nothing happens in the game engine, but the model does change.
My question is how to achieve that same movement in the game engine?
I see some references to setting up a keyframe and a channel, but that approach seems to require a particular state to be defined ahead of time. This pose can take on any real value in the range (-2.0, +2.0). Ultimately I'll be feeding that value into Blender from an external source via Python code, as a demo feed driven by external "stimulus".
It's easy enough for a simple bone, meaning one that isn't parented to (or extruded from) another bone and has no IK. If that's your case:
from bge import logic

cont = logic.getCurrentController()
own = cont.owner
scene = own.scene
armature = scene.objects["YourArmatureHere"]
bone = armature.channels["YourBoneHere"]
up = cont.sensors["up"]

loc = bone.location
if up.positive:
    loc.y += 0.05  # this moves the bone up along its local Y axis
    bone.location = loc
    armature.update()  # apply the channel change on the next frame
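Since you want to drive the pose from an arbitrary external value in (-2.0, +2.0), it may help to map that value onto a bone offset before applying it. Here is a minimal sketch of such a mapping; the helper name and the 0.5-unit maximum offset are my own assumptions, not anything from the BGE API:

```python
# Hypothetical helper: map an external stimulus value in (-2.0, +2.0)
# to an offset along the bone's local Y axis. max_offset is illustrative.
def stimulus_to_offset(value, max_offset=0.5):
    clamped = max(-2.0, min(2.0, value))  # keep the input inside its stated range
    return (clamped / 2.0) * max_offset   # scale linearly into bone space

# Inside the BGE script above, you would then do something like:
#   loc = bone.location
#   loc.y = stimulus_to_offset(external_value)
#   bone.location = loc
#   armature.update()
```

Setting the location absolutely (rather than incrementing it every frame) keeps the bone in sync with the stimulus value even if frames are dropped.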
If you do have IK or parenting, look into joint_rotation (there are video tutorials covering it as well). Look at this script too:
main_arm.channels['NAME OF THE BONE YOU WANT TO ROTATE'].joint_rotation = [x, y, z]
main_arm.update()
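joint_rotation takes Euler angles in radians, so the external stimulus value needs converting before it is assigned. A small sketch of that conversion, under the assumption that the full (-2.0, +2.0) range should span plus or minus 45 degrees (both the helper name and the range are illustrative):

```python
import math

# Hypothetical helper: convert the external stimulus value in (-2.0, +2.0)
# to a joint rotation in radians. max_degrees is an assumed demo range.
def stimulus_to_rotation(value, max_degrees=45.0):
    clamped = max(-2.0, min(2.0, value))
    return math.radians((clamped / 2.0) * max_degrees)

# Applied in the BGE, assuming main_arm is the armature and 'EyeBone'
# stands in for your real bone name:
#   channel = main_arm.channels['EyeBone']
#   channel.joint_rotation = [stimulus_to_rotation(v), 0.0, 0.0]
#   main_arm.update()
```

Which of the three axes you drive depends on the bone's roll, so you may need to experiment with which component of the list to set.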
As a last resort, you could just make the eyes separate objects and use a Track To actuator.
Or you could rotate an empty object and use a Copy Rotation constraint, but armature constraints need an animation running on them to work (the Run Armature actuator doesn't always work).
Or you could go back to using separate objects, but run the logic on an empty and do:

# assuming empty, left_eye and right_eye were fetched from scene.objects
ori = empty.localOrientation
left_eye.localOrientation = ori
right_eye.localOrientation = ori