Using OSC in Blender 2.5 Game Engine (with Kinect-captured data)

Hi all. I’m currently using OSCeleton to broadcast OSC messages with Kinect-captured skeletal data.

I’m having trouble getting Blender to listen on the port and pass the joint vectors to the worldPosition of their corresponding objects in the scene. I’m using the Python OSC implementation pyLiblo, but it only lets you push received messages to a dedicated handler function rather than letting you receive and parse messages on the fly. The problem is that the handler isn’t being called when pyLiblo receives a message (and I suspect it’s because the script is run afresh on every trigger).
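
For reference, the basic pyLiblo pattern outside Blender looks something like this (a minimal sketch using the same 7110 port OSCeleton sends to) - the registered handlers only ever fire from inside recv(), so something has to keep polling the server:

import liblo

def joint_handler(path, args, types, src):
    # OSCeleton /joint messages arrive as [joint_name, user_id, x, y, z]
    print(path, args)

server = liblo.Server(7110)
server.add_method('/joint', None, joint_handler)   # None = accept any type signature

while True:
    server.recv(33)   # wait up to 33 ms, dispatching any received messages to the handlers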

Here’s the script I’m trying to implement. Each object in my scene is named after a joint name received from OSCeleton, and this script is the controller for just one of those joints (with a repeating Always sensor attached). The object also has a game property called ‘connected’ that is set to True after the initial run:

import liblo
import bge

cont = bge.logic.getCurrentController()
obj = cont.owner
objects = bge.logic.getCurrentScene().objects

def init(port = 7110):
    liblo.Server(port).free()
    server = liblo.Server(port)
    server.add_method('/joint', None, joint_handler)
    print("Connected to server")

def run():
    while True:
        server.recv(33)

def joint_handler(path, args, types, src):
    print("Joint moving")
    jointId = args[0]
    userId = str(args[1])
    jointLoc = args[2:]
    objects[jointId].worldPosition = jointLoc
   
if obj['connected'] is False:
    obj['connected'] = True
    init(7110)
else:
    try:
        run()
    except:
        pass

If you can make sense of any of that (!), could someone please let me know what I can do to get pyLiblo to call the joint_handler function correctly? Also, let me know if any of that wasn’t clear enough.

Thanks. Al.

Here’s an example, in case the above was a little confusing. In a new Blender file, name the default cube ‘torso’. Run OSCeleton in a terminal window. Now we need a way of linking the /joint messages received over OSC - in particular the torso data (e.g. [‘torso’, 1, locX, locY, locZ]) - to the ‘torso’ object.

Ideally, this information would be received through a single script, since that script needs to parse every /joint message received in order to pass the location values to the correct joint object - having each object’s own script parse the same data would be a waste of CPU.

In an ideal implementation of the script, each of the joints could be assigned an object using a GUI Panel. There could also be options for the scaling of captured skeletal data based on the target skeletal dimensions (so the skeletal data doesn’t warp the model). For the moment, however, let’s just focus on moving a single joint object :slight_smile:

Libraries I’m using:
OpenNI + NITE (latest unstable or stable, either works)
OSCeleton (latest)
liblo 0.26
pyliblo 0.9.0 (python 3.1 compatible)

Have a look at Kai’s Kinect project and code: http://blenderartists.org/forum/showthread.php?t=202543

Thanks for the link. I’ve checked out the thread, and it seems to be following a different approach with the Kinect data, using the raw depth data for 3D object creation. Their project could definitely use the libfreenect drivers (which include superb Python wrappers out of the box). My own implementation, however, needs skeletal tracking, something the libfreenect project is sadly lacking (for the moment, at least, until someone develops an impressive OpenCV-based capture).

It also needs to use the closed-source NITE skeletal tracking middleware, which in turn needs OpenNI in order to operate. There aren’t any Python wrappers for this software, so I’ve had to go down the OSC route in order to broadcast the captured data and then pick it up in Python. I’m pretty set on this combination, as I’m using Linux, and would like to think that the final setup could be OS-independent.

This still brings me back to the problem of running an OSC server in Python. I’ve read something about creating global variables in GameLogic (or rather, bge.logic). Could I store an instance of a class as a GameLogic variable?
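
Something like this is what I have in mind, if it’s even possible (completely untested):

import bge
import liblo

# bge.logic is a module, so perhaps it can carry an instance between controller runs
if not hasattr(bge.logic, "oscApp"):
    bge.logic.oscApp = liblo.Server(7110)   # created once, reused on later triggers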

I think the trick is to have one script that gets the OSC data (the server) and routes the positions into properties, for example, and then one script per object that takes that data and interprets it as movement…
so you only invoke the OSC server script once per frame.

I’m working on the same problem. I have a working setup (from an older project, long before the Kinect) under Blender 2.49 with OSC going back and forth between Blender and Pure Data, but I still have to implement it in 2.5x …

Thanks Olm-Z. That sounds like what I’m aiming for at the moment. It would be good to see your code from 2.49.

Here’s what I’ve created so far:

import bpy, bge
import liblo

class OBJECT_PT_SkelMap(bpy.types.Panel):
    bl_label = "Skeleton mapping"
    bl_space_type = "PROPERTIES"
    bl_region_type = "WINDOW"
    bl_context = "scene"

    def draw(self, context):
        main_list = bpy.types.Scene.skelmap_main_list
        main_vals = bpy.types.Scene.skelmap_main_vals
        limb_list = bpy.types.Scene.skelmap_limb_list
        limb_vals = bpy.types.Scene.skelmap_limb_vals
        layout = self.layout
        scn = context.scene
        for m in main_list:
            row = layout.row()
            col = row.column()
            col.prop_search( scn, "skelmap_"+m, context.scene, "objects" )
        row = layout.row()
        for k in ['Left', 'Right']:
            col = row.column()
            col.label(k)
        for l in limb_list:
            row = layout.row()
            for k in [ 'l', 'r' ]:
                col = row.column()
                col.prop_search( scn, "skelmap_"+k+"_"+l, context.scene, "objects" )

class DumpOSC:
    def callback(self, path, args, types, src):
        # dispatch to the handler registered for this OSC path, if any
        handler = self.options.get(path)
        if handler is not None:
            handler(path, args, types, src)

    def new_user_handler(self, path, args, types, src):
        print("---")
        print("FOUND USER " + str(args[0]))
        print("Now perform the Psi pose.")

    def new_skeleton_handler(self, path, args, types, src):
        print("---")
        print("SUCCESS USER " + str(args[0]))
        print("You are now being tracked!")

    def lost_user_handler(self, path, args, types, src):
        print("---")
        print("LOST USER " + str(args[0]))
        print("where are you?")

    def joint_handler(self, path, args, types, src):
        sides = ["l", "r"]
        jointId = args[0]
        userId = str(args[1])
        jointLoc = args[2:]
        try:
            i = bpy.types.Scene.skelmap_main_list.index(jointId)
            iList = "main"
        except ValueError:
            try:
                i = bpy.types.Scene.skelmap_limb_list.index(jointId[2:])
                iList = "limb"
            except ValueError:
                pass
        vals = getattr(bpy.types.Scene, "skelmap_"+iList+"_vals")
        if iList == "limb":
            vals[sides.index(jointId[0:1])][i] = jointLoc
        else:
            vals[i] = jointLoc
        try:
            setattr(bpy.types.Scene, "skelmap_"+iList+"_vals", vals)
        except:
            pass
        try:
            objName = getattr(bpy.context.scene, "skelmap_"+jointId)
            obj = bge.logic.getCurrentScene().objects[objName]
            obj.worldPosition = jointLoc
        except:
            pass

    def create_server(self, port = None):
        liblo.Server(port).free()
        self.server = liblo.Server(port)
        # register callback function for all messages
        self.server.add_method(None, None, self.callback)
        print("listening on URL: " + self.server.get_url())

    def __init__(self, port = None):
        self.options = { '/new_user' : self.new_user_handler,
                         '/lost_user' : self.lost_user_handler,
                         '/new_skeleton' : self.new_skeleton_handler,
                         '/joint' : self.joint_handler }
        # create server object
        try:
            self.create_server(port)
        except:
            pass

    def run(self):
        self.server.recv(33)

if __name__ == '__main__':
    try:
        bpy.types.Scene.skelmap_app
    except:
        # Create OSC listener
        bpy.types.Scene.skelmap_app = DumpOSC(7110)

    try:
        bpy.types.Scene.skelmap_app.run()
    except:
        del bpy.types.Scene.skelmap_app
        pass

    # Create joint lists and value dicts
    # Main joints first
    main_list = [ "head", "neck", "torso" ]
    main_vals = {}

    # Followed by joint pairs
    limb_list = [ "shoulder", "elbow", "hand", "hip", "knee", "ankle", "foot" ]
    limb_vals = [ {}, {} ]

    # Save lists to scene
    bpy.types.Scene.skelmap_main_list = main_list
    bpy.types.Scene.skelmap_main_vals = main_vals
    bpy.types.Scene.skelmap_limb_list = limb_list
    bpy.types.Scene.skelmap_limb_vals = limb_vals

    # Create UI fields for each joint: eg. skelmap_l_shoulder
    for m in main_list:
        setattr(bpy.types.Scene, "skelmap_"+m, bpy.props.StringProperty( name = m.title(), description = "Send location value to object" ))
    for l in limb_list:
        for k in [ 'l', 'r' ]:
            setattr(bpy.types.Scene, "skelmap_"+k+"_"+l, bpy.props.StringProperty( name = l.title(), description = "Send location value to object" ))

    # Create UI fields for other options

Here’s what happens at the moment:

  • The script creates a panel with fields for the user to link an object with an OSC joint type.
  • When DumpOSC.run() is called, it receives an OSC message and runs the callback function related to the message type (joint_handler if a ‘/joint’ message is received).
  • The joint_handler puts the joint vector into a global array, and
  • updates the linked object’s world position.

At the moment, the global value arrays are being updated with each run(), but I’m getting no changes of position for any of the objects in the game engine.

Maybe this script could be added to an Empty in the scene as a single-run controller script, and another controller script could then repeatedly call the run() function (which I think is probably what you suggested)?
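
In other words, the Empty would get the big script above on a run-once controller, plus a second Python controller wired to a pulsing Always sensor containing little more than this (hypothetical, but skelmap_app is where the script above stores the listener):

import bpy

# per-frame controller: pump the OSC server that the setup script stored on the Scene type
if hasattr(bpy.types.Scene, "skelmap_app"):
    bpy.types.Scene.skelmap_app.run()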

This is only my second Python project, so I’m still finding my feet with this fantastic language (and the Blender 2.5 Python API) - changes, streamlining, suggestions etc. are greatly appreciated. :slight_smile:

Here’s a simple (old, from my files) working example with one object:
these two scripts are attached to a single object, but basically the first script runs the server and sets object properties according to an OSC message of two floats, and the second one is what makes it move…

of course the first one is better put on an Empty, properly assigning properties to objects, and then I think it’s good to go…

so, old Python/API style:



import liblo, GameLogic

def oscserv_callback(path, args, types, src, data):
    a, b = args
    data[0] = a
    data[1] = b
    
if not hasattr(GameLogic,"oscServer"):
    # init, create a OSC server at port 5700
    GameLogic.oscServer = liblo.Server(5700)
    GameLogic.oscData = [0,0]
    GameLogic.oscServer.add_method("/your/path", 'ff', oscserv_callback, GameLogic.oscData)
    print "server set"
    
# loop and dispatch all pending messages every tic
while GameLogic.oscServer.recv(0):
    pass

# copy the latest OSC values into game properties
ob = GameLogic.getCurrentController().getOwner()
(ob['propx'], ob['propy']) = GameLogic.oscData



import GameLogic

control = GameLogic.getCurrentController()
owner = control.getOwner()
act = control.actuators["propmover01"]
speed = (owner['propx'], owner['propy'])
act.dLoc=(0,0,speed[0])
act.dRot=(0,speed[1],0)
control.activate(act)
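
And a rough guess at what the first script becomes in 2.5x (untested - mainly swapping GameLogic for bge.logic and getOwner() for .owner):

import liblo
import bge

def oscserv_callback(path, args, types, src, data):
    # copy the two received floats into the shared list
    data[0], data[1] = args

# create the server only once; attributes set on bge.logic survive between runs
if not hasattr(bge.logic, "oscServer"):
    bge.logic.oscData = [0.0, 0.0]
    bge.logic.oscServer = liblo.Server(5700)
    bge.logic.oscServer.add_method("/your/path", 'ff', oscserv_callback, bge.logic.oscData)
    print("server set")

# loop and dispatch all pending messages every tic
while bge.logic.oscServer.recv(0):
    pass

# copy the latest OSC values into game properties
own = bge.logic.getCurrentController().owner
own['propx'], own['propy'] = bge.logic.oscData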


I personally find the OSCeleton way of packing the messages quite awkward, as they could have used the path to distinguish between joints instead of obliging you to parse the messages… but hey, nothing is perfect :wink:

Edit: on reflection, using one single path is practical for this server function, as it can add as many joints/users as it wants on the fly without having to set up paths in advance… we’ll see in practice.

Agreed. Although the devs updated it a day or two ago with the message “Implemented message format compatible with Quartz Composer”, which I think means they’ve moved the messages to a /joint[jointname] format, amongst other changes. I’ll try it out this afternoon.

And the blender game engine test now works! Yippee! Here’s a working blend file: http://dl.dropbox.com/u/212383/kinect-skeleton-test.blend, and here’s the script:

import bpy, bge
import liblo

class OBJECT_PT_SkelMap(bpy.types.Panel):
    bl_label = "Skeleton mapping"
    bl_space_type = "PROPERTIES"
    bl_region_type = "WINDOW"
    bl_context = "scene"

    def draw(self, context):
        main_list = bpy.types.Scene.skelmap_main_list
        limb_list = bpy.types.Scene.skelmap_limb_list
        layout = self.layout
        scn = context.scene
        for m in main_list:
            row = layout.row()
            col = row.column()
            col.prop_search( scn, "skelmap_"+m, context.scene, "objects" )
        row = layout.row()
        for k in ['Left', 'Right']:
            col = row.column()
            col.label(k)
        for l in limb_list:
            row = layout.row()
            for k in [ 'l', 'r' ]:
                col = row.column()
                col.prop_search( scn, "skelmap_"+k+"_"+l, context.scene, "objects" )

class DumpOSC:
    def callback(self, path, args, types, src):
        # dispatch to the handler registered for this OSC path, if any
        handler = self.options.get(path)
        if handler is not None:
            handler(path, args, types, src)

    def new_user_handler(self, path, args, types, src):
        print("---")
        print("FOUND USER " + str(args[0]))
        print("Now perform the Psi pose.")

    def new_skeleton_handler(self, path, args, types, src):
        print("---")
        print("SUCCESS USER " + str(args[0]))
        print("You are now being tracked!")

    def lost_user_handler(self, path, args, types, src):
        print("---")
        print("LOST USER " + str(args[0]))
        print("where are you?")

    def joint_handler(self, path, args, types, src):
        sides = ["l", "r"]
        jointId = args[0]
        userId = str(args[1])
        jointLoc = args[2:]
        try:
            objName = getattr(bpy.context.scene, "skelmap_"+jointId)
            obj = bge.logic.getCurrentScene().objects[objName]
            obj.worldPosition = jointLoc
        except:
            pass

    def create_server(self, port = None):
        liblo.Server(port).free()
        self.server = liblo.Server(port)
        # register callback function for all messages
        self.server.add_method(None, None, self.callback)
        print("listening on URL: " + self.server.get_url())

    def __init__(self, port = None):
        self.options = { '/new_user' : self.new_user_handler,
                         '/lost_user' : self.lost_user_handler,
                         '/new_skeleton' : self.new_skeleton_handler,
                         '/joint' : self.joint_handler }
        # create server object
        try:
            self.create_server(port)
        except:
            pass

    def run(self):
        self.server.recv(33)

if __name__ == '__main__':
    try:
        bpy.types.Scene.skelmap_app
    except:
        # Create OSC listener
        bpy.types.Scene.skelmap_app = DumpOSC(7110)

    # Create joint lists and value dicts
    # Main joints first
    main_list = [ "head", "neck", "torso" ]

    # Followed by joint pairs
    limb_list = [ "shoulder", "elbow", "hand", "hip", "knee", "ankle", "foot" ]

    # Save lists to scene
    bpy.types.Scene.skelmap_main_list = main_list
    bpy.types.Scene.skelmap_limb_list = limb_list

    # Create UI fields for each joint: eg. skelmap_l_shoulder
    for m in main_list:
        setattr(bpy.types.Scene, "skelmap_"+m, bpy.props.StringProperty( name = m.title(), description = "Send location value to object" ))
    for l in limb_list:
        for k in [ 'l', 'r' ]:
            setattr(bpy.types.Scene, "skelmap_"+k+"_"+l, bpy.props.StringProperty( name = l.title(), description = "Send location value to object" ))

    # Create UI fields for other options

OK, here’s the current problem! I’ve created an armature with bones for each captured joint location, and I’ve also added cubes for each captured joint location. The cubes are bone-constraint targets (Damped Track targets for each bone), so that when a new skeleton is captured the player’s joint locations don’t distort the character they’re playing as. The cubes are also easier to move around through Blender game logic.

My problem is that in game mode the armature doesn’t seem to affect its associated mesh. I can see that the cube locations are being updated by the Kinect (hence why I’ve used cubes and not Empties as constraint targets - I need some kind of feedback), but I can’t tell whether the cubes aren’t pulling the bones or whether the bones aren’t deforming the mesh. Either way, my character remains motionless.

Here’s a link to the .blend file: http://dl.dropbox.com/u/212383/kinect-skeleton-test-0.4.blend. There’s no character to go with the armature in this file, but I’m sure anyone who’d like to test the script will be able to add one in minutes. Also, it’s worth noting that I’ve positioned the armature to face away from the player.

I’ve also added a new dependency on Numpy in Python. As Ubuntu only ships with a Python 2 version of Numpy, I had to build my own, but it’s really very easy and I would recommend it to anyone. Numpy lets me make very simple and quick calculations on the received joint data before sending the vectors to their respective object locations. Anyway, feedback please!

Cheers. Al.

Update: I found out why the armature isn’t affected during game mode. You need to add an Always sensor -> And controller -> Armature actuator (set to Run Armature) in the game logic, otherwise the armature doesn’t update and reposition itself automatically.
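
I believe the Python equivalent, if you’d rather not use the actuator, is to call update() on the armature object from a controller each frame - roughly like this, though I’ve stuck with the actuator myself:

import bge

cont = bge.logic.getCurrentController()
arm = cont.owner   # assumes this controller sits on the armature object
arm.update()       # asks the armature to re-evaluate its pose for the next frame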

Mm … having stupid trouble here importing liblo and numpy … it works outside Blender, but I get an ImportError inside the Python console … hum.

Edit:
woohoo … it’s working! I had to rebuild Blender from source (cmake-gui is cool!), then copy liblo.so and the numpy directory into Blender’s Python lib directory, and then it picks them up…

now looking at the skeleton problem (but I’m not an expert on that part :wink: )

I see you’ve put in more cubes than there are exported joints, so some are left in place during tracking…

Yeah. I’ve put in a few extra for the joints that are currently disabled in OSCeleton. I’m now working on an automated script that creates the cubes for each joint-assigned bone before the game runs, although I’m having a great deal of trouble scripting the selection of the bone before adding the constraint. Not sure if this is the best way to do it?!
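
The route I’m exploring is to skip the selection entirely and add the constraint through the data API on the pose bone - a sketch with made-up object and bone names:

import bpy

arm_obj = bpy.data.objects["Armature"]       # hypothetical armature object
target = bpy.data.objects["torso_target"]    # hypothetical cube to track

pbone = arm_obj.pose.bones["torso"]          # no need to select the bone first
con = pbone.constraints.new('DAMPED_TRACK')
con.target = target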

Great going… hoping to implement this soon for some motion capture. What is the execution order of things to get this sample working? I get an error about there being no liblo - do I need to get that, and where does it go? I run Windows Vista, and I have all the NITE and OpenNI stuff going. I use the MikuMikuDance environment to do capturing, but I would love to move to Blender for more features. Thanks for moving us in that direction.

Hi vante. I’ve looked for instructions on how to build liblo for Windows, but can’t seem to find anything. However, the Open Sound Control page confidently states that it can be built for Linux, Windows and Mac, so it’s probably not impossible! I think maybe we need a Windows user to post some instructions.

And briefly, here’s a rundown of the tools you’ll need to have installed in order for this blend file to run correctly:

LibLO - Lightweight Open Sound Control implementation
http://liblo.sourceforge.net/

PyLibLO - Python bindings for LibLO (you need v.0.9 in order for it to work with Python 3.1)
http://das.nasophon.de/pyliblo/

OSCeleton - Convert kinect skeletal capture data to OSC messages

Numpy - Fast maths library for Python (again, you’ll need the latest version for it to work with Python 3)
http://numpy.scipy.org/

Numpy isn’t exactly required, as you could probably replace the np.array code in the blend file with a simple for-loop iterating over each value, but I love Numpy’s simplicity :slight_smile:
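
To illustrate the kind of swap I mean (the numbers are made up, but it’s the same element-wise arithmetic either way):

import numpy as np

joint = [0.42, 1.31, 2.05]   # a captured x, y, z (made-up values)
scale = [0.5, 0.5, -0.5]

scaled = np.array(joint) * np.array(scale)        # with Numpy
scaled = [j * s for j, s in zip(joint, scale)]    # plain Python equivalent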

Finally, most of the problems you may experience will probably be related to Python 3. Although Blender 2.5 uses Python 3, it isn’t yet universally supported by Python wrapper/module developers. This means I’ve been limited to a select few Python tools, which in turn has limited my choice of how to receive Kinect data. OSC is a nice option as it’s platform independent, and can also be received from a remote location (so receiving Kinect data from another computer/location is possible).

There is another option, however, released yesterday by Gamix on the OpenNI Google group. It’s a complete Python wrapper for the OpenNI API (which includes calls to the NITE skeletal capture, amongst other things). It may be worth looking at as a better, faster, lower-dependency option in the future: https://code.google.com/p/onipy/

Thanks for the reply… yeah, I use OSCeleton to send over a network to a machine running Animata, and I run MikuMikuDance on the machine with the Kinect, and both programs use the info. Miku uses an OpenNI DLL. I saw that Gamix post on OpenNI - hoping that will be the magic bullet. When I get liblo installed and compiled I’ll let you know how it works. Thanks again…

Well, I finally got the capture data to alter the armature in game mode. Here’s a rundown of what’s working so far.

  • The idea is that you can assign the following script to an armature as a logic block, and set it to run once as a game engine startup script.
  • Upon starting the game engine, the script immediately ends game mode and sets up a panel where you can track any bone in the armature to a captured skeletal joint.
  • Once you’ve assigned a bone or bones, entering game mode will then start the constraint creation process.
  • The script automatically creates cubes for each bone to track, and places them at the tail of each bone (see the sketch after this list).
  • It also creates logic blocks and scripts so the game can keep updating the cubes’ locations with the captured data.
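
The cube-creation step boils down to something like this (a simplified sketch with hard-coded names - the real script reads the mapping from the panel instead):

import bpy

arm_obj = bpy.data.objects["Armature"]                     # hypothetical armature name
for bone_name in ["torso", "head"]:                        # hypothetical mapped bones
    bone = arm_obj.data.bones[bone_name]
    tail_world = arm_obj.matrix_world * bone.tail_local    # 2.5 mathutils: matrix * vector
    bpy.ops.mesh.primitive_cube_add(location=tail_world)
    bpy.context.object.name = bone_name + "_target"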

As far as script development goes, I’d like the automated features to alter as little of the original blend file as possible, so the idea is that all of the cubes and constraints that are added automatically are also removed automatically when exiting game mode. This would definitely ease development work. I’ll also be adding visibility settings for the cubes to the panel.

The main work, however, will definitely be how the captured data is scaled to work with the armature. I’m sure I’m doing something wrong (and probably highly illogical to all you blender artists and armature enthusiasts out there), so please tell me if there’s a better way of linking the captured skeleton with the blender armature.

Here’s a video of what I’ve managed to capture so far. I’ll warn you, it’s not pretty (but it does show how the armature tracking works with the capture data):

Here’s my current .blend file (although I strongly suggest you create an armature yourself and assign the script to it in a logic block - it would help test the script and see whether I’ve missed something out!): http://dl.dropbox.com/u/212383/kinect-skeleton-test-0.10.blend

And finally, here’s the script in its current form: http://dl.dropbox.com/u/212383/skelmap-0.2.py

Update: The model, by the way, is taken from the following article: http://www.blendermagz.com/2010/02/12/batman-highpoly-lowpoly-character/

Awesome work.

I’d love to try, but I got tired just looking at all the things I had to install and configure before anything could begin to work :wink:

Good luck!

I agree, it is dependency hell :slight_smile: I’m currently working on a version that doesn’t foolishly rely on Numpy (sorry about that), so at least that will be one less thing to install! I’m also attempting a joint remapping process that will hopefully mean the armature retains its shape while still mimicking the player’s movement (that’s what’s taking the time).

By the way, I’m having to relaunch my file each time I want to run and check the script (because the generated game logic blocks aren’t deleted on game end - a bug? not sure). I’m sure there’s an easier way, though. If anyone has tips for easier debugging of Blender’s Python, I’d be very appreciative!

This is just a quick update. I’ve updated the skelmap script with the following changes:

No need to select an armature to link with the skeletal capture - simply add the script to your blend file, then select your armature and create a startup sensor/controller setup in game logic.

http://dl.dropbox.com/u/212383/skelmap-setup.png

Upon first run, the game will run the setup script and create and assign the update and kill scripts automatically, then return you to object mode. It should have created a setup that looks like this:

http://dl.dropbox.com/u/212383/skelmap-complete-setup.png

I’m having a few problems assigning scripts to the Python controllers, so you may need to assign them manually for the moment - the scripts will have been created, though.
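
For reference, the assignment I’m attempting boils down to roughly this (skelmap_update.py is a hypothetical text block name):

import bpy

obj = bpy.context.object                                  # the armature being set up
bpy.ops.logic.controller_add(type='PYTHON', object=obj.name)
cont = obj.game.controllers[-1]                           # the controller just added
cont.text = bpy.data.texts["skelmap_update.py"]           # point it at the text block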

You’ll also find the Skeletal Mapping settings have appeared under the Object tab, where you can assign captured joints to bones in your armature. Once you’ve assigned joints to their appropriate bones, click the Reset tracking button to have each assigned bone automatically track to a generated cube. You can also choose to hide or show the cubes and alter their size in this panel.

http://dl.dropbox.com/u/212383/skelmap-options.png

Here’s the latest script: http://dl.dropbox.com/u/212383/skelmap-0.6.py

The script is definitely not yet complete, as the mapping of skeletal data to the armature isn’t completely accurate. I’m trying to create an algorithm that adjusts the joint lengths for any armature - so a tall thin player’s movements are adjusted proportionally to a short squat 3D character, for example. It isn’t working perfectly, but I’m getting there, so any help testing and adding to the code would be appreciated.
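
The core idea per joint pair is roughly this (a sketch, not the code from the script): keep the direction of the captured segment, but force the armature’s own bone length so the player’s proportions never stretch the character:

from mathutils import Vector

def retarget_joint(parent_pos, captured_parent, captured_child, bone_length):
    # direction of the captured segment, rescaled to the bone's rest length
    seg = Vector(captured_child) - Vector(captured_parent)
    direction = seg / seg.length
    return Vector(parent_pos) + direction * bone_length

# made-up numbers: a captured torso->neck segment remapped onto a 0.4-unit bone
neck_pos = retarget_joint((0.0, 0.0, 1.0), (0.1, 0.0, 1.2), (0.1, 0.0, 1.9), 0.4)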

Have fun :slight_smile:

Hey, I see you changed the pose by making the bones track those cubes. I was wondering whether you can also control the angles between bones directly (I didn’t manage to). As far as I know, NITE also calculates joint angles, which would make scaling of the bones a lot easier.

Good point. When I first started coding the script I didn’t realise that I could alter bone head/tail/length in Blender game mode. For some reason my initial tests didn’t have any effect on the armature (I think it was the lack of an Armature actuator updating the armature).

However, I’m now working on a new version that positions and scales each bone along the captured joint vector, both solving the armature scaling problem (where the player’s joint lengths deform the armature) and removing the need to create hidden bone-constraint objects. I’ll post a new version once it’s working.