Live motion capture and object interaction with Kinect + Ni-Mate

Hi there

I’m new to Blender and have been doing intensive tutorials for days in order to test a proof of concept. I’m starting to lose heart, I don’t know if I’m on the right track, or whether what I’m trying to achieve is even possible.

I want to control my character with a motion sensor (Kinect) using the Ni-Mate add-on, and allow it to interact with other objects in game mode - stand-alone player if possible.

I have rigged a character and hooked it up to the Kinect using the Ni-Mate Python script. So far so good - it works well.
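As I understand it, the add-on maps each incoming OSC joint to a pose bone with a matching name - conceptually something like this (a rough sketch of the idea, not the actual Ni-Mate code; the armature and bone names are made up):

```python
import bpy
from mathutils import Vector

def apply_joint(bone_name, position, armature_name="Armature"):
    """Write one tracked joint position onto the matching pose bone."""
    arm = bpy.data.objects[armature_name]
    bone = arm.pose.bones.get(bone_name)
    if bone is not None:
        bone.location = Vector(position)

# e.g. called whenever an OSC message like "/Left_Hand x y z" arrives:
apply_joint("Left_Hand", (0.3, 1.2, 0.9))
```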

Then I wanted to have my character interact with other objects, so I started doing simple game tutorials to find out how the BGE worked. That went well too.

Then I started trying to bring my character into the game environment, and hit various problems. For example, the game player crashes when started with certain characters (I’ve downloaded some from Blend Swap to test with). I’m also struggling to understand why, in game mode, the characters become grey, or why the rendering is so different that it’s almost unrecognizable.

I am prepared to continue struggling through this, but I need some advice as to whether this is advisable or possible. Has anyone been doing work like this, and had success?

I would appreciate some feedback!

1 - BGE uses UV maps / GLSL or Multitexture mode
2 - game rigs are typically baked down to low poly

what are the viewport settings? (GLSL, Multitexture?) your characters may be made for Cycles.
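you can check both from the python console - a quick sketch (2.7x API, from memory):

```python
import bpy

# 'CYCLES' here would explain the grey characters - those materials
# won't show up in the game engine
print(bpy.context.scene.render.engine)

# 'SINGLETEXTURE', 'MULTITEXTURE' or 'GLSL' (only meaningful when the
# engine is set to Blender Game)
print(bpy.context.scene.game_settings.material_mode)

# switch to GLSL; the materials themselves still need to be BGE-compatible
bpy.context.scene.game_settings.material_mode = 'GLSL'
```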

what is the polycount?

One of my friends at uni did pretty much exactly this: use a Kinect to control human characters in BGE. So don’t despair…

Assets for in-game use have quite different requirements from those made for rendering. So make sure the ones you downloaded from Blend Swap are designed for realtime use.

Thanks for your response. The character is made for Cycles, as you suggested. And the face count is over 8000 - I guess that’s a bad thing.

Thanks for the encouragement. I’m really new to this. I think what you’re saying is: don’t be attracted to those stunning-looking characters - they’ll never work in the game environment.
So I don’t waste your time - is there a guide you can point me to for creating or adapting characters for realtime use? I’ve read that baking is an option, but I’m not sure if that’s the right way to go.

never say never.

8000 isn’t that bad really. LOD (level of detail) is there for a reason - even one low-poly distance step would do wonders in a populous area. Baking normal maps is a great way to preserve the details with low poly.
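if you’d rather script the LOD step than set it up in the UI, something like this should do it (a sketch from memory of the 2.7x API - the object names are made up, and you need a decimated copy of the mesh first):

```python
import bpy

hi = bpy.data.objects["Character"]       # the detailed mesh (hypothetical name)
lo = bpy.data.objects["Character_low"]   # a decimated copy made beforehand

# lod_add works on the active object
bpy.context.scene.objects.active = hi
bpy.ops.object.lod_add()

# slot 0 is the original mesh; the new slot holds the low-poly stand-in
hi.lod_levels[1].object = lo
hi.lod_levels[1].distance = 15.0         # swap to the low-poly mesh past 15 blender units
```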

Once you get familiar with BGE materials and nodes, it’s very possible to get some really pretty materials. Due to all the fakes required, sometimes you’ll find it’s easier to achieve a desired look than in Cycles.

Blender Internal uses the same material UI that the BGE does, so BI material tutorials will likely be meaningful.

Thanks for your input. I don’t understand half the words and acronyms you used, but I will persevere!

You are standing at an intersection between two roads. There is a sign post nearby.
>> look at sign post
The sign post has two arrows on it, one to the east and one to the west. The arrow to the east reads:

  • Learn how to make a model look good in BGE, and spend a week or two learning BGE’s rendering system

The arrow to the west reads:

  • Focus on the Kinect integration and download a model that already looks good in BGE

To the east you can see a glittering golden city, but it is a very long way off, and the people walking along that road have some odd glitches. To the west you see a rather plain looking road, but at least the people walking along it are walking sensibly.
>>


In my opinion you should decide if you want to focus on the visuals of the project. Things like this are possible in BGE (not by me).

Or, it seems to me that this project is more on the Kinect side, where the visuals are secondary. But it’s up to you.

If you do decide to focus on the visuals, you’ll probably become familiar with the material node editor…

That’s a good way of looking at it. At the moment I am in the proof of concept stage - I need to know that this will work in practice before I spend weeks/months/years learning Blender.

So it’s probably best that I just get some interaction going in the game engine with a very simple stick man, and then build up from there.

I needed to get something working quickly, so I came up with this somewhat convoluted solution:
I have a character being controlled by the Kinect via the Ni-Mate add-on, in Cycles Render. I use Syphoner to get the fullscreen viewport to show up in Quartz Composer (via a Syphon Client). There I have some 2D “objects”, which are affected by OSC or MIDI triggers that also come from Ni-Mate. I can roughly match up the objects’ positions so that the character appears to be hitting them.
The good thing about this is that I can use a detailed character with no problem, but the downside is obviously that it is not true interaction, and doing the interaction logic in Quartz Composer is not ideal.
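For reference, the triggers Ni-Mate sends out are plain OSC messages, so anything that speaks OSC can catch them. As an illustration, a few lines of Python using the python-osc package would do the same job as the Quartz Composer side (the “/trigger” address and port 7000 are made-up example values):

```python
from pythonosc import dispatcher, osc_server

def on_trigger(address, *args):
    # react to the trigger, e.g. move a 2D object into place
    print("trigger fired:", address, args)

disp = dispatcher.Dispatcher()
disp.map("/trigger", on_trigger)

server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 7000), disp)
server.serve_forever()
```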

Do you, or anyone else, know of someone I can contact to ask a few questions?
I’ve got quite far, but I’m stuck at one major obstacle:
I can receive the Kinect data and move my rigged character around in object mode, but as soon as I hit Play it freezes. The whole aim is to have all the functionality of the game environment, with the character being controlled by the Kinect.

I have a feeling it’s to do with the Ni-Mate game logic - my guess is I need to hook the sensor up to the armature? (see screenshot)
Anyone who has done this - I’d appreciate some help!


in the viewport the agent is probably driven by drivers or bpy?
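(drivers and bpy handlers stop being evaluated once the game starts - in-game the pose has to be set through bge instead. roughly like this, as a sketch with made-up names, hooked to an Always sensor in module mode:)

```python
import bge

def update(cont):
    own = cont.owner                       # the armature the controller sits on
    # wherever your OSC receiver stores the joint data:
    joints = bge.logic.globalDict.get("joints", {})
    for name, pos in joints.items():
        channel = own.channels.get(name)   # one channel per bone
        if channel is not None:
            channel.location = pos
    own.update()                           # re-evaluate the pose this frame
```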

post the code?

I’m not sure which code you mean - should I post the .blend file?

I think I might have almost solved it - it could have been that the sensor and Python script were applied to the result armature instead of the capture armature.
I’m still testing - it seems very buggy. Sometimes the embedded player works, but the standalone player never does.
I don’t think this bodes well for creating a standalone game. :frowning:

In case it helps, I found this: https://github.com/Kinect/PyKinect2
If the plugin is not designed in a way that fits your needs, you might consider looking at the PyKinect2 repository to see what can be done :slight_smile:

This is pretty much uncharted territory for amateur game devs, especially in the BGE, but everything has a start :stuck_out_tongue:

Thanks for the link. I’m definitely way under-qualified to be trying this! I’m not a programmer and that Python stuff is beyond me.
I realise it’s a bit crazy for a beginner to be getting into this level…but that’s pretty much how I start everything I do. Blindly going uphill in the dark, sometimes emerging in the light (with lots of falls and headaches on the way).

Just a quick update to say that I’ve found a little glitch - maybe it’s already been pointed out in another context? If I start the Ni-Mate plugin and then hit P, nothing happens. I have to hit Escape and then P again, and then it works. It’s the same in both the embedded and stand-alone player. :smiley:

Now I’m in search of someone who can help me adapt the Python script to start the Ni-Mate receiver in a stand-alone executable. Any advice on where to post such a request?

Argh! I’m going nuts! Is there anyone who can help me?
I’ve tested my setup with the Ni-Mate example bunny blend, and it works (although my character gets crazily deformed when it receives the data). When I say it works, I mean I can receive data from the Kinect to move the character, and when I hit Play, it continues to work.

I then tried to take the same simple armature and attach a mesh to it. Even creating one simple cube and connecting it to the armature won’t play: in object mode I can see the cube following the armature around, but as soon as I hit Play it freezes. I have another cube spinning to check that the game is actually playing and not freezing entirely.

Here is a short screengrab to demonstrate.
Here is a very basic file of my test. ni-mate-game test 1.blend (1.99 MB)

I must be missing something very basic. I have tried to compare the bunny mesh to see what its properties are, but I cannot for the life of me figure it out.

I’d really appreciate some advice!

I think it’s because the armature is not moving in the GE…
Do you have any errors in the Console/Terminal?

(I don’t have a Kinect so I’m blind here :/)

Thanks for your suggestion…
No errors in the console:
```
Delicode NI mate Tools started listening to OSC on port 7000
Blender Game Engine Started
Blender Game Engine Finished
```
The GE finishes when I hit Escape - it’s not crashing :wink:
Is there any way of making the armature visible in the GE to test?

By the way, there’s a free version of Ni-Mate and it can be run without a camera/sensor. They provide an example clip with sensor data for testing skeleton tracking.

I think I’ve found the solution/s - maybe this will help someone else.
It seems that the vertex groups of the mesh did not match the names of the bones in the armature - when I renamed them, it started moving. So when I added a simple cube, I had to create a vertex group with a name matching a bone in the armature for it to move in game mode. At least I think so!
The second thing I noticed, when using an existing mesh, is that the armature modifier has to come before any others. The imported one had a Subsurf modifier first, with the armature modifier placed after it. When I swapped their order, it started moving.
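If anyone wants to check their own file for the same two problems, a little script like this should flag both (run it in the text editor with the mesh as the active object; “Armature” is just what my armature object happens to be called):

```python
import bpy

mesh_obj = bpy.context.active_object
arm_obj = bpy.data.objects["Armature"]

# 1) every deforming vertex group needs a bone with the same name
bone_names = set(arm_obj.data.bones.keys())
for vg in mesh_obj.vertex_groups:
    if vg.name not in bone_names:
        print("vertex group with no matching bone:", vg.name)

# 2) the armature modifier should be first in the stack
mods = [m.type for m in mesh_obj.modifiers]
if "ARMATURE" in mods and mods.index("ARMATURE") != 0:
    print("armature modifier is not first:", mods)
```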

The confusing thing is that everything appeared to work in the 3D view - it just stopped moving in game mode.