  1. #1
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40

    Live motion capture and object interaction with Kinect + Ni-Mate

    Hi there

    I'm new to Blender and have been doing intensive tutorials for days to test a proof of concept. I'm starting to lose heart: I don't know if I'm on the right track, or whether what I'm trying to achieve is even possible.

    I want to control my character with a motion sensor (Kinect) using the Ni-Mate add-on, and have it interact with other objects in game mode - as a stand-alone player if possible.

    I have rigged a character and hooked it up to the Kinect using the Ni-Mate python script. So far so good, it works well.

    Then I wanted to have my character interact with other objects, so I started doing simple game tutorials to find out how the BGE worked. That went well too.

    Then I started trying to bring my character into the game environment, and hit various problems. For example, the game player crashes when started with certain characters (I've downloaded some from Blend Swap to test with). I'm also struggling to understand why the characters turn grey in game mode, or why the rendering is so different that it's almost unrecognizable.

    I am prepared to continue struggling through this, but I need some advice as to whether this is advisable or possible. Has anyone been doing work like this, and had success?

    I would appreciate some feedback!



  2. #2
    Member BluePrintRandom's Avatar
    Join Date
    Jul 2008
    Location
    NoCal Usa
    Posts
    18,232
    1 - bge uses UV maps / GLSL or multitexture mode
    2 - game rigs are typically baked down to low poly

    what are the viewport settings? (glsl, multitexture?) your characters may be made for cycles.

    what is the polycount?
    Break it and remake it - Wrectified
    If you cut off a head, the hydra grows back two.
    "headless upbge"



  3. #3
    Member sdfgeoff's Avatar
    Join Date
    May 2010
    Location
    Kalpana One
    Posts
    5,073
    One of my friends at uni did pretty much exactly this: use a kinect to control human characters in BGE. So don't despair...

    Items for in-game use have quite different requirements from those for rendering. So make sure the ones you downloaded from blendswap are designed for realtime use.
    "Someone applied a roof texture to that wall" - martinsh

    Website: www.sdfgeoff.space



  4. #4
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40
    Originally Posted by BluePrintRandom View Post
    1 - bge uses UV maps / GLSL or multitexture mode
    2 - game rigs are typically baked down to low poly

    what are the viewport settings? (glsl, multitexture?) your characters may be made for cycles.

    what is the polycount?
    Thanks for your response. The character is made for Cycles, as you suggested. And the face count is over 8000 - I guess that's a bad thing.



  5. #5
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40
    Originally Posted by sdfgeoff View Post
    One of my friends at uni did pretty much exactly this: use a kinect to control human characters in BGE. So don't despair...

    Items for in-game have quite different requirements to those for rendering. So make sure the ones you downloaded from blendswap are designed for realtime use.
    Thanks for the encouragement. I'm really new to this. I think what you're saying is don't be attracted to those stunning looking characters - they'll never work in the game environment.
    So I don't waste your time - is there a guide you can point me to for creating or adapting characters for realtime use? I've read that baking is an option, but I'm not sure if that's the right way to go.



  6. #6
    Originally Posted by rextherunt View Post
    Thanks for the encouragement. I'm really new to this. I think what you're saying is don't be attracted to those stunning looking characters - they'll never work in the game environment.
    So I don't waste your time - is there a guide you can point me to for creating or adapting characters for realtime use? I've read that baking is an option, but I'm not sure if that's the right way to go.
    never say never.

    8000 isn't that bad really. LOD is there for a reason - even one low-poly distance step would do wonders in a populous area. baking normals is a great way to preserve detail with low poly.

    once you get familiar with bge materials and nodes, it's very possible to get some really pretty materials. due to all the fakes required, sometimes you'll find it's easier to achieve a desired look than in cycles.

    blender internal uses the same material UI that bge does, so BI material tuts will likely be meaningful.
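    just to show what even one distance step means, here's the idea in plain python (names and thresholds are made up - this is only the concept, not bge code):

    ```python
    # hypothetical LOD step selection - mesh names and distance
    # thresholds are invented, just to illustrate swapping meshes
    # by camera distance.

    def pick_lod(distance, steps=((15.0, "char_hi"), (40.0, "char_mid"))):
        """Return the mesh to show at a given camera distance.

        'steps' is (max_distance, mesh_name) pairs, nearest first;
        anything past the last threshold falls back to the lowest-poly mesh.
        """
        for max_dist, mesh in steps:
            if distance <= max_dist:
                return mesh
        return "char_lo"

    print(pick_lod(10.0))   # up close: the full 8000-face mesh
    print(pick_lod(100.0))  # far away: the baked-down stand-in
    ```

    in blender 2.70+ you get this for free in the object properties (Levels of Detail panel), so you never write it by hand - the sketch is just to show why one extra step already helps.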
    System "IVAN" (rev 1.3b) - Win7 64bit - Blender 2.74:
    CPU- Intel i3-3220 3.30 Ghz | GPU- EVGA GTX 970 | RAM- GSkill Ares 16GB 1600 Mhz | MB- ASUS P8Z77-V LK



  7. #7
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40
    Originally Posted by Daedalus_MDW View Post
    never say never.

    8000 isn't that bad really. LOD is there for a reason - even one low-poly distance step would do wonders in a populous area. baking normals is a great way to preserve detail with low poly.

    once you get familiar with bge materials and nodes, it's very possible to get some really pretty materials. due to all the fakes required, sometimes you'll find it's easier to achieve a desired look than in cycles.

    blender internal uses the same material UI that bge does, so BI material tuts will likely be meaningful.
    Thanks for your input. I don't understand half the words and acronyms you used, but I will persevere!



  8. #8
    Member sdfgeoff's Avatar
    Join Date
    May 2010
    Location
    Kalpana One
    Posts
    5,073
    You are standing at an intersection between two roads. There is a sign post nearby.
    >> look at sign post
    The sign post has two arrows on it, one to the east and one to the west. The arrow to the east reads:
    - Learn how to make a model look good in BGE, and spend a week or two learning BGE's rendering system
    The arrow to the west reads
    - Focus on the kinect integration and download a model that already looks good in BGE

    To the east you can see a glittering golden city, but it is a very long way off, and the people walking along that road have some odd glitches. To the west you see a rather plain looking road, but at least the people walking along it are walking sensibly.
    >>

    --------------------------------------------------------------

    In my opinion you should decide if you want to focus on the visuals of the project. Things like this are possible in BGE (not by me):


    Or, it seems to me that this project is more on the Kinect side, where the visuals are secondary. But it's up to you.

    If you do decide to focus on the visuals, you'll probably become familiar with the material node editor.....
    "Someone applied a roof texture to that wall" - martinsh

    Website: www.sdfgeoff.space



  9. #9
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40
    Originally Posted by sdfgeoff View Post
    Or, it seems to me that this project is more on the Kinect side, where the visuals are secondary. But it's up to you.

    If you do decide to focus on the visuals, you'll probably become familiar with the material node editor.....
    That's a good way of looking at it. At the moment I am in the proof of concept stage - I need to know that this will work in practice before I spend weeks/months/years learning Blender.

    So it's probably best that I just get some interaction going in the game engine with a very simple stick man, and then build up from there.

    I needed to get something working quickly, so I came up with this somewhat convoluted solution:
    I have a character being controlled by the Kinect via the Ni-Mate add-on, rendered in Cycles. I use Syphoner to get the fullscreen viewport into Quartz Composer (via Syphon Client), where I have some 2D "objects". These are affected by OSC or MIDI triggers, also received from Ni-Mate. I can roughly match up the objects' positions so that the character appears to be hitting them.
    The good thing about this is that I can use a detailed character with no problem; the downside is obviously that it's not true interaction, and the Quartz Composer logic for interactions is not ideal.
    MacBook Pro 11,4
    2,2 GHz intel Core i7 | 16 GB 1600 MHz DDR3 | Intel iris Pro 1536 MB | OS 10.12.6 (Sierra)



  10. #10
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40
    Originally Posted by sdfgeoff View Post
    One of my friends at uni did pretty much exactly this: use a kinect to control human characters in BGE. So don't despair...
    Do you, or anyone else, have a contact I could ask a few questions?
    I've got quite far, but I'm stuck at one major obstacle:
    I can receive the Kinect data and move my rigged character around in object mode, but as soon as I hit Play it freezes. The whole aim is to have all the functionality of the game environment, with the character being controlled by the Kinect.

    I have a feeling it's to do with the Ni-Mate game logic - my guess is I need to hook the sensor up to the armature (see screenshot)?
    Anyone who has done this - I'd appreciate some help!
    Screen Shot 2017-10-11 at 4.33.54 PM.png
    MacBook Pro 11,4
    2,2 GHz intel Core i7 | 16 GB 1600 MHz DDR3 | Intel iris Pro 1536 MB | OS 10.12.6 (Sierra)



  11. #11
    Member BluePrintRandom's Avatar
    Join Date
    Jul 2008
    Location
    NoCal Usa
    Posts
    18,232
    in the viewport the agent is probably driven by drivers or bpy?

    post the code?
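    if it is drivers / bpy, those don't run in game. in the GE you pose bones through the armature channels instead - untested sketch, the bone name and data source are made up:

    ```python
    # untested sketch: drive a bone from kinect data inside the GE.
    # bpy drivers don't run in game - you set armature channels instead.
    # the bone name and the data source here are made up.
    try:
        import bge          # only exists inside the game engine
    except ImportError:
        bge = None          # lets the pure parts run outside blender

    def latest_rotation():
        """Stand-in for wherever the Ni-Mate data ends up.
        Returns a quaternion (w, x, y, z); identity for now."""
        return (1.0, 0.0, 0.0, 0.0)

    def apply_pose(controller):
        """Attach as a module-mode python controller on the armature,
        pulsed by an Always sensor with true level triggering on."""
        armature = controller.owner
        channel = armature.channels["upper_arm.L"]        # made-up name
        channel.rotation_mode = bge.logic.ROT_MODE_QUAT   # quaternions
        channel.rotation_quaternion = latest_rotation()
        armature.update()                                 # push the pose
    ```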
    Break it and remake it - Wrectified
    If you cut off a head, the hydra grows back two.
    "headless upbge"



  12. #12
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40
    Originally Posted by BluePrintRandom View Post
    in the viewport the agent is probably driven by drivers or bpy?

    post the code?
    I'm not sure which code you mean, should I post the .blend file?
    MacBook Pro 11,4
    2,2 GHz intel Core i7 | 16 GB 1600 MHz DDR3 | Intel iris Pro 1536 MB | OS 10.12.6 (Sierra)



  13. #13
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40
    I think I might have almost solved it - it may have been that the sensor and Python script were applied to the result armature instead of the capture armature.
    I'm still testing - it seems very buggy. Sometimes the embedded player works, but the stand-alone player never does.
    I don't think this bodes well for creating a stand-alone game.
    MacBook Pro 11,4
    2,2 GHz intel Core i7 | 16 GB 1600 MHz DDR3 | Intel iris Pro 1536 MB | OS 10.12.6 (Sierra)



  14. #14
    Member WKnight02's Avatar
    Join Date
    Aug 2017
    Location
    Earth
    Posts
    55
    In case it helps, I found this: https://github.com/Kinect/PyKinect2
    If the plugin is not designed in a way that fits your needs, you might consider looking at the PyKinect2 repository to see what can be done :)

    This is pretty much uncharted territory for amateur game devs, especially in the BGE, but everything has a start :P
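    If you end up rolling your own receiver, the OSC messages Ni-Mate sends over UDP are simple enough to decode by hand. A sketch of the wire format (I don't have a Kinect to verify against the real stream, and "/Left_Hand" is just an example address):

    ```python
    # Minimal decoder for the kind of OSC packet Ni-Mate sends (one
    # address plus float arguments). Untested against the real stream -
    # the address "/Left_Hand" is only an example name.
    import struct

    def _read_padded_string(data, offset):
        """Read a null-terminated, 4-byte-padded OSC string."""
        end = data.index(b"\x00", offset)
        text = data[offset:end].decode("ascii")
        # OSC strings are null-padded out to the next 4-byte boundary
        offset = (end + 4) & ~3
        return text, offset

    def parse_osc(data):
        """Return (address, [floats]) from a single OSC message."""
        address, offset = _read_padded_string(data, 0)
        tags, offset = _read_padded_string(data, offset)  # e.g. ",fff"
        count = tags.count("f")
        values = list(struct.unpack(">%df" % count,
                                    data[offset:offset + 4 * count]))
        return address, values

    def build_osc(address, values):
        """Inverse of parse_osc - handy for testing without a sensor."""
        def pad(b):
            return b + b"\x00" * (4 - len(b) % 4)
        msg = pad(address.encode("ascii"))
        msg += pad(("," + "f" * len(values)).encode("ascii"))
        msg += struct.pack(">%df" % len(values), *values)
        return msg
    ```

    The build_osc helper is only there so you can round-trip the parser with fake joint data while no Kinect is attached.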



  15. #15
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40
    Originally Posted by WKnight02 View Post
    In case it helps, I found this: https://github.com/Kinect/PyKinect2
    If the plugin is not designed in a way that fits your needs, you might consider looking at the PyKinect2 repository to see what can be done

    This is pretty much uncharted territory for amateur game devs, especially in the BGE, but everything has a start :P
    Thanks for the link. I'm definitely way under-qualified to be trying this! I'm not a programmer, and that Python stuff is beyond me.
    I realise it's a bit crazy for a beginner to be getting into this level...but that's pretty much how I start everything I do: blindly going uphill in the dark, sometimes emerging into the light (with lots of falls and headaches along the way).
    MacBook Pro 11,4
    2,2 GHz intel Core i7 | 16 GB 1600 MHz DDR3 | Intel iris Pro 1536 MB | OS 10.12.6 (Sierra)



  16. #16
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40
    Just a quick update to say that I've found a little glitch - maybe it's already been pointed out in another context? If I start the Ni-Mate plugin and then hit P, nothing happens. I have to press Escape and then hit P again, and then it works. This happens in both the embedded and stand-alone players.

    Now I'm looking for someone who can help me adapt the Python script to start the Ni-Mate receiver in a stand-alone executable. Any advice on where to post such a request?
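    For what it's worth, my rough understanding from poking at the Ni-Mate script is that the receiver is basically a UDP socket listening on the OSC port, polled every frame so it never blocks the game. A sketch of what I mean (quite possibly wrong - the port matches my setup, everything else is a guess):

    ```python
    # Very rough sketch of what I understand the receiver to be: a
    # non-blocking UDP socket on Ni-Mate's OSC port, drained once per
    # frame. Port 7000 matches my console output; the rest is guesswork.
    import socket

    def open_receiver(port=7000):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setblocking(False)            # never stall the game loop
        sock.bind(("127.0.0.1", port))
        return sock

    def poll(sock):
        """Drain any packets that arrived since the last frame."""
        packets = []
        while True:
            try:
                packets.append(sock.recv(4096))
            except BlockingIOError:        # nothing left to read
                return packets
    ```

    If something like this can be called from the game logic each frame, maybe the stand-alone player could own the socket itself instead of relying on the add-on.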
    MacBook Pro 11,4
    2,2 GHz intel Core i7 | 16 GB 1600 MHz DDR3 | Intel iris Pro 1536 MB | OS 10.12.6 (Sierra)



  17. #17
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40
    Argh! I'm going nuts! Is there anyone who can help me?
    I've tested my setup with the Ni-Mate example bunny blend, and it works (although my character gets crazily deformed when it receives the data). By "works" I mean I can receive data from the Kinect to move the character, and when I hit Play, it continues to work.

    I then tried to take the same simple armature and attach a mesh to it. Even a single cube connected to the armature won't play: in object mode I can see the cube following the armature around, but as soon as I hit Play it freezes. I have another cube spinning to check that the game is actually playing and hasn't frozen entirely.

    Here is a short screengrab to demonstrate.
    Here is a very basic file of my test. ni-mate-game test 1.blend

    I must be missing something very basic. I have tried comparing the bunny mesh to see what its properties are, but I cannot for the life of me figure it out.

    I'd really appreciate some advice!
    MacBook Pro 11,4
    2,2 GHz intel Core i7 | 16 GB 1600 MHz DDR3 | Intel iris Pro 1536 MB | OS 10.12.6 (Sierra)



  18. #18
    Member WKnight02's Avatar
    Join Date
    Aug 2017
    Location
    Earth
    Posts
    55
    I think it's because the armature is not moving in the GE...
    Do you have any errors in the console/terminal?

    (I don't have a Kinect so I'm blind here :/)



  19. #19
    Member rextherunt's Avatar
    Join Date
    Sep 2017
    Location
    Cape Town, South Africa
    Posts
    40
    Originally Posted by WKnight02 View Post
    I think it's because the armature is not moving in the GE...
    Do you have any errors in the console/terminal?

    (I don't have a Kinect so I'm blind here :/)
    Thanks for your suggestion...
    No errors in the console:
    Delicode NI mate Tools started listening to OSC on port 7000
    Blender Game Engine Started
    Blender Game Engine Finished
    The GE finishes when I hit Escape - it's not crashing.
    Is there any way of making the armature visible in the GE to test?

    By the way, there's a free version of Ni-Mate and it can be run without a camera/sensor. They provide an example clip with sensor data for testing skeleton tracking.
    Last edited by rextherunt; 14-Oct-17 at 04:41. Reason: typo
    MacBook Pro 11,4
    2,2 GHz intel Core i7 | 16 GB 1600 MHz DDR3 | Intel iris Pro 1536 MB | OS 10.12.6 (Sierra)


