LEAP integration with Blender Game Engine

How are you doing now? I’m also a Leap Motion developer, but I haven’t received my unit yet! I downloaded the Leap Motion SDK and am studying it. I opted to use Python and C++ for my development attempts.

Let me introduce myself. My name is Paul Coones. My handle is upretirementman. I retired from Union Pacific Railroad years back. I’m 71 years old. I have been a Mac person since the beginning of Apple. I graduated from several computer programming schools in the San Francisco/San Jose, California area. I worked two full years as an IBM Autocoder/COBOL programmer/analyst in the ’60s. I taught myself Python and have been writing Blender import/export scripts, several of which are working. I am also studying C++, Objective-C, and Cocoa. I have been accepted as a Leap developer, along with thousands of other developers, and am awaiting my Leap Motion developer unit, which is free. Leap Motion has plans for an app store selling Leap applications developed by Leap developers. I assume they will pay Leap Motion developers under much the same plan as Apple’s App Store. I have downloaded all the Leap documentation and sample code in C++ and Python. It’s also available in C#, Java, and JavaScript.

Have you downloaded the latest version of Leap SDK, Version 0.7.1?

After you unzip the files into folders, be sure to look in the Examples Folder:

You will find several items:

Leap SDK
cpp

In the cpp folder, you will find a README.txt, a FingerVisualizer folder, a ThirdParty folder, and a MotionVisualizer folder.

In the FingerVisualizer folder, you will find a Builds folder:

This folder contains a compiled version, FingerVisualizer.app
When double-clicked, it will open a window with a screen display.
After it is running, you may press ‘h’ for an on-screen menu.
Then you can test out the features and see your unit displaying data.

Also try out the compiled version of MotionVisualizer in the MotionVisualizer folder.
Look in the Builds folder for MotionVisualizer.app.
Double-click it to launch another on-screen demo.
Press ‘h’ for the help menu and try each feature out.

So far, one other person has had success with Leap and Blender integration, but I believe it was done by modifying the _LeapPython.pyd file and recompiling it in Visual Studio. I have been working tirelessly on it, but my _LeapPython.pyd file is not compatible with Blender’s Python 3. Or at least that’s my opinion.

I have received my Leap unit and have now gotten it to work! I downloaded a Leap application called “FlockingDemo” from the examples folder on the Leap Developer website. I ran it and recorded parts of the results as two .mov files. I used a demo version of ScreenFlow and made a trailer:
http://www.youtube.com/embed/ZTo7afCjmVI?feature=player_detailpage

How does it look?

Hey, sorry I missed a lot of your posts. I saw the video, it looks really good!

Any luck yet on getting the Leap to work alongside Blender?

I have made this video with BGE :slight_smile: : https://www.youtube.com/watch?v=9pFpayaR6TU

If you want to use the Blender wrapper you have to make your own _LeapPython.pyd; a tutorial exists in “Questions”.

I am interested in reading the tutorial you mentioned. Where is “Questions”? Can you provide a link to the tutorial? Is it on the LeapMotion.com website?

Well, I took a modified version of Leap’s Python 2.7 frame-data display sample and added a client socket method to send the frame data to Blender. In Blender, I have written a Python 3.3 script that accepts the frame data with a modal server and decodes the incoming client string into the keys/values of a frame-data dictionary. To use the frame data in Blender, just take the dictionary item and assign it wherever you need it in your own Python 3.3 code (see the sketch after the example below).

Example:
frame = {'palm_position': (333.009, 342.33, 123.89)}
To use it:
hand_palm_position = frame['palm_position']

Now, Leap’s values are way too high in some cases to use in the Blender 3D view, so I use

coef = 0.03
Then
a, b, c = tuple(hand_palm_position)
hand_palm_position = (a*coef, b*coef, c*coef)
own.worldPosition = hand_palm_position
Then you can move the object with:
bpy.ops.transform.translate(value=own.worldPosition)
print("moved", own.name, "to new location at", own.worldPosition)
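
Here is a rough sketch of what the sending side could look like, just to give the idea. The port number, the JSON encoding, and the names here are placeholders for illustration, not necessarily what my actual script uses:

import json
import socket
import Leap  # Leap.py shipped with the Leap SDK (Python 2.7)

HOST, PORT = "127.0.0.1", 6000  # placeholder address and port

class FrameSender(Leap.Listener):
    def on_init(self, controller):
        # connect a client socket to the server script running inside Blender
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.sock.connect((HOST, PORT))

    def on_frame(self, controller):
        frame = controller.frame()
        if len(frame.hands) == 0:
            return
        palm = frame.hands[0].palm_position
        # pack the frame data into a small dictionary and send it as one line
        data = {"palm_position": (palm.x, palm.y, palm.z)}
        self.sock.sendall(json.dumps(data) + "\n")

listener = FrameSender()
controller = Leap.Controller()
controller.add_listener(listener)
raw_input("Press Enter to quit...")  # Python 2.7
controller.remove_listener(listener)

On the Blender side, the modal server just reads a line, runs it through json.loads(), and you end up with the same kind of dictionary as in the example above.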

Check out my 3d physics mouse cursor,

I think you could use it with this,

Left-click when the box is on a cube and it picks it up and moves it with forces.

You could gut this, and use just some of the Python, to have the object move and/or track the Leap controller.

Attachments

MousePhysicsDemo (Current).blend (984 KB)

I downloaded it and tried it out on my Mac mini running Mountain Lion (Mac OS X 10.8.5) and Blender 2.68. It’s just what I need for my demo, except for the mouse control, and I’ll use Leap frame data instead. I’m sure you would love to have the frame data for your demo also?

You know we can’t use Leap Motion directly in Mac Blender so far. What I have been working on is compiling LeapPython.so for Python 3.3 and Blender on 64-bit Mac OS X. I just can’t get SWIG installed yet. Perhaps someone can help out with that project? Does Xcode 5 on Mountain Lion come with SWIG installed?

To get LeapPython.cpp we need to do:

swig -c++ -python -o LeapPython.cpp -interface LeapPython Leap.i

Then do:

clang++ -arch i386 -arch x86_64 -I/Library/Frameworks/Python.framework/Versions/3.3/include/python3.3m LeapPython.cpp libLeap.dylib /Library/Frameworks/Python.framework/Versions/3.3/lib/libpython3.3.dylib -shared -o LeapPython.so

I think SWIG is supposed to produce the Leap.py proxy module at the same time, and that should be importable from the Blender Text Editor.
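
Once LeapPython.so builds and sits next to the SDK’s Leap.py, importing it from a script in the Blender Text Editor should look roughly like this (the path below is only a placeholder for wherever you keep the two files):

import sys
sys.path.append("/path/to/LeapSDK/lib")  # placeholder: folder holding Leap.py and LeapPython.so
import Leap  # Leap.py loads the compiled LeapPython extension

controller = Leap.Controller()
frame = controller.frame()  # latest frame; it stays empty until the device has connected
if len(frame.hands) > 0:
    print(frame.hands[0].palm_position)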

Another camera you might want to consider:
http://click.intel.com/intelsdk/Creative_Interactive_Gesture_Camera_Developer_Kit-P2061.aspx
And the SDKs:
http://software.intel.com/en-us/vcsource/tools/perceptual-computing-sdk
http://software.intel.com/en-us/articles/the-intel-skeletal-hand-tracking-library-experimental-release

It gives you access to the raw data streams, skeletal hand tracking with full 6 DOF and 17 bones, face tracking, voice recognition, etc., whereas LEAP tends to hide things… (I have the LEAP devices also.)

I’ve already got an early prototype:

I wrote a C++ program that feeds data through an IPC system into the BGE over local TCP sockets (on Linux, I would use Unix domain sockets), although I would prefer to create a Python module. It seems laggy in the video, but that’s due to the way I implemented it and calculate things in the BGE; in reality the latency is minimal.
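
For reference, the BGE end of a setup like this can stay quite small: a script run from an Always sensor in module mode (e.g. leap_receiver.update) that polls a non-blocking socket each logic tick. The port, the JSON-lines format, and the 0.03 scaling below are just placeholder choices echoing the earlier posts, and there is no error handling:

import json
import socket

def update(cont):
    own = cont.owner
    # connect once and keep the non-blocking socket in a game property
    if "sock" not in own:
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.connect(("127.0.0.1", 6000))  # placeholder address and port
        sock.setblocking(False)
        own["sock"] = sock
        own["buffer"] = ""
    try:
        own["buffer"] += own["sock"].recv(4096).decode("utf-8")
    except BlockingIOError:
        return  # no new data arrived this logic tick
    parts = own["buffer"].split("\n")
    own["buffer"] = parts.pop()  # keep any partial message for the next tick
    if parts:
        frame = json.loads(parts[-1])  # use only the most recent complete frame
        x, y, z = frame["palm_position"]
        coef = 0.03  # scale Leap millimetres down to Blender units
        own.worldPosition = (x * coef, y * coef, z * coef)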

I played with the MousePhysicsDemo for a while this evening and it’s fun. It reminds me of the game of tag we played as kids! You tag one of the cubes and then it chases you around until it gets back into your empty bin. You could make a neat game out of it; just keep track of a score of some kind so that it would eventually end. As it is, you can’t keep away from the cube chasing you, as it always wins. If you go down through the platform, it knocks the cube out of your bin for a time, until it eventually finds you again by chasing you. I like the way you can go round and round the area, trying to get away from the cube!

the cubes can be assembled :smiley:
https://youtu.be/NscWYvFw0BY

The multicolored cubes are actually 3D logic nodes; press T to send the property “Fire”.

There was no error in that statement. The error is in the previous statement, not a comment. Check for matching “(” and “)”. There should be a closing “)” for every opening “(”. If you’re missing one or the other, then Python gives a SyntaxError.
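
For example (a made-up two-line snippet), a missing “)” on one line makes Python report the error on the next one:

coords = (1.0, 2.0, 3.0    # the closing ")" is missing here
print("moved to", coords)  # ...but the SyntaxError is reported on this line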