Blender and the Leap Motion

The Leap Motion is a motion-tracking device that tracks your fingers and converts their movement into motion in a 3D environment on your computer. More details on their website: https://www.leapmotion.com/ and on YouTube.

Could Blender make good use of this piece of hardware? I think it would be something rather interesting to work with: being able to manipulate objects in a scene with your own hands, or maybe even with sculpting tools, since it also supports tracking foreign objects. Would it be possible to implement something like this in Blender?

Ton got them to seed a dozen or so devices to blender coders/artists but I don’t think anyone has really started working on proper blender integration yet.

The main problem is that the licence, AFAICT, isn’t GPL-compatible, which means it isn’t possible to link with the Leap library, and without it the thing is just a fancy paperweight. I’ve heard there’s a demo for the BGE floating around somewhere, but I haven’t tried it out yet.

The secondary problem is that I still don’t see any real use case for the thing, even after playing around with the demo apps it comes with. Considering how much people whine… er, are concerned about performance for things like sculpting and the viewport, having to go through Python to get tracking data would simply be unacceptable for anything but a toy demo.

And Linux support is still kind of buggy/incomplete, which is the main reason I personally haven’t attempted to do anything with it yet – is it a Leap bug, a Blender bug, a bug in my code? Who knows?

If/when the license thing is ironed out: I’ve been working on some Python bindings for the thing that could be plopped right into Blender with a minimum of fuss, and I have a fairly good idea of how it could be integrated into the GHOST event system. We’ll see…
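One small piece any such bindings would need is a coordinate conversion: the Leap SDK reports positions in millimetres, right-handed with +y pointing up away from the device, while Blender’s world space is metres with +z up. A minimal sketch of that mapping (the function name `leap_to_blender` is my own, not from any SDK or from the bindings mentioned above):

```python
# Hypothetical sketch: map a Leap Motion position sample into a
# Blender-style coordinate. Leap reports millimetres, y-up; Blender
# uses metres, z-up. This is an assumed helper, not real SDK code.

def leap_to_blender(pos_mm, scale=0.001):
    """Map a Leap (x, y, z) sample in mm to Blender's z-up space in metres."""
    x, y, z = pos_mm
    # Leap:    +x right, +y up,               +z toward the user.
    # Blender: +x right, +y into the screen,  +z up.
    return (x * scale, -z * scale, y * scale)

if __name__ == "__main__":
    # A palm hovering 200 mm above the device, slightly right and forward:
    print(leap_to_blender((50.0, 200.0, -30.0)))  # roughly (0.05, 0.03, 0.2)
```

The axis swap is the part that is easy to get wrong; the scale factor is just mm-to-m and could be exposed as a user preference.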

I have the Leap Motion and am very interested in this for object manipulation, sculpting, controlling hand armatures and so on.

One of the key issues I have with the Leap Motion is that they jumped the gun releasing version 1 of the software. If you have the Leap Motion, try the new beta v2 software. It is like night and day, with full skeletal tracking that is much more reliable and proves that the hardware is very capable. Leap should have waited until v2 to release the Leap Motion, as v1 generated a lot of negative PR.

One of the best features I’ve played with, at least where Blender is concerned, is the tool tracking. Basically, hold a tool like a number 2 pencil up to the Leap Motion and it recreates it on screen. The tracking of the stylus is very well done and runs with less than 5 ms of lag. This will be great for tracking a single point in 3D space, which has been hard to do without expensive hardware or without using a relative-motion tool like a 3Dconnexion 3D mouse.
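Even with that low lag, raw tip samples from any optical tracker jitter a little frame to frame, so a Blender integration would probably want light smoothing before driving a brush or cursor. A minimal sketch, assuming nothing about the Leap SDK itself – just an exponential moving average over (x, y, z) samples:

```python
# Hypothetical smoothing filter for tracked stylus-tip samples.
# Trades a small amount of added latency for a much steadier point.

class TipSmoother:
    """Exponentially smooth a stream of (x, y, z) tip samples."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha   # 0..1: higher = more responsive, less smooth
        self.state = None    # last smoothed position, or None before start

    def update(self, sample):
        if self.state is None:
            self.state = sample
        else:
            a = self.alpha
            self.state = tuple(a * s + (1 - a) * p
                               for s, p in zip(sample, self.state))
        return self.state

if __name__ == "__main__":
    sm = TipSmoother(alpha=0.5)
    for tip in [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (2.0, 0.0, 0.0)]:
        print(sm.update(tip))
```

With `alpha=0.5` a sudden 2-unit jump is halved on the first frame and converges over the next few, which is usually enough to hide sensor noise without feeling laggy.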

I don’t know much about the licensing, but I’ve seen others do it with v1 on YouTube, so it should be possible to add it with a separate plugin.

My Leap Motion managed to brick itself within ~1 minute of use (it wanted some update immediately, which started but didn’t finish); now the light won’t turn on and it’s not recognized by my system.

Just to say it’s not just devs being lazy – I would have liked to try this thing!
(I’ve worked on tablet, joystick & NDOF integration before… it’s interesting.)

Sorry to hear that – it sounds like the firmware update went wrong and may have bricked it. Mine updated its firmware as soon as I installed it, without issues. Have you tried returning it or sending it in for repair?

It’s rare to find a device like this that can do absolute (XYZ) tracking at $80. Even with both hands I measure less than 20 ms of lag. Its tracking isn’t perfect, but I’d say it’s more than good enough for Blender use. I’ve looked into haptic pens that are tracked in 3D space, but those systems can run into the hundreds (http://www.geomagic.com/en/products/phantom-omni/overview), versus the Leap Motion with a simple pencil and a large range.
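The absolute-versus-relative distinction matters more than it might sound: a relative device reports deltas, so any small per-sample measurement bias accumulates as drift, while an absolute tracker re-measures the true position every frame. A toy illustration with made-up numbers (not real sensor data):

```python
# Hypothetical illustration of drift from dead-reckoning relative motion.
# A relative device (like an NDOF mouse) sums deltas; a constant tiny
# bias per sample then grows into a large position error over time.

def integrate_deltas(deltas):
    """Dead-reckon a 1D position by summing relative motion samples."""
    pos = 0.0
    for d in deltas:
        pos += d
    return pos

true_motion = [0.1] * 100           # the hand really moved 10.0 units total
bias = 0.01                         # tiny constant per-sample sensor bias
measured = [d + bias for d in true_motion]

drift = integrate_deltas(measured) - 10.0
print(round(drift, 6))              # the bias accumulates into ~1.0 unit of error
```

An absolute tracker like the Leap never integrates, so the error stays bounded at the per-frame noise level instead of growing.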

It’s great that someone is trying to integrate the Leap into Blender, but I wouldn’t get my hopes up about this person’s code making it into trunk. The author calls it a “hack and slash” method. That’s not to say I’m unimpressed: having tasted Blender coding/scripting myself, I know what a challenge it is to create something like this.