Leap Motion Blender integration

Hello guys,

I got my Leap Motion and I want to develop a Blender integration.

I have never worked with Blender or other CAD programs, since I mostly write code, but I want to develop a Blender integration. (You may ask why I'm developing for it? I have my reasons ^^)

Over the next few days I want to play around with the Blender and Leap Motion Python APIs, but in the meantime, I'd like you to describe what the plugin should do. I don't mean the coding aspect, but the control aspect: how do you want to control Blender with gestures?

First, I want general navigation: how to move around the scene.

You can also describe how to select objects and move them, but that would only be implemented later, if it can be done in a good way.
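To make the question concrete, here is a minimal sketch of one way palm movement could be mapped onto viewport navigation. All the names, scale factors, and thresholds below are my own assumptions for discussion, not part of the Leap SDK or Blender API:

```python
# Hypothetical sketch: map a palm-position delta (Leap reports positions
# in millimetres) onto a viewport pan vector. The scale and deadzone
# values are guesses to illustrate the idea, not tuned constants.

def palm_delta_to_pan(prev_palm, curr_palm, scale=0.01, deadzone=2.0):
    """Convert palm movement between two frames into a pan vector.

    prev_palm, curr_palm: (x, y, z) tuples in millimetres.
    scale: assumed millimetres-to-Blender-units factor.
    deadzone: ignore jitter below this many millimetres.
    """
    delta = tuple(c - p for p, c in zip(prev_palm, curr_palm))
    magnitude = sum(d * d for d in delta) ** 0.5
    if magnitude < deadzone:
        return (0.0, 0.0, 0.0)  # treat tiny movements as sensor noise
    return tuple(d * scale for d in delta)

# Example: the hand moved 30 mm to the right between two frames.
print(palm_delta_to_pan((0, 200, 0), (30, 200, 0)))
```

A deadzone like this seems necessary because the raw positions jitter from frame to frame; whether orbit and zoom should use separate gestures (e.g. a closed fist vs. an open palm) is exactly the kind of control question I'm asking about.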

-Mator

PS: Sorry for my horrible English, but it’s late now (for me)

I’ve been playing with the Leap Motion sensor in the Blender Game Engine, building on @ntares’ previous effort, and it works well.

The problem is that the fingers pop in and out all the time (a multi-camera solution would solve this), and the fingers don’t appear to have unique identifiers. It just tracks the fingertip points and assigns a collision object to each one on a frame-by-frame basis.
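One possible workaround for the missing stable identifiers is to match each fingertip in the current frame to the nearest fingertip from the previous frame and reuse that ID when the jump is plausible. This is my own frame-to-frame matching idea, not something the SDK provides, and the `max_jump` threshold is a guess:

```python
# Sketch: give fingertips pseudo-stable IDs by nearest-neighbour
# matching against the previous frame. Tips that pop in get fresh IDs;
# tips that pop out simply drop from the result.

def match_fingertips(prev, curr, max_jump=40.0):
    """prev: dict of id -> (x, y, z); curr: list of (x, y, z) points.
    Returns a dict of id -> point, inventing new IDs for unmatched tips.
    """
    assigned = {}
    unused = dict(prev)
    next_id = max(prev, default=-1) + 1
    for point in curr:
        best_id, best_dist = None, max_jump
        for fid, old in unused.items():
            dist = sum((a - b) ** 2 for a, b in zip(point, old)) ** 0.5
            if dist < best_dist:
                best_id, best_dist = fid, dist
        if best_id is None:
            best_id = next_id  # tip popped in: assign a fresh ID
            next_id += 1
        else:
            del unused[best_id]  # each old tip matches at most once
        assigned[best_id] = point
    return assigned

prev = {0: (0.0, 0.0, 0.0), 1: (50.0, 0.0, 0.0)}
curr = [(52.0, 1.0, 0.0), (2.0, -1.0, 0.0)]
print(match_fingertips(prev, curr))  # both tips keep their previous IDs
```

This wouldn't fix the underlying occlusion problem when the hand closes, but it might keep the collision objects from being reshuffled every frame while the tips are visible.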

You can imagine using this ability to reach into the virtual scene and grab things, but as you close your hand on a virtual object you lose fingertip tracking and the party is over. It feels like mocap without enough cameras.

I have also experimented a bit with Leap Motion and Blender. Here you can see my efforts:


I published the Python code and the blend file in this Blender Forum:
http://www.blendpolis.de/viewtopic.php?f=17&t=45350#p484210

I would like to discuss programming Leap Motion in Blender with people who are willing to share their code publicly.

I’m up for it if you still are. That’s what led me here in the first place.

I can’t help thinking that this is a far less accurate method of control than the good old keyboard and mouse.

Mice only work in two dimensions. This thing gives decent 3D control, plus some gesture recognition from tracking multiple fingers that may be hard to integrate. My hope is that not having to move the interaction plane all the time will make things easier. I imagine fine detail work will always be best on a Wacom or a good optical mouse, although movement resolution will always be affected by the zoom level, so who knows.
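The zoom dependence mentioned above could be handled by scaling the hand-to-scene mapping with the current view distance, so the same gesture covers roughly the same fraction of the screen at any zoom. A rough sketch; the linear scaling law and the constants are my assumptions, not anything Blender does:

```python
# Sketch: scale a hand movement (in millimetres) into scene units,
# proportional to how far the viewport camera sits from its pivot.
# base_scale is an illustrative guess, not a tuned value.

def hand_to_scene(hand_mm, view_distance, base_scale=0.005):
    """Map a hand displacement into scene units, scaled by zoom level.

    hand_mm: (x, y, z) displacement in millimetres.
    view_distance: distance from viewport camera to its pivot, in
    scene units (an assumed proxy for the zoom level).
    """
    return tuple(h * base_scale * view_distance for h in hand_mm)

# The same 20 mm gesture, zoomed in vs. zoomed out:
print(hand_to_scene((20.0, 0.0, 0.0), view_distance=2.0))   # small step
print(hand_to_scene((20.0, 0.0, 0.0), view_distance=50.0))  # big step
```

With a scheme like this, precision near the model comes from zooming in rather than from the sensor itself, which might narrow the gap to a mouse for coarse work.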

Sorry for digging up this thread, but I really dug through the internet without finding the answer to my question…

What is the song used in your video? This one, to be exact: https://www.youtube.com/watch?v=OcysVd_X3qI

And so that I’m not completely OT: I have some lecture-free time coming up and I was thinking about getting into Blender (Python) plugin programming. I was thinking about a plugin for Leap Motion, because I could not find any good one for it yet.

@MagBen, so if you’re still looking for someone to discuss a little project, I’d be happy to take part.

[QUOTE=MatorKaleen;2450898]
how do you want to control blender with gestures?

As far as I can understand, the LeapMotion is quite good at recognising shapes, and therefore objects. I would want to move and turn my project using my fingers, and draw using a pencil. Selecting tools and brushes works just fine with keyboard/mouse I think =)

Now what would be really cool is to combine this idea with VR goggles! You could “see” your model floating above your Leap and interact with it using your fingers and a pencil… this would redefine awesomeness…

I started it on Git; it doesn’t work yet, so please be my guest:
https://github.com/catafest/LeapMotionBlender

Can the movement be saved as an FBX?