Leap Motion Support

It is my strong suggestion to implement unique support for the upcoming Leap Motion technology.

Blender has a unique opportunity with this technology, as it is open source and much more flexible than other 3D software. For reasons obvious to artists and techies alike, this will be of great benefit to Blender and its users. The possibilities for Blender to wrap itself around this technology in a unique way are endless.

It is my request that the Blender Foundation heavily engage in supporting this next-generation technology. This will reap obvious benefits, from user productivity and creativity to gaining more ground as a solid 3D application. I know this can be done in a way that will only benefit Blender and make it more user-friendly and professional at the same time.

Getting in on this early and being one of the first 3D apps to uniquely support Leap Motion is a great opportunity that I hope is taken advantage of.

Leap Motion Website:

Leap Motion Demo:

Leap Motion Developer Sign Up:

https://www.leapmotion.com/developers

Thank you for your time.

This looks like it could give Wacom a run for its money.

And you can back this claim how?
I am all for innovation and usually have skeptical interest rather than diffuse rejection. But I read your post and I see… “I want this in Blender”

On a sidenote, I want to get one of those :wink:

I know there are at least two devs already looking at leap.

It definitely looks interesting enough. But I’d like to see a little broader demo of the 3D aspect of its capabilities.

I know I probably wouldn’t use it even if Blender did integrate it; it looks very annoying to use when modelling.

Arexma,

I am assuming that one will do further research into the technology and use their own intellect to see the possibilities that this can give to the Blender community. Hence my aim at claiming this can benefit us all. My original thought was to go more in-depth about the technology; however, I believe it speaks for itself when looked into, hence the “this will reap obvious benefits”.

If it really came down to a personal want, I would not have a problem checking out the API and getting a few developers to implement it for my own personal use, or for solely the use of those that wanted to download an add-on to use the technology with Blender. You can click on the link to the developer page and see how easy this would be to do for personal use.

My intention was not aimed at self, but was an expression of something I personally think has, once again, obvious benefits (and I was confident enough in the technology to think others would think the same). If you take the time to do your research, I’m sure you’ll see to some degree what I am talking about. Out of respect for people who might have wanted to skim through the thread, I kept it simple and to the point. It was not what you have perceived it to be, nor is it the idea you have projected onto me personally that I just “want this in Blender”.

The technology is new and promising. From a creative standpoint, the possibilities are basically what you think of, you can do (when it comes down to hand gestures, though the algorithms don’t only recognize hand gestures). You would see this had you looked into what I have proposed to this community in further depth. You seem to be assuming I have a very far-fetched idea that is not applicable to Blender, that I just “think is cool” and want in the software even though it has no practical use.

I am not an employee of Leap Motion and am not affiliated with them in any way. I am not a spokesperson for the company, and I did not intend to come across as you have perceived me to. Like I said, a dev kit is easily obtained and it can easily be implemented into any piece of software. I was assuming people such as yourself could look a little deeper and see how this could be uniquely taken advantage of.

You seem to be an intelligent person; I don’t think clicking a few more times into further links on the website I posted would be that hard for you, and I don’t think you can’t see possibilities with this technology. I am simply attempting to raise awareness of benevolent technology that, when looked into, can, like I said, reap obvious benefits. If you would take the time to look at the idea I am presenting in depth (Google is your friend), and not (for whatever reason) assume I need to go over every nook and cranny of the once again obvious benefits, I’m sure you’ll agree to some degree as well.

Creatively everyone is different and could use it in their own unique way. I can’t speak for how certain individuals would use this when modeling in the 3D View, or texture painting, or setting up nodes without having to run your mouse across the screen; that would be unfair. Technologically, I think you could figure out how this works for yourself. I can’t speak on the patented algorithm that recognizes hands in 3D space, as I don’t know it, nor do I claim to. For everything else, I guess I could make a PowerPoint for you, if you need some help with the math they might be using, or don’t understand how they programmed it/how programming works, if you can’t figure out how to use Google, that is.

Hopefully you now understand why I said “for obvious reasons”; I am assuming it’s obvious to those with any creative talent and/or technical skill.

I can have a PowerPoint on how to use Google’s “Search” function online in about 5 minutes; however, if you want to know the depths of Leap Motion, I’m sure I could figure out quite a bit about how it works in greater detail and email you the PowerPoint sometime next week, as I have a few projects I am working on at the moment that are taking up ample amounts of my time.

Thanks.

Giordin

If it really came down to a personal want, I would not have a problem checking out the API and getting a few developers to implement it for my own personal use, or for solely the use of those that wanted to download an add-on to use the technology with Blender.

Wait a minute… are you saying that you personally have the resources to get LEAP support implemented into Blender, but since you think it is so “obviously useful” for everyone, the Blender Foundation should spend their resources on implementing it, instead?
It doesn’t work that way, at all. Anybody who is interested in it is invited to add support for new input devices (such as LEAP or Kinect), but the BF doesn’t have the resources for something entirely non-essential like that.
I’m not convinced yet that this device can be useful for Blender (as opposed to being a toy) - that would require thorough evaluation of its capabilities, especially its accuracy. This is exactly the kind of thing the (developer) community should work on on its own, so if you have good ideas and the means to implement them, please go ahead.
I’ve seen two people play around with the SDK so far, I believe the first problem they had is that it doesn’t support Python 3 out-of-the-box (which Blender requires).
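As a side note on the Python 3 problem: one possible workaround (a sketch of my own, not necessarily what those developers are doing) is to skip the official SDK bindings entirely and read the JSON frames the Leap service streams over its local WebSocket interface, which any Python 3 client can parse. The frame layout below (`hands`, `palmPosition`) is an assumption based on the published protocol, so treat the field names as illustrative:

```python
import json

# A Leap-style JSON frame, roughly as delivered by the service's local
# WebSocket endpoint. The exact schema here is an assumption based on the
# published protocol, not something verified against the SDK.
SAMPLE_FRAME = """{
  "id": 1234,
  "timestamp": 1000000,
  "hands": [
    {"id": 1, "palmPosition": [12.5, 180.0, -30.2]}
  ]
}"""

def palm_positions(frame_json):
    """Return a dict mapping hand id -> (x, y, z) palm position in mm."""
    frame = json.loads(frame_json)
    return {h["id"]: tuple(h["palmPosition"]) for h in frame.get("hands", [])}

print(palm_positions(SAMPLE_FRAME))  # one tracked hand in this sample frame
```

Since only the standard library is involved, a parser like this could feed hand positions into a Blender add-on without touching the Python 2-only bindings at all.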

No, that is not what I am saying.

No, it’s just how you sound.

No, that is not what I am saying. You are assuming I am saying so, the second individual to assume I have immature/unthoughtful intentions in this matter. Please read my posts thoughtfully. You misquoted my words.

I am coming at this more from the view of having the BF more heavily integrated with LM technology from a marketing and technology perspective: deep integration, heavily using new technology, in a way that when people think of LM and a 3D app to go with it, their first thought is “Blender”. I can’t make the BF call LM and shake hands, but I can attempt to make this a much-wanted feature that the BF could raise an eyebrow at.

Before you tell me that I shouldn’t tell the BF how to market or what to market… I am not doing that. I am suggesting. If it will calm you down, I’ve made a living in marketing and would gladly help out with marketing. I am in no way demanding all resources from the BF go immediately to implementing this because one user of Blender with 9 posts on the BA forum says so. I am simply trying to raise awareness and would like to find others who would like to do the same. My initial post may not have made that clear. In short: taking advantage of an opportunity.

If this is not realistic, or possible, that is fine, and there is no need for quarrel. One could ask devs, or hire devs, to implement it in Blender, which could easily happen, as Blender is open source and anyone can program anything into it to their heart’s desire; however, they can’t make the BF support it heavily and market its support alongside Blender in a big way. Raising awareness to a higher degree, and having the BF really take a look at this in a deeper way and see the possibilities of the two integrating, can be beneficial for the Blender community. Nothing I am saying here is meant to provoke or cause a ruckus. It seems as if users of this forum are ready to assume the worst at any given moment or lack of understanding. There is no need for animosity.

…but the BF doesn’t have the resources for something entirely non-essential like that

You are speaking on behalf of the Blender Foundation. In your opinion it is non-essential, which is fine.

…that would require thorough evaluation of its capabilities, especially its accuracy.

“Leap’s device tracks all movement inside its force field, and is remarkably accurate, down to 0.01mm. It tracks your fingers individually, and knows the difference between your fingers and the pencil you’re holding between two of them.”

Source: http://www.kurzweilai.net/a-look-inside-leap-motion-the-3d-gesture-control-thats-like-kinect-on-steroids

Feel free to do extended research. Try Google (http://www.google.com), or click on one of the three links I provided in the initial post.

This is exactly the kind of thing the (developer) community should work on on its own, so if you have good ideas and the means to implement them, please go ahead.

Thank you for your opinion. I did not say I was not going to implement any support for it on my own or with fellow developers.

Development-wise, I can get it up and running to some degree with my own resources, the basic things it claims to do. Beyond that, of course it would be a bigger task to get it heavily integrated into Blender in its own unique way. In a sense having Blender showcase it doing more than is expected. I am sure you will somehow take this last sentence as a show of disrespect towards the BF or BC, by suggesting innovation, in a respectful, optimistic, ambitious manner.

Please thoughtfully read my posts, knowing I have benevolent intentions. If you do feel the need to pick apart and reword my words, or assume the worst about my suggestions, that is fine. If you feel the need to respond, please do so without assuming my intentions are self-centered and/or unthoughtful, as this is not the case.

Thanks

Giordin

I think defining what is non-essential is a bit of a grey area within Blender development. The BF has been charging down the road of VFX for the last few major releases, producing tools which 95% of Blender’s user base will never use. We get camera tracking in the space of a release, but it takes 5 years to get a set of decent modelling tools. Wacky. And now we get OSL… where I can spend 3 days coding a pretty spiral pattern for my shaders.

Personally, I think things like Leap Motion are the future of interacting with a computer. The days of mouse and keyboard are fast becoming a thing of the past. Tablets are outselling PCs, and we need to be looking for alternatives for production tools. The creative possibilities of manipulating a model with my hands are, for me, really exciting.

Anything which would allow me to get away from a desk and mouse is quite frankly a superb prospect. After working with computers for 15 years, my body is shot to bits. It’s amazing how sitting down all day is so bad for you.

I am assuming nothing, that’s how your words arrived on my end. I did quote you verbatim.

I am in no way demanding all resources from the BF go immediately to implementing this because one user of Blender with 9 posts on the BA forum says so.

No, you aren’t saying that and neither did I. You were, however, using words like “strong suggestion” and requesting “heavy engagement”.

You are speaking on behalf of the Blender Foundation. In your opinion it is non-essential, which is fine.

I am not speaking on behalf of the BF, I am speaking as an outside observer.

“Leap’s device tracks all movement inside its force field, and is remarkably accurate, down to 0.01mm. It tracks your fingers individually, and knows the difference between your fingers and the pencil you’re holding between two of them.”

Maybe their device is capable of capturing data at such resolution. The question is whether their software can reliably turn that into meaningful signals which a developer can use.
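To illustrate the point: turning raw sub-millimeter samples into a stable signal is a filtering problem. A minimal sketch (my own illustration, not anything from Leap’s actual software) is an exponential moving average, which trades a little latency for jitter suppression:

```python
class EmaFilter:
    """Exponential moving average over a stream of position samples.

    Higher alpha follows the raw data more closely (more jitter passes
    through); lower alpha smooths harder (more lag).
    """
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.value = None  # no estimate until the first sample arrives

    def update(self, sample):
        if self.value is None:
            self.value = sample
        else:
            self.value = self.alpha * sample + (1 - self.alpha) * self.value
        return self.value

# Noisy readings jittering around 100.0 mm settle toward 100.0.
f = EmaFilter(alpha=0.5)
smoothed = None
for s in [100.3, 99.8, 100.1, 99.9, 100.0]:
    smoothed = f.update(s)
```

Real gesture software would likely use something more sophisticated (velocity-aware filters, hysteresis for gesture start/stop), but even this shows that raw sensor resolution and usable signal quality are separate questions.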

In a sense having Blender showcase it doing more than is expected. I am sure you will somehow take this last sentence as a show of disrespect towards the BF or BC, by suggesting innovation, in a respectful, optimistic, ambitious manner.

?

Please thoughtfully read my posts, knowing I have benevolent intentions. If you do feel the need to pick apart and reword my words, or assume the worst about my suggestions, that is fine.

We probably have a different understanding of the word “benevolent”. As for your words: consider using fewer of them.

You probably have a wrong picture of what the BF is doing. The VFX stuff was mostly a GSoC project; the person who did it (Sergey) has since been hired part-time. Also, as opposed to BMesh, which was written (and rewritten) from scratch predominantly by outside volunteers, the camera tracking integrates existing libraries. OSL, on the other hand, was part of Brecht’s plan (and code) for Cycles from the beginning, before he was employed (again) by the BF.
If you want to see what the few BF-employed developers are really spending their time on, you just need to look at the commit logs for Sergey, Campbell and Brecht. You’ll see that it is mostly bug-fixing and general maintenance.

The cool stuff simply has to be developed by outsiders, because if something is interesting, people will even work for free. That fact needs to be leveraged.

As for LEAP being non-essential: For now, it is an esoteric input device. Even if you added support for it, it wouldn’t allow you to do anything that you can’t already do in a different way, in the same way that sculpting or painting tools don’t require a graphics tablet.

Write less, communicate more.
You’d be a great politician.

I have known about Leap for a while now, actually since it was just a concept, and it’s been discussed several times on this forum already. (I guess you didn’t look; just like Google, one can use the forum search.)

Anyway. You come here, tell us how beneficial a still-esoteric input device would be, and expect everyone to do their own research on how right you are with your claims. Your cute sarcasm practically translates, on my end, to: everybody not seeing it has to be stupid or non-creative, and the few devs we have simply have to jump on it.

I am not yet convinced, just like Zal, that this would work great for Blender, which is strongly hotkey-driven, with multiple operations on a single key via modifier keys…
While it’s surely fun to replace each of them with a gesture, I don’t exactly see myself working a 10-hour day in the studio with my hands stretched over the device, wielding spells like a wizard.
I dare you to post a youtube video stretching your arms afloat towards the screen, playing an invisible piano for an hour… :wink:
And no, you can’t rest your arms on the elbows; that would translate up/down movement of your hands into arcs with the forearms, thus resulting in back/forward movement as well, which Leap3d tracks too.

The 3Dconnexion devices are great too, but their use is limited: they’re great for sculpting and texture painting when using a tablet. Other than that? Pretty useless, and I rarely use mine for those either, as I use the hotkeys to manipulate my brushes, keeping a nice 90° angle for my arms, lowered shoulders, and my hands resting comfortably on the mouse and the keyboard…

Whenever another motion technology is announced, the internet bursts with rants, and the complaints are always baseless. I have a Nintendo Wii, and playing Zelda for 10 hours is about as tiring as using a mouse when modeling, and less tiring than using a graphics tablet for the same amount of time. It is a bit tiring, but after a few days anyone can get accustomed to it.
Leap only detects a small amount of space above it, so we can rest the forearm on the desk and use only the hand. Not really annoying, IMHO. Especially when you compare it to using a Wacom Cintiq in vertical orientation.

Just to let you all know… I got a development kit, and I’m working on it. Actually there are four or five of us working on an add-on for leap support.

Nice! Does it work as well as the videos promise?
Looking very much forward to seeing your work!

@giordin is working on it. Good times.

This thread got off on the wrong foot. I think integrating Leap 3D support, at least for sculpting and painting, makes a lot of sense. That said, for operations that do not benefit from organic motion (most of them, let’s be honest) there is no benefit. But it doesn’t take a genius to see that, possibly, adding support for this could make sculpting (especially with dyntopo) a very natural-feeling experience. I think that would be exciting, and it would be a nice improvement for certain workflows. Which is not exactly earth-shattering overall, but for someone like Michalis, or another user who sculpts a lot, maybe it would be, I dunno.

I have a Leap and I think it would be very useful for painting 3D models. You can calibrate it so it points where you point on the screen. It is a little buggy and almost never detects my thumbs, but it might just be my lighting.