Integrating your real hands with Blender

Hi, first of all, I know there are already some other threads about using the Leap Motion with Blender, but I'm posting this as a new thread since I feel my intentions are a bit different from the others.
Also, prepare yourself for a long journey through the encyclopedia of my idea. And when you’re done you will never be the same…

So, I have an idea for a Blender addon/plug-in that, using a motion capture device (like the Kinect or Leap Motion) and a motion capture addon (like Bloop or Brekel), lets you bring your hands into Blender's work area.
More specifically, you would have a pair of 3D hands in the 3D viewport that move like your hands in real life.
With these hands you could select objects by tapping them, grab them to move them, rotate them, and edit them as you would if you were touching them in real life. Sounds nice, right? ;)

As I mentioned, there are already motion capture addons/plug-ins that can use a Kinect sensor. So if the "3D hands" are ordinary rigged hand meshes driven by the motion capture addon, the "hands" addon itself would only need to find meaning in the hand movements, like grabbing or touching an object.

Since I have no coding experience these are just ideas (another reason I'm making a new thread), and I hope someone who does have the coding experience will like them.

On to my ideas for the hands.
First of all, these hands would pass through objects without colliding with them, but every movement and every touch of an object would be detected.
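
I imagine the "touch" check could look roughly like this (just a sketch, assuming the Blender 2.8x Python API; the fingertip position would really come from the mocap addon):

```python
# Is a fingertip close enough to an object's surface to count as a "touch"?
import bpy
from mathutils import Vector

TOUCH_DISTANCE = 0.02  # how close (in Blender units) counts as touching

def is_touching(obj, fingertip_world, max_dist=TOUCH_DISTANCE):
    # closest_point_on_mesh works in the object's local space,
    # so transform the fingertip position into it first.
    local_point = obj.matrix_world.inverted() @ Vector(fingertip_world)
    hit, location, normal, face_index = obj.closest_point_on_mesh(local_point)
    if not hit:
        return False
    # Measure the distance back in world space.
    world_hit = obj.matrix_world @ location
    return (Vector(fingertip_world) - world_hit).length <= max_dist

# Example: check every mesh object against a made-up index fingertip position.
fingertip = (0.0, 0.0, 1.0)
touched = [o for o in bpy.context.scene.objects
           if o.type == 'MESH' and is_touching(o, fingertip)]
```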

I was thinking you could issue commands by making gestures with the hands. A gesture would generally be defined by how your fingers relate to each other, for example the tip of the thumb against the tip of the index finger. Everyone should be able to define their own commands, deciding for example that thumb + index finger = Shift + right click.
Gestures could also be based on how you are holding your whole hand, like making a pointing pose.
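
In my head the gesture-to-command part could look something like this (only a sketch; the fingertip positions and command names are made up, the real ones would come from the mocap addon and the user's settings):

```python
# Recognise a simple "pinch" gesture and look up what the user mapped it to.
from mathutils import Vector

PINCH_THRESHOLD = 0.025  # thumb tip and index tip closer than this = "pinch"

# User-configurable table: gesture name -> what it should do.
GESTURE_COMMANDS = {
    "pinch_right_hand": "select_object",   # e.g. acts like a click
    "pinch_left_hand": "grab_object",
}

def detect_pinch(thumb_tip, index_tip):
    """True when the thumb tip and index fingertip touch each other."""
    return (Vector(thumb_tip) - Vector(index_tip)).length < PINCH_THRESHOLD

def gesture_for_hand(hand_name, thumb_tip, index_tip):
    if detect_pinch(thumb_tip, index_tip):
        return GESTURE_COMMANDS.get(f"pinch_{hand_name}")
    return None

# One frame of (fake) tracking data:
command = gesture_for_hand("right_hand", (0.10, 0.02, 0.30), (0.11, 0.02, 0.31))
print(command)  # -> "select_object"
```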

If you want to move something, the gesture could be a "grabbing" gesture. Although it should be simplified: instead of having to use all your fingers, it would be enough to grab with just a few. Users could also adapt this to their own behaviour: they "record" a gesture, type in the command it should trigger, and set a margin for how similar a detected gesture has to be to the recorded one.
You would be able to specify commands and gestures per hand, to get more command possibilities.
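
The "record a gesture, then match it within a margin" part could maybe work like this (just a sketch; a gesture is reduced to distances between fingertips, and the tolerance is the user's margin):

```python
from itertools import combinations
from mathutils import Vector

def gesture_features(fingertips):
    """fingertips: list of 5 (x, y, z) tuples, thumb to pinky.
    Returns the pairwise distances between all fingertips."""
    points = [Vector(p) for p in fingertips]
    return [(a - b).length for a, b in combinations(points, 2)]

def record_gesture(fingertips, command, tolerance):
    return {"features": gesture_features(fingertips),
            "command": command,
            "tolerance": tolerance}

def match_gesture(recorded, fingertips):
    current = gesture_features(fingertips)
    error = max(abs(a - b) for a, b in zip(recorded["features"], current))
    return recorded["command"] if error <= recorded["tolerance"] else None
```
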
Something I think should be possible (I'm not sure, since I don't code) is to specify a moving gesture, such as moving your index finger and thumb apart when you want to scale something up. If it's possible it would be a great addition to the already-just-theoretical hands (but I REALLY hope they become reality).
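
For the scale example, I imagine the spread between thumb and index finger could simply become the scale factor, roughly like this (a sketch; bpy.ops.transform.resize is a real operator, the tracking positions here are made up):

```python
import bpy
from mathutils import Vector

def pinch_distance(thumb_tip, index_tip):
    return (Vector(thumb_tip) - Vector(index_tip)).length

def apply_scale_gesture(start_thumb, start_index, current_thumb, current_index):
    start = pinch_distance(start_thumb, start_index)
    current = pinch_distance(current_thumb, current_index)
    if start < 1e-6:
        return
    factor = current / start  # fingers moved apart -> factor > 1 -> scale up
    bpy.ops.transform.resize(value=(factor, factor, factor))

# Example with made-up positions: the fingers have moved twice as far apart.
apply_scale_gesture((0, 0, 0), (0.03, 0, 0), (0, 0, 0), (0.06, 0, 0))
```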

Perhaps you would even be able to parent bones and objects to the finger bones; it would be like mini motion capture.
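
Blender can already parent an object to a single bone, so this part is mostly existing functionality (a sketch, assuming an armature named "HandRig" with a bone named "index_03_R"; both names are made up):

```python
import bpy

hand_rig = bpy.data.objects["HandRig"]
obj = bpy.data.objects["Cube"]

obj.parent = hand_rig
obj.parent_type = 'BONE'
obj.parent_bone = "index_03_R"  # the object now follows the fingertip bone
```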

You would be able to orbit, pan and zoom in the 3D viewport with the hands, using some specified gesture.
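
Maybe the viewport could be driven directly from the hand movement, something like this sketch (the RegionView3D properties are real, but how much to move per frame would have to come from the tracked hand delta):

```python
import bpy
from mathutils import Vector

def get_view_3d():
    """Find the RegionView3D of the first 3D viewport in the current screen."""
    for area in bpy.context.screen.areas:
        if area.type == 'VIEW_3D':
            return area.spaces.active.region_3d
    return None

def pan_view(hand_delta):
    """hand_delta: (x, y, z) movement of the hand since the last frame."""
    rv3d = get_view_3d()
    if rv3d:
        rv3d.view_location += Vector(hand_delta)

def zoom_view(pinch_change):
    """pinch_change > 0 zooms out, < 0 zooms in (the sign is just a design choice)."""
    rv3d = get_view_3d()
    if rv3d:
        rv3d.view_distance = max(0.1, rv3d.view_distance + pinch_change)
```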

Now, how would you edit objects? Well, instead of having to change each object's mode one at a time, which would be time consuming, you would change the hands' mode. If their mode is Edit Mode, every object switches to Edit Mode. This would also be controlled with some gesture.
You would specify gesture commands per mode; different modes would have different gesture commands.
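
The "hand mode" idea could look roughly like this (just a sketch; bpy.ops.object.mode_set is a real operator, the gesture names are placeholders):

```python
import bpy

# Which gesture -> command table is active depends on the current mode.
GESTURES_PER_MODE = {
    'OBJECT': {"pinch": "select", "grab": "move"},
    'EDIT':   {"pinch": "select_vertex", "grab": "move_vertex"},
    'SCULPT': {"point": "draw_stroke"},
}

def set_hand_mode(mode):
    """Switch the active object's mode, e.g. 'OBJECT', 'EDIT' or 'SCULPT'."""
    if bpy.context.object:
        bpy.ops.object.mode_set(mode=mode)

def command_for_gesture(gesture):
    mode = bpy.context.object.mode if bpy.context.object else 'OBJECT'
    return GESTURES_PER_MODE.get(mode, {}).get(gesture)
```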

Since there are so many hotkeys and shortcuts, it would be easy to control, for example, brushes in Sculpt Mode. If you have mapped a gesture to "C" you could change your brush to the Clay brush, and so on.
When sculpting you would drag your finger along the model, just like in real life!
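
The brush change could maybe be done like this (a sketch; assigning tool_settings.sculpt.brush should work in recent Blender versions, but treat that as an assumption to verify, and the gesture names are made up):

```python
import bpy

GESTURE_TO_BRUSH = {
    "c_pose": "Clay",   # the same thing the "C" hotkey does
    "fist": "Grab",
}

def set_sculpt_brush_from_gesture(gesture):
    brush_name = GESTURE_TO_BRUSH.get(gesture)
    if brush_name and brush_name in bpy.data.brushes:
        bpy.context.tool_settings.sculpt.brush = bpy.data.brushes[brush_name]
```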

Perhaps you could also move a 2D cursor, by pointing at the screen or something like that.
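
That could start out as simply as mapping the pointing fingertip onto the Blender window (just a sketch; what actually moves the cursor, the OS pointer or a drawn overlay, is left open):

```python
import bpy

def hand_to_screen(normalised_x, normalised_y):
    """normalised_x/y: 0.0-1.0 position of the pointing fingertip."""
    window = bpy.context.window
    return (int(normalised_x * window.width),
            int(normalised_y * window.height))

print(hand_to_screen(0.5, 0.5))  # centre of the Blender window
```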

I think those are my main ideas. I know it's a lot, but as I said they're just ideas, and it would be amazing if even something small came out of this.
Still, imagine the use of these hands, being able to control objects realistically.
I love it! :eyebrowlift::eyebrowlift::eyebrowlift: