Today I discovered that Anime Studio supports multi-touch gestures for rotating the view in 2D and 3D space, panning the view, transforming layers in 2D and 3D, and so on.
And I love it - on my Intuos 5 Pro it's a real time saver.
So my question: has anyone ever thought of taking advantage of multi-touch gestures in Blender, or done some work on supporting this? I am also thinking of the possibilities when combining this with virtual puppets and bones being driven by multiple fingers!
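Just to illustrate the puppet idea, here is a rough Python sketch of what "fingers driving bones" could look like. Note that Blender's Python API does not currently expose raw multi-touch points, so `get_touch_points()` below is a hypothetical stand-in for whatever the input backend would provide, and the armature and bone names are made up:

```python
import bpy

def get_touch_points():
    """Hypothetical: return a list of (x, y) finger positions in screen space."""
    return []

def drive_bones_from_touches(armature_name="Armature", bone_names=("hand.L", "hand.R")):
    """Map each active touch point onto one pose bone's location (assumed names)."""
    obj = bpy.data.objects.get(armature_name)
    if obj is None or obj.type != 'ARMATURE':
        return
    for (x, y), bone_name in zip(get_touch_points(), bone_names):
        pbone = obj.pose.bones.get(bone_name)
        if pbone is None:
            continue
        # Crude mapping from screen pixels to pose-space units; a real tool
        # would convert through the view matrix instead.
        pbone.location.x = x * 0.01
        pbone.location.z = y * 0.01
```

A real implementation would of course need proper multi-touch events in Blender's window manager, but the per-finger-to-bone mapping itself is simple.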
Seems like a no-brainer to me.
Demonstration: