I love alternate HW and have invested in too much of it that now sits on my shelf (although it makes a great dust collector).
I have both an Intuos 3 tablet and a SpaceNavigator.
I think it’s fairly obvious what kinds of improvements can be added to the tablet inputs, seeing that they are so popular anyway. I’ll wait until more has been done before attempting to improve it.
The SpaceNavigator (argghh…from now on I’m calling it the ‘Puck’) on the other hand provides a new (-ish) paradigm in input and I haven’t been happy with the implementation so far.
First of all, Blender (like most programs) is designed for two kinds of input events: mice and hot-keys. This is only natural considering the HW available.
The Puck however can stream 6 different analog values at once. How does one take advantage of that?
Using it in 2.49, I noticed that it was forced to emulate key-presses or mouse movement. The mouse is at most a 2-D device, and keys are binary. Really taking advantage of the Puck’s parallel input stream may require bypassing the normal input event stream, or registering a new kind of event. That might be asking for too much, but maybe not; that’s for the devs to determine.
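To make the idea concrete, here is a minimal sketch of what a dedicated 6-axis event might look like: one event carrying all six analog values at once, dispatched as a unit instead of being flattened into fake mouse moves or key-presses. Every name here (SixDofEvent, dispatch) is hypothetical; none of this reflects Blender’s actual event system.

```python
from dataclasses import dataclass

@dataclass
class SixDofEvent:
    # Three translation axes and three rotation axes, streamed together.
    tx: float; ty: float; tz: float
    rx: float; ry: float; rz: float

def dispatch(event):
    """Route a 6-DOF event as one unit, preserving the parallel axes."""
    if isinstance(event, SixDofEvent):
        return ("navigate",
                (event.tx, event.ty, event.tz),
                (event.rx, event.ry, event.rz))
    return ("other", event)

# A single Puck sample: a nudge forward-left plus a twist about Y.
print(dispatch(SixDofEvent(0.1, 0.0, -0.2, 0.0, 0.5, 0.0)))
```

The point of the sketch is simply that the six values travel together, so a handler can consume translation and rotation simultaneously rather than one emulated axis at a time.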
Given the limited range/precision of the Puck, I can’t see it being used for anything except transforms and navigation. It can’t replace the keyboard nor the mouse, so what is it really good for?
One reason the mouse is king of analog interfaces is that it maps position-to-position, which is the most intuitive and easy. Touch-screens are also pos-to-pos, hence their no-brainer usage.
The Puck however, with its limited range, is mapped position-to-velocity, which is much more squirrelly. Perhaps an attempt at position-to-position mapping might be made? It would be very sensitive, but there might be workarounds like low-pass (LP) filtering.
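The LP-filtering workaround could be as simple as an exponential moving average: blend each raw deflection sample into a running state before mapping it straight to position, which tames the sensitivity without reintroducing a velocity stage. This is just a sketch; the alpha value and axis count are illustrative assumptions.

```python
class LowPassFilter:
    def __init__(self, alpha=0.2, axes=6):
        self.alpha = alpha            # 0 < alpha <= 1; smaller = smoother
        self.state = [0.0] * axes     # one running value per Puck axis

    def filter(self, sample):
        """Blend each new raw axis value into the running state."""
        self.state = [self.alpha * s + (1.0 - self.alpha) * prev
                      for s, prev in zip(sample, self.state)]
        return self.state

# Feed a constant full deflection on one axis: the filtered value
# ramps up smoothly instead of jumping.
lpf = LowPassFilter(alpha=0.5, axes=1)
for raw in (1.0, 1.0, 1.0):
    smoothed = lpf.filter([raw])
print(smoothed)  # → [0.875]
```

A jittery hand twitch then shows up as a small, damped wobble rather than a full-range jump, which is what makes a direct position mapping survivable.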
I also found that as soon as I started to rotate an object, the rotation direction would not map to the direction of my hand rotation. I know this is a classic rotation-axes-order problem, but might there be a better way than the current standard mapping of Puck axes to 3D-view axes?
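One alternative mapping would be to treat the Puck’s twist as an axis in *view* space and transform that axis through the current view rotation before applying it, so hand rotation matches on-screen rotation no matter where the camera points. This is only a sketch of the idea; view_matrix here is a plain 3x3 row-major list standing in for the real view rotation.

```python
def view_axis_to_world(view_matrix, axis):
    """Multiply the 3x3 view rotation by the Puck's rotation axis."""
    return [sum(view_matrix[r][c] * axis[c] for c in range(3))
            for r in range(3)]

# With an identity view, a twist about view-Z stays a twist about
# world-Z; with a rotated view, the axis would be carried along with it.
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(view_axis_to_world(identity, [0.0, 0.0, 1.0]))  # → [0.0, 0.0, 1.0]
```

Rotating about the transformed axis (rather than fixed object or world axes) is what would keep the on-screen motion aligned with the hand, sidestepping the axes-order surprise.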
I also noticed the modal nature of the 2.49 Puck behavior. Once I started some kind of movement, it was stuck that way until I hit a mouse-button or hotkey to terminate it. I think it should act more like a driving wheel, which is always responsive, not modal. This would also allow two-fisted behavior like rotating an object while mouse-sculpting. Again, this might challenge the code-base, but let’s see!
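The "driving wheel" behavior amounts to folding whatever each device reports into the scene on every frame, with no mode to enter or exit. A toy sketch of that per-frame update, with both devices contributing at once (all the state names are hypothetical):

```python
def update(frame_state, puck_deltas, mouse_deltas):
    """Fold both devices' per-frame deltas into the scene state."""
    rot = [a + d for a, d in zip(frame_state["rotation"], puck_deltas)]
    pos = [a + d for a, d in zip(frame_state["cursor"], mouse_deltas)]
    return {"rotation": rot, "cursor": pos}

# One frame: the Puck twists the object while the mouse keeps sculpting.
state = {"rotation": [0.0, 0.0, 0.0], "cursor": [0.0, 0.0]}
state = update(state, [0.0, 0.1, 0.0], [2.0, -1.0])
print(state)
```

Because nothing latches, releasing the Puck simply means its deltas go to zero next frame, and the mouse never has to "terminate" anything.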
Actually, a more natural input device is the track-ball. Normally I hate trackballs for mouse functions, but imagine using one for position-to-position mapping of rotations only (non-modal). With natural movements of my palm I could rotate an object on the screen swiftly and surely. I’d still have the mouse in my other hand for sculpting or what-not. Blender would have to recognize which devices are controlling what, or it might conflict with how the OS behaves.
I’m going to stop now, catch my breath…