summer of fancy input devices

This summer I’m working to make graphics tablets work better in Blender, to let the SpaceNavigator 3D mouse shine, and to improve the overall experience for folks with more than a mouse and keyboard.

Given the great response to the gsoc sculpting thread (and a nudge from a forum member), I am officially soliciting input regarding… well… input! :eyebrowlift:

This is the place for your comments and suggestions. Your operating system, equipment setup, typical tasks and usage, and any rough edges or annoyances are especially welcome.

Thank you!

Mike Erwin
musician, naturalist, pixel pusher, hacker extraordinaire

Thanks for your hard work! I can’t wait until Blender plays a bit nicer with the Wacom. The whole UI is set up at this point for a mouse, so some UI design for tablets would be the bomb.
I hope you’ll post what you’re working on and the direction you’re leaning, so as to gather input here.

Awesome question,

I actually have both of these devices, and have tried to use a Space Navigator with a Wacom Cintiq.

I found that Blender was set up to work well with the keyboard, but the Space Navigator offered no advantage over just leaving my hand on the keyboard. In fact, by pressing Ctrl or Alt (can’t remember off the top of my head which it is) it was much easier to navigate with my pen while sculpting.

In the specific case of Sculpting/Texture Painting/Vertex Painting my preference would be this!

  • Side button on SpaceNavigator = Ctrl/Alt (whichever changes the pen to rotate the viewport)
  • SpaceNavigator up/down = brush size
  • SpaceNavigator rotate, or forwards/backwards, etc. = brush strength
  • Other side button = something like straight line

I’m not sure of the exact items that should be mapped to what, but if you look at how the Space Navigator works with either Sketchbook Pro or Photoshop, I think it controls a number of the brush properties.

In other situations, I’m not quite sure how to use the Space Navigator, as Blender is so very reliant on the keyboard for general modelling tasks. It seems to me to make the most sense in combination with the pen, for those three specific tasks.

One other off-the-wall thought might be to use the Space Navigator for animation, i.e. tie the inputs to facial rigs so that lip syncing could be done in real time, like a puppet.

Good luck with the GSOC.

I love alternate HW, and have invested in too much of it that now sits on my shelf (although they make great dust collectors :slight_smile: )

I have both an Intuos 3 tablet and a SpaceNavigator.
I think it’s fairly obvious what kinds of improvements can be added to the tablet inputs, seeing that they are so popular anyway. I’ll wait until more has been done before attempting to improve it.

The SpaceNavigator (argghh…from now on I’m calling it the ‘Puck’) on the other hand provides a new (-ish) paradigm in input and I haven’t been happy with the implementation so far.

First of all, Blender (like most programs) is designed for two kinds of input events: mice and hot-keys. This is only natural considering the HW available.

The Puck however can stream 6 different analog values at once. How does one take advantage of that?
Using it in 2.49, I noticed that it was forced to emulate key-presses or mouse movement. The mouse is at most a 2-D device; keys are binary. Really taking advantage of the Puck’s parallel input stream may require bypassing the normal input event stream, or registering a new kind of event? That might be asking too much, but maybe not; that’s for the dev to determine.
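To make the idea concrete, here is a minimal sketch of what a dedicated 6-axis event kind could look like, sitting alongside mouse and key events so a handler can read all six values at once. This is purely illustrative; the names are made up and are not Blender’s actual API.

```python
from dataclasses import dataclass

# Hypothetical 6-axis "NDOF" event: all axes delivered together, instead of
# being emulated as key-presses or 2-D mouse motion.
@dataclass
class NDOFEvent:
    tx: float  # translation axes, roughly -1.0 .. 1.0
    ty: float
    tz: float
    rx: float  # rotation axes
    ry: float
    rz: float

def dispatch(event, handlers):
    """Route an event to every handler registered for its type name."""
    for handler in handlers.get(type(event).__name__, []):
        handler(event)

# usage: a view handler that consumes two of the six axes in one go
log = []
handlers = {"NDOFEvent": [lambda e: log.append((e.tx, e.rz))]}
dispatch(NDOFEvent(0.2, 0.0, -0.1, 0.0, 0.0, 0.5), handlers)
```

The point of the sketch is just that the handler receives the whole axis bundle in one event, rather than six separate emulated key or mouse events.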

Given the limited range/precision of the Puck, I can’t see it being used for anything except transforms and navigation. It can’t replace the keyboard or the mouse, so what is it really good for?
One reason the mouse is king of analog interfaces is that it maps position-to-position, which is the most intuitive and easy. Touch-screens are also pos-to-pos, hence their no-brainer usage.
The Puck however, with its limited range, is mapped position-to-velocity, which is much more squirrelly. Perhaps an attempt at position-to-position mapping might be made? It would be very sensitive, but there might be workarounds like LP filtering.
I also found that as soon as I started to rotate an object, the rotation direction would not map to the direction of my hand rotation. I know this is a classic rotation-axes-order problem, but might there be a better way than the current standard mapping of puck axes to 3D-view axes?
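The LP-filtering workaround mentioned above can be sketched in a few lines: an exponential moving average tames a raw, twitchy position-to-position mapping. The `alpha` value here is a made-up tuning knob, not anything from Blender.

```python
# Low-pass filter sketch: smooth a jumpy raw input into a stable value.
class LowPass:
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # 0..1: higher = snappier, lower = smoother
        self.value = 0.0

    def update(self, raw):
        # blend a fraction of the new sample into the running value
        self.value += self.alpha * (raw - self.value)
        return self.value

lp = LowPass(alpha=0.5)
samples = [1.0, 1.0, 1.0, 1.0]   # device jumps straight to full deflection
smoothed = [lp.update(s) for s in samples]
# the filtered output ramps toward 1.0 instead of jumping there
```

With a low `alpha` the Puck’s over-sensitivity would be softened at the cost of a little lag, which is exactly the trade-off a position-to-position mapping would need.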

I also noticed the modal nature of the 2.49 puck behavior. Once I started some kind of movement, it was stuck that way until I hit a mouse-button or hotkey to terminate. I think it should act more like a driving wheel, which is always responsive, not modal. This also would allow two-fisted behavior like rotating an object while mouse-sculpting. Again, this might challenge the code-base, but let’s see!
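The “driving wheel” behavior described above could be sketched as a per-frame update that always reads the puck and applies a velocity, with a dead zone so the view stays put when the cap is released. All the constants here are invented for illustration.

```python
def apply_puck(view_angle, axis_value, dt, dead_zone=0.05, gain=90.0):
    """Non-modal sketch: called every frame, never 'latched' into a mode.
    axis_value is a raw -1..1 reading; inside the dead zone nothing moves,
    so releasing the puck stops the view immediately."""
    if abs(axis_value) < dead_zone:
        return view_angle                      # at rest: responsive but motionless
    return view_angle + gain * axis_value * dt  # gain is in degrees per second

angle = 0.0
for reading in (0.0, 0.5, 0.5, 0.01):  # last frame: puck released
    angle = apply_puck(angle, reading, dt=0.1)
# angle advanced only while the puck was deflected
```

Because nothing is modal, the same loop could run while the other hand sculpts with the mouse, which is the “two-fisted” behavior being asked for.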

Actually, a more natural input device is the trackball. Normally I hate trackballs for mouse functions, but imagine using one for position-to-position mapping of rotations only (non-modal). With natural movements of my palm I could rotate an object on the screen swiftly and surely. I’d still have the mouse in my other hand for sculpting or what-not. Blender would have to recognize which devices are controlling what, or it might conflict with how the OS behaves.

I’m going to stop now, catch my breath… :slight_smile:

Thanks, shadowphile, for your lengthy post. And for pushing me to start this thread!

Actually, there’s an “NDOF” event type already in place, waiting for live SpaceNav input. That’s a technical issue I plan to tackle early. But once the input is functioning, what do we do with it?

Given the limited range/precision of the Puck, I can’t see it being used for anything except transforms and navigation.

Yeah, this is true in general. But I’m especially interested in ideas about how the SpaceNav can be useful for specific tools or situations. Scrubbing the timeline for instance.

I also found that as soon as I started to rotate an object, the rotation direction would not map to the direction of my hand rotation. I know this is a classic rotation-axes-order problem, but might there be a better way than the current standard mapping of puck axes to 3D-view axes?

One of the 3DConnexion guys has written at great length about the proper use of the rotational data. Once that is absorbed and I’ve got a build ready, we can revisit this.

I also noticed the modal nature of the 2.49 puck behavior. Once I started some kind of movement, it was stuck that way until I hit a mouse-button or hotkey to terminate. I think it should act more like a driving wheel, which is always responsive, not modal. This also would allow two-fisted behavior like rotating an object while mouse-sculpting. Again, this might challenge the code-base, but let’s see!

That’s one of the things I’ll be working on later in the summer. Other folks have asked for the same “two-fisted” capability during sculpt. Just today I was considering how to extend this to keyboard navigation (numpad 2,4,6,8) without abruptly jerking the model around.

Thanks for your feedback.

  • Mike

Over 10 years ago I had a Logitech 6DOF device for gamers. Built like a tank. Apparently that branch was sold and became the foundation for what we have now, although the SpaceBall guys were also around, so I’m a little confused. Anyway, it looked and acted exactly like the SpaceNav, but nicer. (10 finger-tip buttons, wrist-rest, etc) I still have it, although useful drivers disappeared long ago. :frowning:

What I really miss about that device is its great config tool, which allowed it to be mapped to almost anything: joysticks, keystrokes, mice. With that I could configure it for almost any app, but especially games. The SpaceNav has nothing like that! No value added.

I guess I’m saying that a nice NDOF mapping facility in Blender would let anybody set up anything using the standard keymap panels. Ultimate flexibility! It might be a good starting point anyway. I’d probably lose a week just trying it out. :slight_smile:

My suggestion:

http://www.promoson.es/tienda/images/BCF2000-WH-1.jpg

Some people use an audio fader to control 3D animation, especially for blend shapes. There are also different kinds of faders: MIDI and USB. The new ones are USB, so I think that would be the best option. Also, the price is not as high-end as similar hardware that could cost a lot more. I think it’s a good option for animators.

SamCameron! You are awesome! That’s a great idea. I am also a digital musician, so I have a huge MIDI mixer board and a MIDI keyboard giving me 32 knobs, 24 sliders, 61 buttons, plus the actual musical keys, a modulation wheel, and a pitch wheel.

It would be ideal if Blender could have a function that let you select a Blender control, then move a fader or knob or press a button on the MIDI controller, and Blender would sense the MIDI information being sent and map the MIDI controller to the Blender function, like music apps do. My favorite application that does this very well is Reason. I’d be happy to do a demo video if you don’t know what I mean. I think you do, since your signature says, “musician, naturalist, pixel pusher, hacker extraordinaire”.

It would be ideal if Blender could have a function that let you select a Blender control, then move a fader or knob or press a button on the MIDI controller, and Blender would sense the MIDI information being sent and map the MIDI controller to the Blender function, like music apps do.

That’s a really cool idea, a quick way to “teach” blender how to set up custom controls. There’s about a 0.03% chance I’ll work with audio control surfaces this summer, but the same concept can be used to customize tablet buttons and touch strips.
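The “teach” workflow described above (arm a control, wiggle a knob, bind whatever sent the first message) can be sketched without any real MIDI library: here messages are plain `(channel, cc, value)` tuples, and all names are hypothetical rather than Blender’s or any MIDI library’s API.

```python
# Hypothetical "MIDI learn" sketch: arm a control, then bind it to whichever
# controller number sends the next message, like Reason-style MIDI learn.
class MidiLearn:
    def __init__(self):
        self.bindings = {}   # (channel, cc) -> control name
        self.armed = None    # control currently waiting to be taught

    def arm(self, control_name):
        self.armed = control_name

    def on_message(self, channel, cc, value):
        # first message after arming creates the binding
        if self.armed is not None:
            self.bindings[(channel, cc)] = self.armed
            self.armed = None
        target = self.bindings.get((channel, cc))
        if target:
            return target, value / 127.0  # normalize MIDI 0..127 to 0..1
        return None

learn = MidiLearn()
learn.arm("brush_size")
learn.on_message(0, 21, 64)            # first wiggle of CC 21 binds it
result = learn.on_message(0, 21, 127)  # later messages drive the control
```

The same arm-then-listen pattern would work for teaching tablet buttons and touch strips, since the learner only needs to see “some event arrived while armed”.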

True, true. GIMP has something like this too (and we have the benefit of seeing their source code), called dynamic keyboard shortcuts. When enabled, you can change the keyboard shortcut for a menu item by hitting a key combination while that item is highlighted.

To quote the movie, Dumb and Dumber, “So, you’re telling me there’s a chance!” Sweet :slight_smile:

And since the odds seem to be one in three thousand three hundred thirty-three and a third that you will do it, I’m looking into my own little solution. There’s about a 0.04% chance I’ll get it working well enough to be accepted into the trunk, though.

Check this out:

This is a plugin called XB111 for Cinema 4D; that’s exactly what I mean. More information at:

http://www.xb111.de/xb_fs.html

i don’t have any fancy input devices, just a ghetto-ass Medion tablet which doesn’t seem to read properly in Blender. Feel free to hit the low-end if you get the time :wink:

What are some good uses for pen tilt information? Does blender already respond to tilt in some place I don’t know about?

  • Modulate brush properties on the fly, like any good paint program.
  • What about adjusting surface normals as if the pen was a joystick?
  • (insert your thoughts here)

Tilt would be useful for angling the sculpting plane in brush mode.
Barrel rotation might also be useful to add (adjusting brush rotation).

Most important is fixing the ‘brush goes straight’ due to events being dropped, which you are working on already.

I’ve always wondered how well it would work for orbiting the 3D view (it should also be able to pan at the same time). Maybe even a helicopter-style fly mode/navigation where the pen tilt is used like a joystick.
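The “tilt as joystick” idea can be sketched as a simple mapping: the pen reports two tilt angles (degrees from vertical), which become a clamped steering pair for a fly or orbit mode. The ±60° range is a typical tablet spec, assumed here for illustration.

```python
# Sketch: convert pen tilt angles into a -1..1 steering vector, clamped,
# the way a fly-mode might consume them. max_tilt is an assumed spec value.
def tilt_to_steering(tilt_x_deg, tilt_y_deg, max_tilt=60.0):
    sx = max(-1.0, min(1.0, tilt_x_deg / max_tilt))
    sy = max(-1.0, min(1.0, tilt_y_deg / max_tilt))
    return sx, sy

# pen leaned 30 degrees toward the right, upright front-to-back
steer = tilt_to_steering(30.0, 0.0)
```

The same normalized pair could just as easily feed a brush dynamic instead of navigation, which keeps the tilt plumbing independent of what it drives.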

Python access to tilt and pressure values would be most welcome.

As far as I’m aware tilt is not used anywhere (yet :)).

You should contact Wray Bowling. Last year he did a fantastic presentation on digital puppetry using Blender and external devices; his tool of choice was a PS3 controller, because it had pressure sensitivity on every button, which worked great with shape keys. He did a bunch of external coding to get it working, and it would be sweet if Blender had easy access to begin with.

http://www.vimeo.com/5540681

MIDI events and joysticks/gamepads would be cool to custom-map to properties and values… just like animation drivers, or booleans for other puppetry stuff!

I have a Space Navigator, but it’s a bit broken (not centered correctly, and it drops events on the positive X axis or something).

For tablet tilt, it’d be great in paint mode… just hook it up as another dynamic, like pressure!

The biggest help for tablet users, though, would be a button shelf like Maya’s and custom pie menus… that’d make one hand on the tablet, one on the Space Navigator much more viable!

Maybe more gesture-based stuff like we used to have in 2.49… though sometimes it feels like I’m the only one who liked using them, and most found them a PITA.

I think gestures were great, but they should be customizable, and perhaps even stand as a separate key map,
or just a tickable option (to enable/disable) in the Input tab of Preferences. :slight_smile:

that would be awesome

Have you had a look at the newly announced Asus Eee Tablet/eReader? Asus has promised a great price point (US$199). It looks like it could function as a cheap greyscale Wacom Cintiq; the Cintiq 12WX is priced at $999.