There is a new technology coming out later this year, around Christmas, called the Emotiv EPOC. What it promises is the first direct brain-to-computer interface: your thoughts (well, some of your thoughts) will be manifested on screen. It is shipping as a game controller with three different thought-detection systems: cognitiv, expressive, and affective. The cognitiv suite detects 12 different motions, 6 directions and 6 rotations, plus a disappear visualization. The expressive suite detects facial expressions and seems to be pretty extensive, ranging from horizontal eye position and eyebrow position to smiling, laughing, and clenching. The third suite, affective, detects emotions like boredom, excitement, and frustration. It does all this based on EEG (electroencephalography) technology. It can also detect head movement with gyros. If you want to know more, there are some cool promo vids at http://emotiv.com/
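To make the three suites a bit more concrete, here is a rough sketch of how their detections might be modeled in code. All the names and value ranges below are my own illustrative assumptions, not the actual Emotiv SDK API:

```python
from dataclasses import dataclass
from enum import Enum

class CognitivAction(Enum):
    # 6 directions + 6 rotations + the disappear visualization = 13 actions
    PUSH = 1; PULL = 2; LIFT = 3; DROP = 4; LEFT = 5; RIGHT = 6
    ROTATE_LEFT = 7; ROTATE_RIGHT = 8; ROTATE_CW = 9; ROTATE_CCW = 10
    ROTATE_FORWARD = 11; ROTATE_BACKWARD = 12
    DISAPPEAR = 13

@dataclass
class ExpressiveState:
    # facial-expression detections mentioned in the post
    eye_position: float   # horizontal eye position, -1.0 (left) .. 1.0 (right)
    eyebrow_raise: float  # 0.0 .. 1.0
    smile: float
    laugh: float
    clench: float

@dataclass
class AffectiveState:
    # emotional measures mentioned in the post
    boredom: float
    excitement: float
    frustration: float
```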
While this is all pretty cool stuff, the best part is that Emotiv has downloadable SDKs for all the suites on its website, and the company has expressed interest in markets beyond gaming. I would love to see some of the awesome coders we have get their hands on those SDKs and the system and create a brain-to-Blender interface. With 6 directions and 6 rotations, I think it could one day be a cool and handy asset to Blender, and Blender is in a unique position to take advantage of the system because it is modifiable by the community. I would like to see this in Blender before any other 3D package!
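As a thought experiment, here is a minimal Python sketch of what the core of such a bridge might look like: mapping the cognitiv suite's 6 directions and 6 rotations onto translation and rotation deltas for a 3D object. The action names, step sizes, and everything else here are hypothetical stand-ins, not the real SDK; inside Blender the deltas would be applied through the bpy API (e.g. `obj.location` / `obj.rotation_euler`):

```python
# Hypothetical sketch: none of these names come from the real Emotiv SDK.

STEP = 0.1   # translation per detected thought, in Blender units (assumed)
ANGLE = 5.0  # rotation per detected thought, in degrees (assumed)

ACTION_DELTAS = {
    # action: (dx, dy, dz, rx, ry, rz)
    "push":            (0, STEP, 0, 0, 0, 0),
    "pull":            (0, -STEP, 0, 0, 0, 0),
    "left":            (-STEP, 0, 0, 0, 0, 0),
    "right":           (STEP, 0, 0, 0, 0, 0),
    "lift":            (0, 0, STEP, 0, 0, 0),
    "drop":            (0, 0, -STEP, 0, 0, 0),
    "rotate_left":     (0, 0, 0, 0, 0, ANGLE),
    "rotate_right":    (0, 0, 0, 0, 0, -ANGLE),
    "rotate_forward":  (0, 0, 0, ANGLE, 0, 0),
    "rotate_backward": (0, 0, 0, -ANGLE, 0, 0),
    "rotate_cw":       (0, 0, 0, 0, ANGLE, 0),
    "rotate_ccw":      (0, 0, 0, 0, -ANGLE, 0),
}

def apply_action(location, rotation, action):
    """Return new (location, rotation) tuples after one detected thought.
    In Blender this would update obj.location / obj.rotation_euler."""
    if action not in ACTION_DELTAS:  # e.g. "disappear", or detector noise
        return location, rotation
    dx, dy, dz, rx, ry, rz = ACTION_DELTAS[action]
    loc = (location[0] + dx, location[1] + dy, location[2] + dz)
    rot = (rotation[0] + rx, rotation[1] + ry, rotation[2] + rz)
    return loc, rot
```

In Blender, a modal operator or timer could poll the headset and feed each detected action through `apply_action`, nudging the selected object a little per thought.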