From Brain 2 Blender

There is a new technology coming out later this year, around Christmas, called the Emotiv Epoch. What it promises is the first direct brain-to-computer interface: your thoughts (some of your thoughts) will be manifested on screen. It is shipping as a game controller with three different thought-detection suites: Cognitiv, Expressiv, and Affectiv. The Cognitiv suite detects 12 different motions, 6 directions and 6 rotations, plus a "disappear" visualization. The Expressiv suite detects facial expressions and seems to be pretty extensive, covering horizontal eye position, eyebrow position, smiling, laughing, and clenching. The third suite, Affectiv, detects emotions like boredom, excitement, and frustration. It does all this based on EEG (electroencephalography) technology, and it can also detect head movement with gyros. If you want to know more, there are some cool promo vids at http://emotiv.com/

While this is all pretty cool stuff, the best part is that Emotiv has downloadable SDKs for all the suites on its website, and they have expressed interest in markets besides gaming. I would love to see some of the awesome coders we have get their hands on those SDKs and the system and create a brain-to-Blender interface. With 6 directions and 6 rotations, I think it could one day be a cool and handy asset, and Blender is in a unique position to take advantage of the system because it is modifiable by the community. I would like to see this in Blender before any other 3D package!
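Just to make the idea concrete, here is a rough Blender-Python sketch of how the twelve Cognitiv actions could drive an object in the viewport. The action names and the `power` value are pure guesses at what the SDK might report (I haven't touched it yet); only the `bpy` calls are real.

```python
import math
import bpy

# Hypothetical labels for the 12 Cognitiv actions (6 moves, 6 rotations).
# The real SDK's names and output format are unknown to me.
MOVES = {
    "push": (0, 1, 0),  "pull": (0, -1, 0),
    "left": (-1, 0, 0), "right": (1, 0, 0),
    "lift": (0, 0, 1),  "drop": (0, 0, -1),
}
SPINS = {  # rotation axis and step in degrees
    "yaw_left": ('Z', 15),  "yaw_right": ('Z', -15),
    "pitch_up": ('X', 15),  "pitch_down": ('X', -15),
    "roll_left": ('Y', 15), "roll_right": ('Y', -15),
}

def apply_thought(action, power=1.0):
    """Nudge the selected object by one detected mental action.

    `power` stands in for whatever detection strength the headset reports.
    """
    if action in MOVES:
        bpy.ops.transform.translate(
            value=tuple(c * power for c in MOVES[action]))
    elif action in SPINS:
        axis, step = SPINS[action]
        bpy.ops.transform.rotate(value=math.radians(step * power),
                                 orient_axis=axis)
```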

That’s so cool!

I read that it doesn’t yet have the responsiveness to replace a game controller for fast action, but it’s a lot more than a gimmick and would bring life to 3D avatars.

It can read your expressions and animate a virtual avatar’s face based upon that. I get dibs on the green, blue, and yellow Dragon avatar.
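If the headset really does stream expression weights, wiring them to a face rig would be almost trivial in Blender. A hedged sketch: the `expressions` dict format and the mesh name are my inventions, while the shape-key calls are standard `bpy`.

```python
import bpy

def drive_avatar_face(expressions, mesh_name="DragonHead"):
    """Copy detected expression weights onto a mesh's shape keys.

    `expressions` is an assumed format, e.g. {"smile": 0.8, "clench": 0.1};
    nobody outside Emotiv knows what the suite actually outputs yet.
    The mesh needs shape keys named to match each expression.
    """
    shape_keys = bpy.data.objects[mesh_name].data.shape_keys
    for name, weight in expressions.items():
        block = shape_keys.key_blocks.get(name)
        if block is not None:
            block.value = max(0.0, min(1.0, weight))  # clamp to valid range
```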

Responsiveness does seem to be the main issue. Apparently it is a bit hard to use and get used to. Most EEGs need to be attached pretty closely to the scalp to get a good reading, whereas the Emotiv Epoch can be worn even over your hair; the extra distance will obviously add some error. This is the first version of its kind, so I’d give it a while before it’s sensitive enough to be an efficient tool.

If it could do facial capture it’d be a godsend.

I was looking at this and thinking about buying it (not sure if I will). It comes with a program called EmoKey that lets you map a certain thought to a keyboard key. They haven’t got mouse control down yet, so it wouldn’t be very good for Blender at the moment; I don’t know any way to use it without a mouse.
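EmoKey only emitting keystrokes isn’t a dead end, though: Blender lets you bind any operator to any key, so you could dedicate a few spare keys to "thoughts". A sketch, assuming nothing about EmoKey beyond what’s described above (the choice of F5 is arbitrary; pick any free key):

```python
import bpy

class OBJECT_OT_thought_grab(bpy.types.Operator):
    """Start a grab when EmoKey 'presses' the mapped key."""
    bl_idname = "object.thought_grab"
    bl_label = "Thought Grab"

    def execute(self, context):
        # Same as pressing G: starts moving the selection interactively.
        bpy.ops.transform.translate('INVOKE_DEFAULT')
        return {'FINISHED'}

addon_keymaps = []

def register():
    bpy.utils.register_class(OBJECT_OT_thought_grab)
    wm = bpy.context.window_manager
    km = wm.keyconfigs.addon.keymaps.new(name='3D View', space_type='VIEW_3D')
    kmi = km.keymap_items.new(OBJECT_OT_thought_grab.bl_idname, 'F5', 'PRESS')
    addon_keymaps.append((km, kmi))
```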

BTW Irowebot, it’s spelled EPOC, not Epoch.

CoD4 anyone?

If I’ve read correctly, it can read facial expressions (or perhaps that’s a separate product they sell).

Yeah, this technology would be great in the future when it runs as fast as a person can think. Just imagine being able to make an entire scene in seconds (assuming your computer is fast enough). Of course, that’d take a lot of the skill out of it…

-Funddevi

Do you think it could be used for art? As in, thinking of a brush stroke, and it’s reflected on the screen? That’d be really kick ass, and it would GUARANTEE that I’d buy one.

So, what if you leave it on while you go to sleep? Will your computer, like, take over the world or something?

Please, let’s try to stay in this century. How many people would buy that thing? Is anyone going to spend a triple-digit figure on a piece of gimmickware that you need to wear on your head? What do you expect it to do, import our thoughts as Blender meshes? Studies show our thoughts are made of NURBS (barely supported in Blender), not meshes, and they’re all in a proprietary format owned by Adobe. Sure, it might work for panning and zooming, or maybe saving your work when you fall asleep… but aside from that it’s a total waste.

fndvi posted while I was still writing:

Heck no! You’d need a gazillion sensors to extract that kind of stuff, and besides, human thoughts are not detailed enough in the first place. They flicker and morph too fast for a whirring beige box to get a hold on them. Brains just don’t work that way.

Plus that would be a little creepy.

Quoted for truth.

Think about your thoughts for a minute. Say you’re visualizing a vehicle - a car, for example. You have the very basic form outlined in your mind. But is it really there? There are just blotches of thought, no real tangible form; it changes too much. Then, when you really look deeper, there are details - even simple ones like the creases between pieces of a car. Are you really visualizing those in your mind while you’re also thinking about the shape?

It just doesn’t work like that - plain and simple. The brain is far too complex for a computer to understand or read, especially based on simple signals such as facial expression or some hocus-pocus definition of mood. We can’t even understand our own brains. How could a computer do better?

ROFL!!!

Please note I said in the future; it may very well be our grandchildren’s children who have this technology. I also don’t mean you just think of something and it suddenly appears; I mean you think extrude, extrude, extrude, scale, scale, Ctrl+R, scale, and it works as you think it. Also, if you could focus your thoughts correctly, it should be doable, maybe not to the degree I described above, but it would work.
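To put that in concrete terms, a recognizer that only had to tell apart a handful of commands could feed something like this. The thought labels are invented; the operators are real `bpy` edit-mode calls (run with a mesh in Edit Mode and geometry selected):

```python
import bpy

COMMANDS = {
    "extrude": lambda: bpy.ops.mesh.extrude_region_move(
        TRANSFORM_OT_translate={"value": (0.0, 0.0, 0.5)}),
    "scale": lambda: bpy.ops.transform.resize(value=(1.2, 1.2, 1.2)),
    # Standing in for Ctrl+R: a real loop cut needs mouse input to pick
    # the edge ring, so subdivide is the closest scriptable equivalent.
    "ctrl+r": lambda: bpy.ops.mesh.subdivide(number_cuts=1),
}

def run_thought_stream(stream):
    """Replay a sequence of recognized 'thoughts' as modeling commands."""
    for thought in stream:
        command = COMMANDS.get(thought)
        if command:
            command()

# The exact sequence from the post above:
run_thought_stream(["extrude", "extrude", "extrude",
                    "scale", "scale", "ctrl+r", "scale"])
```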

Don’t wanna start any fights here, just wanting to argue a bit :slight_smile:

And please don’t take the vowels out of my name XD

-Funddevi

Also, a cool note: even dreams aren’t that detailed. Ever tried reading a book in a dream? Do you remember any individual words? Chances are what you were looking at was a blurry grey stripe. :stuck_out_tongue: In fact, that’s how some people induce lucid dreaming: they get into the habit of checking written text all the time, so the habit carries over into their dreams, and if the text they’re looking at flickers or changes when they take a good stare at it, they know they’re dreaming and can control the dream at will.

Yeah, in its current state it’d be a complement (like a SpaceBall or a Nostromo), not a replacement for your mouse/Wacom.

Curious that a design going back to 1963 (the mouse) hasn’t found a suitable replacement yet. I use my Wacom for a lot of stuff, but I still prefer a good mouse when it comes to fraggin’

I had dreams where I was reading comic strips like Blondie, and they came complete with illustrations and word bubbles containing actual words.

Maybe my dreams are unusually detailed :spin:

By that point computers could be smart enough to model things for you.

No fights, just arguments? What do you mean, uei?

Ya see, fighting is the actual punching; arguments stop right before the fighting begins :yes:

Yeah, you may be right about the computer thing, but we don’t know. Computers may have taken over the world by then, who knows. Maybe the world blows up before then, we don’t know. Perhaps computers will be the same as they are now, but 300 million times more powerful, in which case my point would be valid, but we still don’t know :slight_smile:

-Funddevi