So… all this talk about connecting the Wiimote to your PC got me thinking that it would be neat if you could build your own motion capture suit out of them, similar to (but worse and much cheaper than) this:
And integrate it into blender?
Six Wiimotes should be enough, given that you set up a clever IK rig first (2x lower arm, 2x lower leg, 1x body, 1x head), and they cost about 30-40 euros each = motion capturing for 200 euros
There is already a GPLed library that reads all the Wiimote input through a standard Bluetooth device: http://wiiuse.net/
Bump.
Actually it would be even cheaper, since the Nunchuk also has an accelerometer built in, and you can get a Wiimote plus Nunchuk for as little as 25 euros from Hong Kong.
Motion capture?
Why not try Eyesweb? http://www.infomus.dist.unige.it/EywMain.html
It’s free, but you need a webcam.
A while back my dad had a game project where they used EyesWeb to capture motion, which allowed you to steer the character in the game. http://www.speech.kth.se/music/projects/Ghostgame/
There it is
Maybe you can ask him about how to do it…
I think it’s possible to set up like 2 or 3 cameras so you get capture from all directions.
Hmm, I have to check it out in more detail… but isn’t that more like the PlayStation 2 “EyeToy” software, which analyzes video footage for specific motions?
What I am trying to achieve is motion capture to animate a rig for a character.
Yes it’s like eyetoy, but it can be used to steer a character.
A rig? You mean like transfer it directly to your character?
I think the Wiimote thing is a good idea, but is it able to, like, “figure out” where you are with only Bluetooth? I think the Wiimote only registers movements, not where it actually is at the moment (when only Bluetooth is used).
Yes I would like to use it to create the animations directly, as done with those motion capture suits I linked above.
About the positional information:
By measuring how long the acceleration is applied in a certain direction (and integrating it over time twice: acceleration to velocity, velocity to position), you can work out the position relative to a predefined calibration pose, and from that the actual position.
But this is only the theory.
In reality you will probably have to fight quite a bit of sensor noise.
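To make that concrete, here is a rough, untested Python sketch of the double integration. Everything in it is made up for illustration, and it assumes you already get gravity-free acceleration samples at a fixed rate:

def integrate_position(samples, dt):
    # samples: list of (ax, ay, az) tuples in m/s^2, gravity already removed
    # dt: time between samples in seconds
    # returns the estimated displacement from the starting (calibration) pose
    velocity = [0.0, 0.0, 0.0]
    position = [0.0, 0.0, 0.0]
    for sample in samples:
        for i in range(3):
            velocity[i] += sample[i] * dt      # acceleration -> velocity
            position[i] += velocity[i] * dt    # velocity -> position
    return position

# 1 s of constant 1 m/s^2 along x at 100 Hz should give roughly
# 0.5 m of displacement (x = a*t^2/2):
print(integrate_position([(1.0, 0.0, 0.0)] * 100, 0.01))

The catch is exactly the noise problem: any constant error in the samples gets integrated twice, so the position estimate drifts quadratically with time.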
Edit: I started another discussion over at Moddb:
But thus far the general opinion seems to be that it is not worth trying, since the signal-to-noise ratio will be too poor.
Aha, okay.
Doesn’t sound that good.
But I can only recommend that you try EyesWeb; I mean, it’s free.
If you buy a bunch of Wiimotes and it doesn’t even work properly, that’s another matter…
Otherwise I don’t have much more to offer.
EDIT: WOW, I just read in the other forum: $500 for just one camera is… quite a lot.
Well, for starters a single Wiimote would be enough to test the movement of an arm.
And maybe there is a coder interested in this who has a wii already?
Ah, I don’t think the Wiimote is good enough for that (I’ve hooked one up to a computer before). The accelerometers aren’t good enough to give a semi-accurate position for very long. Nintendo knew this, so whenever you see laser-pointer-style stuff on screen, Nintendo is using the two infrared points on the sensor bar and the camera in the end of the Wiimote to triangulate its position much more accurately. Not to mention that the Wiimote has no way of telling yaw other than via infrared.

But I have thought about how I could use a Wiimote to do stuff with Blender, and my thought is this: “Don’t move the Wiimote, move the sensor bar.” That way you could have a camera recording a live video feed with the Wiimote almost on top of it, while you stand in front of the camera swinging a sword with one infrared LED at the base of the blade and one at the tip. All the Wiimote sees is the two points; the regular camera sees everything. This data gets sent to Blender, where the two feeds are lined up to match each other. Blender takes the two points and draws a line across them, which gets turned into a lightsaber blade using the glow plugin in the VSE, and this blade gets pasted over the video feed.
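To give an idea of what I mean, here is a rough, untested Python sketch of the “draw a line through the two dots” step, done with PIL rather than inside Blender. It assumes the Wiimote reports the dots on a 1024x768 grid (the range usually quoted for its IR camera); the coordinates and filenames are made up:

from PIL import Image, ImageDraw

IR_W, IR_H = 1024, 768     # assumed Wiimote IR coordinate range
VID_W, VID_H = 640, 480    # resolution of the live video feed

def ir_to_video(x, y):
    # map a Wiimote IR coordinate into video-frame pixels
    return (x * VID_W / IR_W, y * VID_H / IR_H)

def blade_mask(p1, p2, width=8):
    # black frame with a white line through the two LED positions;
    # this is what the glow plugin would get before the AlphaOver
    img = Image.new("RGB", (VID_W, VID_H), "black")
    draw = ImageDraw.Draw(img)
    draw.line([ir_to_video(*p1), ir_to_video(*p2)], fill="white", width=width)
    return img

blade_mask((400, 300), (700, 150)).save("blade_0001.png")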
Edit: I haven’t tried this yet because I don’t have a computer running Blender at the moment.
Edit 2: The signal-to-noise ratio shouldn’t be a problem, but keep in mind that an accelerometer is just a rotation sensor with a weight hanging from it, so it will have errors because the weight continues to swing after the motion stops, and the correction systems for this can make the data seem jerky without smoothing.
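For the smoothing, something as simple as an exponential moving average might do. A minimal, untested sketch (the 0.2 factor is an arbitrary choice; smaller means smoother but laggier):

def smooth(samples, alpha=0.2):
    # samples: raw (ax, ay, az) tuples; returns smoothed copies
    out = []
    state = None
    for s in samples:
        if state is None:
            state = list(s)
        else:
            state = [alpha * new + (1 - alpha) * old
                     for new, old in zip(s, state)]
        out.append(tuple(state))
    return out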
I find this an interesting thread. I’m off to try a few things; I’m going to see whether the Wiimote thing really won’t work. What if you had two Wiimotes looking at you (you’re wearing a cheap infrared suit, i.e. a suit with infrared LEDs emitting a constant beam) from two different angles, and the model is lining up its vertices to the same points from the same angles? I don’t think I explained that very well, but keep it in mind. I WILL BE BACK!
OK, I found out that what I was talking about with the lightsaber not long ago is possible, and I would say probable. I’m not sure about the suit thing, but I think that will work too. The only catch is that the Wiimote can only track 4 dots at a time, so the suit might have to combine more than one way of motion tracking (such as putting green dots on a person and then green-screening them to make points, since it doesn’t matter what the person looks like for the character-animation part).

Anyway, back to the lightsaber. First of all, this is possible but not easy in real time, so I won’t try it until it gets easier or I know Python, which is what I think it takes. I intend to draw the lightsaber blade over the sword with the LEDs, so you never see the LEDs or the sword itself. To do this I use the Video Sequence Editor with this plugin. The plugin takes an image, adds a glow effect to it, and the result can be AlphaOvered onto another video. You can see an image of this here. So I need to get the image of the two dots from the Wiimote and feed a line through them… well, there’s a small problem there. At WiiLi.org I found this under the infrared part of the Wiimote specs:
IR Sensor
During R&D, Nintendo discovered the motion sensors were not accurate enough to use the remote to control an on-screen cursor. To correct this, they augmented the remote with an infrared image sensor on the front designed to locate two IR beacons within the controller’s field of view. The beacons are housed within a device misleadingly called the Sensor Bar.
These two sources of IR light are tracked by a PixArt sensor in the front of the Wiimote housing. By tracking the locations of these two points in the sensor’s 2D field of view, the system can derive more accurate pointing information. Not much is known about this feature yet, but circumstantial evidence from the Nintendo/PixArt press release suggests that Nintendo is using a PixArt System-on-a-Chip to process the images on board the Wiimote and send only the minimum information needed for tracking back to the base unit. Transmitting full 2D images constantly would require a prohibitive amount of bandwidth, especially when multiple remotes are in use.
So that means no image… but I can make one… from the raw data. Here I found this script for GlovePIE, the Windows Wiimote “driver”. It says this:
Nunchuk data: gx, gy, gz, RelAccX, RelAccY, RelAccZ, Pitch, Roll
Controls:
To begin recording data click the Wiimote’s A button
Click the Wiimote’s A button to pause/resume recording data to file
Use the Wiimote’s D-pad to change values displayed in the debug window
Hold down the Wiimote’s Home button for 1 second to exit the script
…
Using the Data:
This script outputs a comma-separated values (.csv) file to output.txt under the GlovePIE directory
After acquiring your data close GlovePIE
Rename the output.txt file to whatever.csv
Open the file with your favourite spreadsheet program (Microsoft Excel/OpenOffice.org Calc etc.)
Create graphs & analyze your data
Make new scripts, containing motion controls, for your favourite games or applications
I can use OpenOffice.org to graph the x and y data and make lines, which can be exported as pictures. Those pictures will then be used in Blender, but might take a trip through VirtualDub to be made into a video file. I haven’t tested any of this yet because I don’t have a computer with Blender running on it. Right now I only have one computer, and I can only sometimes get on it. On the bright side it is a fast computer, and I can work fast when I get the chance. I’m not sure how to do this on Linux, because I’ve only been able to do Bluetooth stuff on Windows, so sorry to any Linux-only users who might have wanted to try this.
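The graphing step could also be scripted instead of done by hand. A rough, untested Python sketch, assuming output.txt really is plain CSV with the eight columns quoted above (gx, gy, gz, RelAccX, RelAccY, RelAccZ, Pitch, Roll) and no header row:

import csv
import matplotlib.pyplot as plt

rows = []
with open("output.txt") as f:
    for row in csv.reader(f):
        if len(row) == 8:                      # skip blank or partial lines
            rows.append([float(v) for v in row])

plt.plot([r[3] for r in rows], label="RelAccX")
plt.plot([r[4] for r in rows], label="RelAccY")
plt.legend()
plt.savefig("motion_graph.png")                # an image Blender's VSE can load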
That was a lot of typing for one thread that I don’t host.
Sounds interesting, but I would imagine that the wii infrared “camera” is pretty low in resolution, right?
And what would be the benefit compared to a regular camera (with greenscreening etc)?
For most cases that would be true; you would get the same thing or better from the green-screen process. But in a lightsaber movie you have lighting issues, because the color the blade comes out as might be off, as in either too dark or something like that. With a normal green screen you can shine a light down on the screen to make sure the actor comes out with good contrast against it, and this doesn’t affect the color of the actor. You could do the same with a lightsaber by buying one of those light-up ones, but then you start to run out of options for homemade lightsabers, and that would be an issue for me. Another possible issue is the color of the environment: if another object the same color as the blade crosses the eye line of the blade, you’re going to have a LOT of fun fixing it. The Wiimote, on the other hand, blocks out everything but the two points on the blade.

Also, there is no real way of knowing what resolution the Wiimote output would have, because the data comes as coordinates of the two points on an X & Y plane (point1 = [X=4 Y=6], point2 = [X=-2 Y=5]… only more like: 4,6,-2,5…). I don’t think the infrared sensor is very low-res either, because it has no problem telling where the points are, down to almost laser accuracy. I think it works like the touch screen on a DS, only it can track up to four points (when many points are pressed on the touch screen, it finds the center of them all and makes that the point that shows up on screen). Because it is only data, OpenOffice.org decides what resolution it comes out at.

So overall, the benefit is that it crowds out everything but the blade, so you don’t have to fix anything. It just has fewer variables.
Interesting, I wonder what sort of accuracy you could get out of a system like this. Here’s another possibility: what about using two Wii remotes to get depth information (think of the stereo camera system installed in your head)? I have this mental image of posing a character by reaching out into thin air and pushing his arms and legs around.
Well, the best answer so far is just to cut and paste existing mocaps to get what you want. Once you do it for a while, you get up to speed… I should be there in like 10 years.
The idea behind the program I wrote was to determine the depth of an IR source using the basic principles of binocular vision. The method I used isn’t accurate; the scales of X, Y and Z aren’t really related. The reason is simply that I wanted to get the thing built as fast as I could. If you really wanted to do it with just 2 Wiimotes, there is a better way.
Read the data from Wiimote 1. Translate the data into vectors passing through the position in space of the first Wiimote. You would have as many vectors as there are IR sources. It’s hard to explain how to do this without having seen how the Wiimote outputs its data, but you basically break each reading down into its vector components… it would be easy to do.
Repeat the same procedure for Wiimote 2. You now have two sets of vectors. You can find the smallest distance between two lines in space using the cross product plus some algebra, which is also not very hard. You’d take the cross product of every pairing of your two sets of vectors and keep the smallest distances for each of the IR sources (if you had 4 IR sources, you’d check 16 pairings and keep the 4 with the smallest distances). Using these you can approximate (fairly accurately) the position of each IR source. What you have to watch out for is occlusion: if you block one of the IR sources from one of the Wiimotes, you will make them unhappy.
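For anyone who wants to try it, here is a small untested Python sketch of that closest-point calculation. The Wiimote positions and direction vectors below are made up; converting the real IR coordinates into direction vectors is the part you’d have to fill in yourself:

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate(p1, d1, p2, d2):
    # rays p1 + t*d1 and p2 + s*d2; returns the midpoint of the two
    # closest points and the gap between them (small gap = good match,
    # which is how you'd pick the right 4 out of the 16 candidate pairs)
    r = tuple(b - a for a, b in zip(p1, p2))
    n = cross(d1, d2)
    nn = dot(n, n)                 # near zero means the rays are parallel
    t = dot(cross(r, d2), n) / nn
    s = dot(cross(r, d1), n) / nn
    c1 = tuple(p + t*d for p, d in zip(p1, d1))
    c2 = tuple(p + s*d for p, d in zip(p2, d2))
    mid = tuple((a + b) / 2 for a, b in zip(c1, c2))
    diff = tuple(a - b for a, b in zip(c1, c2))
    return mid, dot(diff, diff) ** 0.5

# Wiimotes a meter apart, both seeing a source at about (0.5, 0, 2);
# because the positions are given in meters, the output is in meters too:
print(triangulate((0, 0, 0), (0.5, 0, 2), (1, 0, 0), (-0.5, 0, 2)))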
I’m working to improve my program at the moment, but my aim is a game, not motion capture. I’m sure that someone so inclined could build something for around 200 dollars (2 Wiimotes = 100, Bluetooth adapter = 50, and 50 dollars for IR LEDs, soldering equipment and hardware). The data it outputs would also be to scale with the real world: if you use the actual distance between the Wiimotes when you set the positions of your original vectors, all the output data will be to the same scale. The easiest thing to do would be to set the Wiimotes a meter apart in reality and 1 unit apart in the program; then everything comes out in meters.