Razer Hydra + Blender = Sword Control demo

I just got a Razer Hydra, and started learning to code Python six days ago. I made a spaceship-vs-spaceship game that isn’t worth showing yet, and this tech demo today.

It’s just a test of the control scheme with some physics… but it’s a step toward having a character runnin’ around with a shield and a sword, dual-wielding guns, bows, fists, spears, grenades, or whatever else you’d care to launch at an enemy.

With the proper scaling, I’ll actually be able to have a 1:1 relationship between my arm movement and what I see on screen, making for a rather immersive experience where you wouldn’t always need to see yourself on screen, because you become your character. At that point, you could shoot enemies behind yourself using a rear-view mirror or surround sound.

Anyhow, I’m going to try to make a few very little games - between 3 and 5 - that allow for super fun motion control.

It’s super new and exciting and I’d love to hear your ideas for what we could do with motion control gaming.

Wow, the Razer Hydra is amazing, and your work too!
A game like sword fighting or boxing would be fun.

Yeah, this is really quite nice. What are you using to interface with the controller from the BGE?

It would be great to see if you can use the hydra to properly control my ragdoll setup :smiley:

I really find it interesting to see all the development on motion control (Wii, Kinect, PS Move, Razer Hydra, what’s next?) and hope to eventually implement something myself.

So after 6 days you can do this?! Wow, very impressive! I love this kinda’ stuff. :slight_smile:

Dude! That would be awesome! I haven’t got any idea how to implement it, but I’m sure if you’d be able to help me a bit - maybe over a Skype screencast or something - we could make that work.

I would love to have ragdoll robots (because metal is easier to make realistic ingame) that would come attack in waves, and you’d get to slash ‘n’ hack and break 'em apart using swords, fists, and other motion control weapons.

I only started Python / the BGE on Monday, so I don’t know a lot, but I learn fast. I have been using Blender and other 3D tools for 10 years now, so I know the modeling and graphics side quite well.

I originally thought the Razer Hydra would work as a 6-axis joystick, but it doesn’t. An unofficial app was released that gives you up to 9 joystick channels per controller here.

I do not really see why robots would be so much simpler. Regarding the realism I would probably give an advanced combat robot a gun instead of a sword :stuck_out_tongue:

That sounds simple :cool:, so Blender recognizes all 18 axes? Can you test the attached blend and tell me which axis number maps to what, and in which unit (degrees, meters, normalized…)?
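To answer the mapping question empirically, here is a rough sketch of an axis-identification script (the sensor name and wiring are my assumptions, not taken from the attached .blend): move one control at a time and watch which index changes.

```python
# Inside the BGE you would run something like this every logic tick
# (sensor name "joystick" is an assumption):
#
#   from bge import logic
#   joy = logic.getCurrentController().sensors["joystick"]
#   print(describe_axes(joy.axisValues))

def describe_axes(axis_values):
    """Return 'index:raw' strings for every axis that is off-center,
    so moving one control at a time reveals which index it drives."""
    return ["%d:%d" % (i, v) for i, v in enumerate(axis_values) if v != 0]
```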

I am tempted to see if I can hack into my fencing game and put some Hydra controls in, so you can test with some actual AI to fight :evilgrin: (that is, if Blender 2.49b also supports 18 axes - just check if the .blend works)

So probably somewhere in the weekend I will have some time to check in more detail…


joystick_axes.blend (129 KB)

I want to make the game relatively small and very highly polished. Making a human or zombie, with damage, look as good as I could make a robot would be extremely difficult. I understand the limitations of the BGE, and want to work within them to make as real-looking a game as I can. If that means fighting in mostly dark hallways with robots who carry swords (swordbots!!), then that works for me.

Robots wouldn’t require deforms, and when they break, they’d break into pieces - no blood and guts required.

Anyways, the wrapper splits the Hydra up into 4 virtual joysticks… and Blender, although documented to pull in 8 full or 16 partial axes, can only bring in 6 per channel. Thus, I have each sword using two channels: left is 2+4, right is 3+5. The Hydra can wrap its trigger into a joystick axis, and its D-pad on top is also an analog stick… so 9 channels per hand (and 7 buttons).
You can assign these to any joystick channel PPJoy (also required) can accept. Here’s my current config, based on the default joystick wrapper’s settings.

import math
from bge import logic

# Joystick sensor on this controller, set to axis events
# (the sensor name below is whatever you called it in the logic bricks)
cont = logic.getCurrentController()
joy = cont.sensors["joystick"]

# Pull the proper joystick channel out of the list,
# assign it to the property control channel, and give it a range of +/-10
px = joy.axisValues[0] / 10922.666  # side to side, max +/- 10
py = joy.axisValues[1] / 10922.666  # front to back
pz = joy.axisValues[2] / 10922.666  # up 'n' down
# grab the rotation, turn it into 360 degrees and then into radians =)
rx = math.radians(joy.axisValues[3] / 182.0444 * -1)
ry = math.radians(joy.axisValues[5] / 182.0444 * -1)
rz = math.radians(joy.axisValues[4] / 182.0444 * -1)  # steering

Also, I am not currently pulling in the trigger and D-pad.
Here’s how I generally assign them:
padx = 0
pady = 1
trig = 2
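If anyone wants those extra channels too, here’s a minimal sketch (the channel numbers come from the assignment above; the sensor wiring and normalization scale are my assumptions):

```python
# D-pad x/y and trigger, wrapped as analog axes on the extra virtual stick
PADX, PADY, TRIG = 0, 1, 2

def read_extras(axis_values):
    """Normalize the D-pad and trigger channels from raw -32767..32767
    joystick values to -1.0..1.0 floats. In the BGE, axis_values would be
    joy.axisValues from a Joystick sensor watching that virtual stick."""
    scale = 32767.0
    return (axis_values[PADX] / scale,
            axis_values[PADY] / scale,
            axis_values[TRIG] / scale)
```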

I’m using Blender 2.63.

So basically, each Hydra runs into a Python script with two joystick sensors set to “axis” and “all”, so I grab every channel.
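That two-sensors-per-hand setup could be sketched like this (the sensor names are my own, and the BGE-specific lines are shown as comments so the helper runs anywhere):

```python
# In the BGE, one script per hand with two Joystick sensors attached,
# e.g. sensors "chanA" and "chanB" watching channels 2 and 4 for the
# left sword (sensor names here are assumptions):
#
#   cont = bge.logic.getCurrentController()
#   axes = combined_axes(cont.sensors["chanA"].axisValues,
#                        cont.sensors["chanB"].axisValues)

def combined_axes(chan_a_values, chan_b_values):
    """Merge the 6 axes each sensor can see into one 12-slot list, so the
    rest of the script can index all of one hand's channels in one place."""
    return list(chan_a_values) + list(chan_b_values)
```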

Anyhow, this would be pretty rad to get to fight somethin’ with AI =)


I’ve got a playable demo up on the internet for anybody who wants to play with it or see the code! So much fun!!! =)

I’ve done some major updates - and released beta 9.

Download - http://www.bricklightstudios.com/motioncontrol/

Wow, this is truly amazing :smiley:

Not much more to say :stuck_out_tongue:

WOW, wow and wow again, this is amazing!

Nice show of some of the stuff you can do with accurate motion control. I now really miss 3D vision to estimate the distances. And the Hydra shape is a bit odd to swing with; I have a much better grip if I hold the shaft, but then you can’t use the buttons, so that makes it quite hard to actually hit the tofu. Also, somehow the textures of the alley wall and tofu did not load correctly (so sometimes it is even hard to see the tofu coming), while the posters etc. did find the correct textures :spin:…

Thanks Guys!!! =)

I’ve found that it takes a while to get used to playing it, and after a while, you get used to the control scheme, and can get pretty accurate.

I think that having different sized tofu makes it harder to play in 2D, because your main depth cue is size. I might put all the tofu back to a standard size.

As far as textures not loading in, I’m not sure what’s with that. They all worked for me in the file I downloaded.
You’ve got Blender 2.64, correct?

Yes 2.64a, could it be that some textures use a relative path and others don’t?

I was also wondering if the camera matches your head position relative to the arms; I got the impression the aiming was easier when I calibrated the system a bit lower - so not with arms stretched out from the shoulders, but with the hands lowered a bit.

The camera is level with the center of the bat control zone. If you hold your hands at head level, it should calibrate properly.
I’m sure there’s some fancy way to make the camera FOV match your real FOV based on your screen size and viewing distance, and it might make the game easier and help you feel more connected with your bats.
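That “fancy way” is just trigonometry: the horizontal FOV your screen actually subtends is 2·atan(width / (2·distance)). A sketch, with the monitor size and viewing distance below as placeholder numbers:

```python
import math

def physical_fov_degrees(screen_width, viewing_distance):
    """Horizontal field of view (in degrees) that a screen of the given
    physical width subtends at the given viewing distance; use the same
    units for both arguments."""
    return math.degrees(2.0 * math.atan(screen_width / (2.0 * viewing_distance)))

# e.g. a 52 cm wide monitor viewed from 60 cm away:
# physical_fov_degrees(52, 60) -> roughly 47 degrees
```

Setting the game camera’s FOV to that value would make on-screen angles match real arm angles.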

Also, I did have textures that weren’t pointing to relative locations (“//textures*.png”) - which will be fixed in the next release. It’s about 10 different textures.
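For anyone hitting the same thing: Blender paths that start with “//” are relative to the .blend file, and File > External Data > Make All Paths Relative (or `bpy.ops.file.make_paths_relative()` from the console) rewrites the absolute ones. A tiny checker that runs outside Blender (the example paths are just illustrations):

```python
def is_blend_relative(path):
    """True if a texture path uses Blender's '//' convention, meaning it is
    resolved relative to the .blend file and survives being downloaded."""
    return path.startswith("//")
```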

I’ll have a release next week hopefully. =)