I’ve been wondering if this was possible basically since I got my Vive in 2016. It’s the ultimate 3D controller. Imagine mapping it to a rig and controlling it in real time like a 3D puppet: you could press the trigger to open the mouth, or move the eyes with the touchpad, and with Eevee you could see the finished result in real time.
I know Blender is getting VR support, and I’ve seen this same question in the development forums. But I don’t think VR support would necessarily make this possible. I just want to use the controller as a tool, not be immersed in VR.
I was watching Ian Hubert’s Blender Everyday stream yesterday, and he mentioned he was able to get motion tracking working in real time. I don’t know what method he used, but it got me thinking about the Vive again. So I searched for “real time tracking” in Blender and came across Jimmy Gunawan’s video using the iPhone’s face tracking to drive a rig in Blender in real time. He was using something called AddOSC in Blender. This was the first time I’d heard of Open Sound Control (OSC), but from what I understand, it’s a protocol that can bundle together a bunch of analog signals, each with its own address.
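To get a concrete feel for the protocol, here’s a minimal sketch of how a single OSC message is laid out on the wire, using only the Python standard library. The address `/vive/pos` is a made-up example for illustration, not anything AddOSC or TouchDesigner requires:

```python
import struct

def osc_string(s: str) -> bytes:
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((-len(b)) % 4)

def osc_message(address: str, *values: float) -> bytes:
    """Build one OSC message: address, type-tag string, then big-endian floats."""
    tags = "," + "f" * len(values)  # e.g. ",fff" for three floats
    payload = b"".join(struct.pack(">f", v) for v in values)
    return osc_string(address) + osc_string(tags) + payload

# e.g. a hypothetical controller position message
msg = osc_message("/vive/pos", 1.0, 2.0, 3.0)
```

A real setup would just send those bytes over UDP to whatever port the Blender add-on is listening on; the nice part is that every signal gets its own human-readable address.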
If there’s an add-on in Blender for OSC, then maybe that will work for this idea. As it turns out, there are several ways to take the Vive controller data and stream it over OSC. The first one I tried was TouchDesigner. I hadn’t heard much about TD either, but it seems like a really powerful node-based Python scripting platform. It has Vive settings built in and can stream OSC directly.
It’s not ideal though, and I don’t see running three programs at the same time as a great solution. But it works!
Ideally, I’ll get one of these “Vive to OSC” scripts running, but I haven’t figured that out yet.
It would be even better to connect the Vive directly to Blender with OpenVR or something, but that’s just dreaming.
There have been a few threads recently about the OSC add-ons, but I found that in the past month @JPfeP made an amazing add-on called AddRoutes, and it even works with MIDI, which I can’t wait to try out.
I got it all working. Now the problem is getting the movement of the object in Blender right; it doesn’t match up 1:1. I figure I’ll need to map the controller data to an empty and then drive something else with the empty, but I haven’t gotten very far in figuring all that out yet.
Anyway, I’d love to start a conversation about this kind of control method and get people building cool stuff with the tools they already have! This would be a really cool way to control a camera, a character, or so many other things. I just wanted to share!
I used an empty as an intermediary, and I had to switch to quaternion rotation in TouchDesigner. Then there are a couple of corrections in Animation Nodes, which also outputs the object transform.
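For anyone trying the same thing, the kind of correction involved might look like this. This is just a sketch under the assumption that the tracking data comes in an OpenVR-style right-handed Y-up coordinate system while Blender is Z-up; the function names are my own, not from any add-on:

```python
def vr_to_blender_pos(x, y, z):
    """Map an OpenVR-style position (+Y up, -Z forward) to
    Blender's convention (+Z up, +Y forward)."""
    return (x, -z, y)

def vr_to_blender_quat(w, x, y, z):
    """Apply the same axis swap to a rotation quaternion (WXYZ order).
    The swap (x, y, z) -> (x, -z, y) is itself a proper rotation, so the
    vector part of the quaternion transforms exactly like a position."""
    return (w, x, -z, y)
```

Mapping the corrected values onto an empty and then driving the rig from the empty keeps the fix in one place instead of scattered across every constraint.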
It’s officially working. Now I just need to map out the analog button presses and such.
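For the analog inputs, most of the work is a clamped linear remap, e.g. mapping the trigger pull (0–1) onto a shape-key value for the mouth. A tiny helper like this (my own naming, not part of any add-on) would cover most cases:

```python
def remap(value, in_min, in_max, out_min, out_max):
    """Linearly remap value from [in_min, in_max] to [out_min, out_max],
    clamping so the output never leaves the target range."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

# e.g. trigger 0.0..1.0 -> mouth-open shape key, capped at 0.8
mouth_open = remap(0.6, 0.0, 1.0, 0.0, 0.8)
```

Clamping matters because tracking and analog data can overshoot slightly, and you don’t want the rig to snap past its extremes.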