OpenVR Tracker Streaming

Hi all!

As a weekend project, I made an add-on to stream VR tracking data into Blender for, among other things, quick and dirty mocap!


Have SteamVR installed and working

Install the OpenVR python module from
***Note: Blender ships with its own copy of Python, so the module must be installed into that copy for the add-on to find it. For example, on my Blender 2.82 install this location was 'C:\Program Files\Blender Foundation\Blender 2.82\2.82\python'
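One convenient way to install into Blender's bundled interpreter is to run pip from Blender's own Python Console, since there `sys.executable` points at the bundled interpreter. A small sketch (the `ensurepip` step is an assumption about a typical Blender build; the actual install calls are commented out so you can check the command first):

```python
# Sketch: install the 'openvr' package into Blender's own Python.
# Run from Blender's Python Console so sys.executable is the bundled interpreter.
import sys

def pip_install_command(package):
    """Build a pip invocation for whatever interpreter is running this script."""
    return [sys.executable, "-m", "pip", "install", package]

cmd = pip_install_command("openvr")
print(" ".join(cmd))  # inspect the command before running it

# Inside Blender, uncomment to actually install:
# import subprocess
# subprocess.check_call([sys.executable, "-m", "ensurepip"])  # make sure pip exists
# subprocess.check_call(cmd)
```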

Install the addon:
Version 1.0: (8.0 KB)

Version 1.1: (13.1 KB)
-Initial controller support for trigger/trackpad! Example:

-A hacky attempt to maintain rotational continuity (no sudden axis flipping), which makes for easier editing of captured data

Version 1.2: (13.8 KB)
-Additional controller support for the Vive menu buttons (Oculus A/X buttons) and the side grip button
-Controller inputs now map to anything in the scene's namespace (so you can target scene or world settings, for example). This requires the user to know the Python path to that property, but it's the most flexible way I can think of for now
-Bone tracking was wonky in 1.1; that should be fixed


Step 1:
Navigate to the OpenVR tab that should now be in your 3D View’s UI panel.

Step 2:
Click Start Streaming to launch a session. SteamVR will start, and you should see a list of connected tracking devices.

Step 3:
Clicking on any tracker gives you an option below it to specify the target it will stream to. Pick whatever object in your scene you want! For the special case of armatures, you are presented with the additional option of targeting a specific bone.

Step 4:
You can then adjust the offset transforms of your target relative to the tracker (i.e. if the tracker orientation isn't the way you want it, etc.)

Step 4b:
For controllers, there are additional fields that allow their inputs (trackpad/buttons/etc.) to drive other properties in the scene. You define them by their Python path relative to the scene. Turning on Python tooltips in Preferences > Interface, then right-clicking on a property and selecting 'Copy Data Path', can help you rapidly figure out whatever path you need. It's worth noting, though, that the copied data path is relative to the specific data block it's in, so you still need to modify it to be relative to the scene. Some examples:

A light’s brightness:

A bone's x location:

The world background strength (a good example of when it's helpful to use 'Copy Data Path'!):
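For reference, scene-relative paths for the three cases above would look roughly like the following (the names 'Light', 'Armature', 'Bone', and 'Background' are placeholders — substitute the names of your own objects and nodes):

```
objects['Light'].data.energy
objects['Armature'].pose.bones['Bone'].location[0]
world.node_tree.nodes['Background'].inputs['Strength'].default_value
```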

Step 5:
Press play on the timeline; the tracked objects should now update in real time and respond to controller inputs.

Step 6:
If you want to record these actions, press the record button on Blender’s timeline while playing.
***Note: This doesn't make use of keying sets; for now it just tells the add-on to set loc/rot/scale keyframes on all tracked objects/properties

Let me know how it works, and suggestions are always welcome!


that's cool, i am going to try it for sure,
although one quick question: is there any way to assign controller buttons to move bones from one place to another?

like for instance i want to drive a bone using the A button,
i found this code online but i didn't know how to implement it in your script, maybe it might go somewhere in the puppet function

if OVRInput.Get(OVRInput.Button.One);

i have an oculus rift s, let me know if i can help you in any way to improve it further


getting controller button states should be doable, the interesting challenge would be figuring out how to bind them to things in blender.

a button is likely going to return a 0 or 1 value, so there'd need to be an interface for choosing what that gets mapped to, almost like a driver setup.

Yes, that would be like setting up a driver,
I don't know how that works with other VR systems like Vive and Quest, i only have a Rift so far, i can test it on that,

Moreover, if that is a tough task then please help me with one button, like how do i call it from the openvr module,
I'm not a coder but i do a little bit of coding sometimes, so if i get a push start i'll try to find some way around,

What i want to do is: i will set up an action constraint for making fist, point, and other actions, and then connect them with a few single bones, so when we select that bone and translate it to 1 the character's fingers will move from one pose to another

Then i want to assign that single bone's value to a rift controller's button, so when i press a button on the rift, it will move that bone to 1

In simple words, as you have said … something like a driver setup


I updated my first post with a version that offers some initial support for input from the controllers!

Right now it's just the trigger and trackpad, but it shouldn't be too hard to add side/menu buttons in the future

You should be able to point each input at any object property that is a float value. You just need to know the path to the property. It also lets you remap the min/max of the input to whatever values make the most sense for your needs
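The min/max remapping described above is presumably just a linear interpolation; a minimal sketch (the function name and signature are my own, not the add-on's):

```python
def remap(value, in_range, out_range):
    """Linearly map value from in_range to out_range.

    e.g. a trigger reporting 0..1 can drive a lamp energy over 0..10.
    """
    in_min, in_max = in_range
    out_min, out_max = out_range
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

print(remap(0.5, (0.0, 1.0), (0.0, 10.0)))  # → 5.0
```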

that is super cool

ok i tried it, it's working,
added location.x in the property field and it works like a charm,
thanks a lot
i tried adding other buttons,
i compared your script with the openvr init file, sorry if it seems noobish,
but to understand how you get the trigger and trackpad terms,
i did not understand much,
i tried adding the same functionality to button A
under handle controller i added this

set_target(tracker.button_A_target, tracker.button_A_property, pControllerState.rAxis[0].x, [0,1], [tracker.button_A_min, tracker.button_A_max])

then in draw input i tried this

draw_input('buttonA', 'buttonA_target', 'buttonA_property', 'buttonA_min', 'buttonA_max')

and saved those properties under ovrtrackeritems

buttonA_target: bpy.props.StringProperty(name='buttonA Target')
buttonA_property: bpy.props.StringProperty(name='buttonA Property')
buttonA_min: bpy.props.FloatProperty(name='buttonA Range Minimum', default=0)
buttonA_max: bpy.props.FloatProperty(name='buttonA Range Maximum', default=1)

this does give me input for button A but it's not working,
we could have a few more settings to complete it, like,
the grip button,
buttons x, y, a, b,
and the menu button as well,

thank you so much for this awesome addon,

so to do the buttons, it's a little bit different. they don't come from rAxis, but instead from ulButtonPressed. it's been a hodgepodge of online documentation trying to find the bindings in openvr, but i should be able to update to get the oculus equivalents of the A/X buttons and the side grip (although i'm testing on vive controllers so i have no way to be sure right now!).

also, since i think openvr was originally designed for the vive, the documentation i've been able to find suggests Y/B on the oculus aren't supported because there's no equivalent on the vive. hopefully having a few buttons is better than nothing! and of course if new information comes to light on how to get Y/B or other buttons it's always possible to update.
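for anyone curious, ulButtonPressed is a 64-bit mask where bit N corresponds to button ID N. a self-contained sketch of the check (the enum values are copied from my reading of the SteamVR openvr.h headers — in the python module they'd be the openvr.k_EButton_* constants — so treat the exact numbers as assumptions):

```python
# EVRButtonId values as I understand them from openvr.h (assumption):
k_EButton_ApplicationMenu = 1   # Vive menu button (A/X on Oculus Touch)
k_EButton_Grip = 2              # side grip button
k_EButton_A = 7                 # dedicated A button, where present

def button_pressed(ul_button_pressed, button_id):
    """True if the bit for button_id is set in the controller state mask."""
    return bool(ul_button_pressed & (1 << button_id))

# e.g. with a state mask that has only the grip bit set:
mask = 1 << k_EButton_Grip
print(button_pressed(mask, k_EButton_Grip))            # → True
print(button_pressed(mask, k_EButton_ApplicationMenu)) # → False
```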

one other note: upon further contemplation, i'm pretty sure the tracking approach i was using to set tracked bone orientations has some issues (mapping world space to pose space). i should be able to sort it out, although it's some tricky matrix math. i'll try to update for both things in time!

updated in the first post to version 1.2 to include the grip and menu buttons (the menu button maps to A/X on oculus). also changed how the properties are defined to something a little more flexible, although it's a little less user friendly to set up. all properties are now relative to the scene:

objects['Your Object Name'].property

everything is working awesome, thanks man,

i faced a few issues,
like initially, no matter where i start, the character's head got twisted 180 degrees
so to solve that i have to rotate the rig 180 degrees, that's quite ok, but if i rotate the head from the addon panel then it behaves very weirdly, the character's head starts to rotate around some different pole

same goes with the hand, suppose the hand's global position was 1,1,1
now in vr we see it's not aligned correctly so we change it to 1,2,1
now the location is set, but when we rotate the controller … not our hand, only the controller
then it will rotate around the previous pole (1,1,1)

at the end of this video i was only rotating the controller, not my entire hand, but it is taking rotation from the previous position, you can see that

is there anything i have to keep in mind while rigging?
because i tried lots of things, but i never managed to get hand tracking that good,
a slight offset is ok, but in the video you can see it's not a slight offset,
is there anything related to scale? i tried changing the rig scale but nothing works. how do i keep my character's hands proportionate to my own hands?

any pointers would be much appreciated, steve :slight_smile:

just to confirm, you’re using version 1.2?

i know there were some unreliable orientation issues for bones in 1.1 that i tried to address in 1.2.

outside of that, some considerations that might be helpful:

head twisting 180 degrees i think is the result of what headsets consider to be a forwards vector vs your head bone. if both head and hands are pointing the wrong way, you can rotate your rig, but if the head and hands orient in opposite directions, that's where the offsets might be handy.

the rotational offsets are in the tracking device's local space, not the bone's local space (bone local space gets super weird - its rotations are relative to its parent's rotation, so there's no set, singular relationship between a given bone orientation and the global orientation). so essentially offsets can be a little fiddly: think of it as doing additional rotation to the tracker to get its axes to align with your bone, rather than thinking of it as a rotation of your bone.

also, the offsets right now are applied after the tracker transformation, so if you do specify a positional offset and then rotate your controller, the bone bound to the controller will still rotate around the same point in space, just now at a distance. the offset is more intended to correct for weird orientational centres, like the vive controllers, which place the rotational centre at the end of the controller, a distance from where your actual wrist joint is.

there'd probably need to be some additional controls to more robustly correct for proportion and scale differences to allow more flexible retargeting. right now it's much more akin to a literal puppet: you can grab and manipulate parts of the puppet, but are limited by the constraints of the relative proportions of your body and the puppet. the easiest solution might be a "pre" offset that manipulates the global space of the tracking data, so tracker movements themselves can be made larger/smaller, moved around, etc.
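as a toy illustration of that order-of-operations point (plain 2D matrices, my own sketch, not the add-on's code): an offset applied after the tracker transform gets rotated along with the tracker, while a hypothetical "pre" offset would just shift the tracking data in world space:

```python
import math

def matmul(a, b):
    """Multiply two 3x3 row-major matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def translation(tx, ty):
    return [[1.0, 0.0, tx], [0.0, 1.0, ty], [0.0, 0.0, 1.0]]

def rotation(angle):
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply(m, point):
    x, y = point
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

tracker = rotation(math.pi / 2)  # controller rotated 90 degrees in place
offset = translation(1.0, 0.0)   # a 1-unit positional offset

# post-offset (current behaviour): the offset itself is rotated, so the
# bone swings around the tracker's pivot: the origin lands near (0, 1)
p_post = apply(matmul(tracker, offset), (0.0, 0.0))

# pre-offset (hypothetical): the tracking data is shifted in world space
# first, so rotating the controller in place leaves the point at (1, 0)
p_pre = apply(matmul(offset, tracker), (0.0, 0.0))
```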

obviously having too many of these offset controls can get abstract to work with. i’d have to think on whether there is a more artist friendly way to get everything lined up nicely.

p.s. if your rig is share-able, i can take a look to see how i might line it up using the offsets (or see if it reveals any new bugs in my code!)

okay, i'll message you my file soon. thanks for the explanation man, rotation and location and all that local and global space stuff creates lots of issues, but in the end the good thing is it's working well, whatever we have is working really well,


Does this support Valve Index controllers? I have a pair of them and I am curious if they are supported :slight_smile:


Question: where do I install Version 1.2?


I am very interested in this thread.
I would like to see this all explained in a video. A step by step video.
Even the hardware specs you'd need (model and version) for the VR stuff (which oculus models and controllers it recognizes), etc…
Fantastic post thread.


@HeadClot I only have vive controllers to test with currently, and i'm still getting a feel for how to translate the openvr api. likely at least some of the mappings should be shared with other input devices, and since valve makes both the index and openvr, full support is surely in there somewhere!

@Sean_Lake1 Installation should be the same as any other addon, are you familiar with that process?

@DavidRivera, yup i'm definitely intending to do that! Based on @draguu's initial testing, i'm first trying to think of ways to improve the user experience for seeing how the tracking data is translating to objects in the scene.

i’ve had a few thoughts on how to tackle this, including potentially a node-based solution which should be fun since i haven’t tried making custom nodes before! :smiley:


Hi Shteeve!

Yeah, usually to install there's a zip file that I just point Blender to install. I've not had to install a .py (Python) file directly before. That's why I asked :slight_smile: Thanks.

Ok, so I guess I learned something :). I can install just a .py from the install addon dialog :). However, now when I press the Start Streaming button in the new UI, I get an error:

Sorry to be a pain.

Thanks in advance.


you will have to have your controllers and devices ready in order for it to run properly,
i suppose you are using steamvr, right?


yeah, from looking at that error, it seems like it happens in the openvr library itself (which it does look like you have installed properly), so i wonder if there is an issue with it launching steamvr?