Animation Prep Studio - Free VR mocap for Blender!

Animation Prep Studio - VR motion capture story builder (optimized for Blender).

A few years ago I was trying to create a VR experience with humanoid animations for training simulators. I found it difficult using a Kinect and other software, so I tried a few SteamVR-based mocap applications and still found that even a single take required many post-processing steps: retargeting, tracking smoothing, finger animations, animating facial blendshapes and visemes, and more.

I felt I could design a single tool (and workflow) to eliminate many of those steps, using a game engine to automatically and simultaneously (and enjoyably) record avatars and props together in real scenes for true immersion. For a couple of years now I have been developing a tool that lets users combine their consumer VR body trackers (or a Kinect + Driver4VR) and their HMDs with a variety of sensor-fusion integrations to create fluid and natural recordings: simultaneous head and body tracking, finger tracking from Knuckles or mocap gloves, facecap, visemes and gaze tracking, all in one environment for the most natural results. It also includes abilities such as ragdoll physics and joints to add effects, or to allow editing and interacting with multi-avatar scenes. It combines much of the necessary automation and features into a single workflow to take away many of the mundane tasks of animating and let artists focus on creating instead of repeating.

The term “getting into character” applies here quite literally: connect yourself to an avatar as completely as possible, then look into a (VR) mirror while acting out a script. And I totally added some Unity projects on GitHub that let users include their own custom avatars, props and scenes just for this purpose!

What problem does this solve that many people have?
I originally developed this tool to solve my own need to create some VR training videos, but it has since matured into a functional storytelling system that can export entire compilations as standard file types, allowing applications such as Blender to import the recorded media. This is useful for anyone who would like to create VR storyboards or experiences, or for character modelers who would like to “become” their creations (FuseCC, Mixamo, Daz3D, Reallusion CC3 and MakeHuman are fully supported).

It is also tailored to users who want a quick way to add character animations to a Blender movie or short film (using the included SceneLoader.blend automation script), and it can easily generate .fbx files for use in Unity games as well.

What does it feel like to use it?
The user experience varies with hardware. Wearing feet and hip trackers (plus optional elbows, knees and chest) gives an enhanced tracking experience, since the avatar can be driven by up to 11 points of body tracking; even the Vive Pro Eye is fully supported for gaze and blink tracking. Extra trackers or not, it’s still lots of fun. Being able to record and then immediately play back in VR has many interesting uses, and being able to create stories and add multiple characters (and props) can be extremely entertaining! :)

Current Release:
AnimationPrepStudio (Lite) - Version: 2.4.4
- Ask for permission to view the Drive!!
- Artists are invited to join the APS Discord

Key highlights:

  • Uses Unity3D™ engine based gaming environment in conjunction with SteamVR™ full-body
    tracking to record, playback and edit captured animation data with tools for rapid compilation.
  • Supports HTC™ Vive Trackers (up to 10 points) for optional full-body tracking.
  • SteamVR Knuckles compatible for individual finger control (default controllers also work).
  • Vive Pro Eye support for eye and blink tracking.
  • Actor, Director, Cameraman and many other virtual roles may be fulfilled by a single user.
  • Audio recording and character lip-syncing.
  • Facial expression recording and editing tools (no additional hardware required).
  • Includes default avatars, props and worlds to get your project rolling immediately.
  • Create avatars from Reallusion CC3, or from MakeHuman™ via the .mhx2 plugin and Blender - See: Creating Custom Avatars.
  • Create custom props and worlds for environment immersion during recordings.
  • Create “story pages” and replay entire compilations on a page-by-page basis.
  • Includes the ability to export entire scenes to Blender™, automatically generating a scenebuilder.blend file that loads the compilation as a Blender Cycles (or EEVEE) project instantly!

Custom Avatar Builder Project - Add custom Reallusion/Makehuman models from .blend files.


Reserved For Future.


Just added 4-D curves feature to assist in creating fight animations:


Reserved For Future Post.

Frame Rate - control
I have just added a “Frame Rate” (fps) control.
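For anyone curious what a frame-rate control does to a take, here is a minimal, hypothetical sketch (plain Python, not APS code) of resampling evenly spaced capture samples to a target fps by linear interpolation:

```python
def resample(samples, src_fps, dst_fps):
    """Linearly resample a list of scalar keyframe values from src_fps to dst_fps."""
    duration = (len(samples) - 1) / src_fps      # seconds covered by the take
    n_out = int(duration * dst_fps) + 1          # output frame count
    out = []
    for i in range(n_out):
        t = i / dst_fps * src_fps                # position measured in source frames
        lo = min(int(t), len(samples) - 2)       # left neighbor index
        frac = t - lo                            # blend factor between neighbors
        out.append(samples[lo] * (1 - frac) + samples[lo + 1] * frac)
    return out

# Downsample one second of 90 Hz capture to 30 fps:
take = [float(v) for v in range(91)]
print(len(resample(take, 90, 30)))  # 31
```

The same idea applies per channel (position, rotation, blendshape weights); a real implementation would interpolate rotations with slerp rather than linearly.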

Projectile’s Mass - firearm prop control

Now recorded characters can be knocked back (using force) when shot:
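For intuition, the knockback can be thought of as a momentum transfer from projectile to character. This is a simplified, illustrative model (the in-engine effect uses Unity's rigidbody forces; the names and numbers here are just examples):

```python
def knockback_velocity(projectile_mass, projectile_speed, character_mass):
    """Impulse model: the projectile's momentum is dumped into the character."""
    impulse = projectile_mass * projectile_speed  # kg*m/s
    return impulse / character_mass               # resulting character velocity, m/s

# A 0.05 kg round at 400 m/s hitting an 80 kg ragdoll:
print(knockback_velocity(0.05, 400.0, 80.0))  # 0.25
```

This is why the projectile's mass control matters: doubling the mass doubles the impulse, and therefore the knockback.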


Hey, this is looking pretty cool, and pretty helpful as well. One thing we were wondering while doing our first testing of the software: is it possible for the exported .blend files to also include empties carrying the raw position and rotation data of each tracker, controller and headset? Thanks for the great work!

Hi ReelCaptivation, thank you for reaching out and being so kind. And welcome to BlenderArtists! 🙂

I like your idea of letting users record and export controller, tracker or HMD raw data (as simple props) for Blender. I have just added it to the to-do list; it should not be very difficult to implement. Thanks for the suggestion!

I am currently adding many new features planned for the next release; I apologize if it takes a few weeks to get to this, but I will attempt to add this feature soon.


That’s great to hear! And no need to apologize, development takes time. Thanks for being receptive to new feature requests.


Dude, I had to make an account to thank you for your work. I made a silly video using your software, and although I still have a lot to learn about it, I had a blast using it. Here’s a link to the video and you are free to use it however you want (though I do curse a lot in it):


Hello Blended_Blue,
Thanks for making the software.
I have a problem with importing a custom avatar - in this case a ybot. I’ve tried several times thinking I’m screwing something up, but the imports all seem to be too short judging by my hand position vs. head position (e.g. I have to reach way above my head to touch the neck of my imported avatar), and I have to max out the slider in APS to get it remotely close. I’ve also tried scaling the avatar up/down before converting, and that doesn’t seem to have any effect. Do you or anyone else have any ideas? Heck, I’d even take the ybot you created with the top hat in the hopes I can figure out what I did wrong.

Also, I have a few suggestions:

  1. Controller position/rotation offset. It’d be nice if I could adjust the position of the avatar’s hands relative to the controller, allowing “palm to palm” contact between hands without the controllers getting in the way. This would also allow rotational corrections to the direction of the index finger - currently, with Index controllers, the index finger in real life and in APS do not match up.

  2. BVH file export naming. I just take the automatically named humanoid_x.bvh file, import it into Blender directly, and export it with other BVH files as an .fbx file. So it’d be nice if the .bvh files were named according to what they’re saved as in APS.

  3. Max height adjustment? If I can’t fix the custom avatar import issue, it’d be nice to have an even greater max height value, since I need the position of my head/controllers to be relatively accurate.

My personal project involves sign language, so finger/hand positions (especially in relation to hand/body/head) are essential.
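Suggestion 1 above boils down to composing a fixed, user-tunable local offset with the tracked controller pose each frame. A minimal sketch in plain Python (illustrative names, not APS code), with quaternions as (w, x, y, z):

```python
def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q (q * v * q_conjugate)."""
    w, x, y, z = quat_mul(quat_mul(q, (0.0, *v)), (q[0], -q[1], -q[2], -q[3]))
    return (x, y, z)

def apply_offset(ctrl_pos, ctrl_rot, offset_pos, offset_rot):
    """Hand pose = controller pose composed with a local, user-tunable offset."""
    pos = tuple(c + o for c, o in zip(ctrl_pos, quat_rotate(ctrl_rot, offset_pos)))
    rot = quat_mul(ctrl_rot, offset_rot)
    return pos, rot

# With an identity controller rotation, a 5 cm forward offset passes straight through:
ident = (1.0, 0.0, 0.0, 0.0)
print(apply_offset((0.0, 1.5, 0.0), ident, (0.0, 0.0, -0.05), ident))
```

Because the offset is applied in the controller's local frame, one calibrated offset would hold for any hand orientation.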

Hey, I am sorry that you are having trouble with avatars.

At the moment we are working on redesigning the builders into a single APK (as an importable unityasset), which includes many fixes and improvements. I hope that release addresses the issue you are experiencing.

Pardon me, how do we install this? I didn’t see it described in the original post. Also:
does this support Valve’s Index controllers?

To install the APS runtime (VR mocap recorder), download and extract it, then run AnimationPrepStudio_Lite.exe. Windows only.

Also SteamVR must be installed and running.

And yes Index controllers (Knuckles) are supported. I have also added bindings for Vive controllers, Vive trackers and Oculus touch controllers.

Please note: if you are using any Vive trackers, be sure to set each tracker’s role to “Disabled” (located in SteamVR’s Manage Trackers menu):

If you need assistance installing the asset or avatar builders I would be glad to help as well.

Does the prop builder support props with moving parts? For example, grabbing a part and moving it while holding the prop, or holding one prop with more than one hand?

Because the prop builder is a simple Unity project you may add Animation components to include motion on the prop. Simple Unity animations should work, particle systems should work, hinges and joints might work… adding rigid bodies is not recommended.

However, only the prop’s position and rotation are recorded, so none of those additional animations would be exported to Blender.

And yes dual hand grabbing is supported on all props.

I see. Is there any way I could add it so that additional movements (say, moving the magazine, if it were a bone in the prop) would get exported to Blender? I’m thinking of trying to use this to make animations for an FPS game.

To accomplish that would require two props, so both can be recorded as individual animations: one prop attached to the other, with recording beginning once it is grabbed or detached. It sounds like a nice idea for a feature; it is not currently supported, but I could possibly add similar functionality after the next update…