Mocap Fusion - Free VR mocap for Blender!

VR motion capture sandbox (optimized for Blender)

www.MocapFusion.com

I create motion capture games for artists. Here’s one of my free games, created for all Blender artists! :slight_smile:
If you have a VR headset, you can create stunning animations in minutes, using the bundled automation to produce Blender renders with your own avatars.
It’s free! Please support me on Patreon. Thanks!


The term “getting into character” applies quite literally here: you connect yourself to an avatar as completely as possible, then look into a (VR) mirror while acting out a script. I’ve also published Unity projects on GitHub that let users include their own custom avatars, props, and scenes for exactly this purpose!

What problem does this solve that many people have?
I originally developed this tool to meet my own need for creating VR training videos, but it has since matured into a functional storytelling system with the ability to export entire compilations as standard file types, allowing applications such as Blender to import the recorded media. This is useful for anyone who would like to create VR storyboards or experiences, and for character modelers who would like to “become” their creations (Fuse CC, Mixamo, Daz3D, Reallusion CC3, and MakeHuman are fully supported).

It is also tailored to users who would like a quick way to add character animations to a Blender movie or short film (using the included SceneLoader.blend automation script), and it can easily generate .fbx files for use in Unity games as well.
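The .fbx step can also be scripted outside the app. As a hedged sketch (the file names are hypothetical, and this is not the SceneLoader.blend script itself), a headless Blender invocation can convert an exported .blend into an .fbx using Blender’s built-in exporter:

```python
from typing import List

def blender_fbx_command(blend_file: str, fbx_out: str,
                        blender_exe: str = "blender") -> List[str]:
    """Build a headless Blender command line that opens a .blend file
    and exports it as FBX via Blender's built-in exporter."""
    export_expr = ("import bpy; "
                   f"bpy.ops.export_scene.fbx(filepath={fbx_out!r})")
    return [blender_exe, "--background", blend_file,
            "--python-expr", export_expr]

# Pass the resulting list to subprocess.run(...) to perform the export:
cmd = blender_fbx_command("mocap_take_01.blend", "mocap_take_01.fbx")
```

Building the argument list as a function makes it easy to batch-convert a whole folder of recorded takes in one loop.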

What does it feel like to use it?
The experience varies with your hardware. With feet and hip trackers (plus optional elbows, knees, and chest), you get enhanced tracking: the avatar can be driven by up to 11 points of body tracking, and the Vive Pro Eye is fully supported for gaze and blink tracking. With or without extra trackers it’s still lots of fun; being able to record and then immediately play back in VR has many interesting uses, and creating stories with multiple characters (and props) can be extremely entertaining! :)


Current Release:
APS LUXOR (beta) - Version: 3.3.6

- Artists are invited to join the APS Discord


Capabilities:

  • Export mocap and create scenes in Blender™ instantly.
  • Full-body tracking with HTC™ Vive Trackers (up to 11 optional points).
  • Record, play back, pause, slow-mo, and scrub mocap in VR.
  • Customizable IK profiles and avatar parameters.
  • SteamVR Knuckles support for individual finger articulation.
  • Quest 2 optical finger tracking app for individual finger articulation and finger separation.
  • Vive Pro Eye blink and gaze tracking support.
  • Sidekick iOS face capture app (TrueDepth markerless AR facial tracking).
  • User-customizable worlds, avatars, and props may be built for mocap using the APS_SDK.
  • Compatible with existing Unity3D™ avatars and environments.
  • Supports custom shaders on mocap avatars.
  • DynamicBone support for adding hair, clothing and body physics simulation to avatars.
  • Breathing simulation for added chest animation.
  • Add/record/export VR cameras for realistic camera mocap (e.g. a VR cameraman effect).
  • Place “streaming” cameras for livestreaming avatars to OBS or as desktop overlays.
  • Microphone audio recording with lip-sync visemes and recordable jaw bone rotation.
  • Storyboard mode, save mocap experiences as pages for replaying or editing later.
  • Animatic video player, display stories and scripts, choreograph movement.
  • Dual-handed weapon IK solvers for natural handling of carbines.
  • Recordable VTOL platform for animating helicopter flight simulation (e.g. news choppers).
  • VR Camcorders and VR selfie cams may be rigidly linked to trackers.
  • VR props and firearms may be rigidly linked to trackers.
  • Ghost curves for visualizing the future locations of multiple avatars in a scene.

Sidekick Face Capture iOS app:
APS Sidekick iPhone app - Version: 1.2.1


Custom Avatar Builder Project - Add custom Reallusion/Makehuman models from .blend files.



Just added a 4-D curves feature to assist in creating fight animations:
https://youtu.be/GFz5oexsJjo



Frame Rate - control
Just added a “Frame Rate” (fps) control.

Projectile’s Mass - firearm prop control

Recorded characters can now be knocked back (by the projectile’s force) when shot:
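As a rough illustration of why a projectile-mass control matters (this is not APS’s actual physics code, just conservation of momentum), the knock-back a character receives scales with the projectile’s momentum:

```python
def knockback_speed(projectile_mass_kg: float,
                    projectile_speed_ms: float,
                    character_mass_kg: float) -> float:
    """Approximate recoil speed of the character, assuming the full
    projectile momentum (p = m * v) is transferred on impact."""
    impulse = projectile_mass_kg * projectile_speed_ms  # kg*m/s
    return impulse / character_mass_kg                  # m/s

# Doubling the projectile's mass doubles the knock-back speed:
v_light = knockback_speed(0.01, 400.0, 80.0)
v_heavy = knockback_speed(0.02, 400.0, 80.0)
```

In practice a game engine would apply this as an impulse on the ragdoll rather than a raw velocity change, but the mass slider’s effect is the same.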


Hey, this is looking pretty cool, and pretty helpful as well. One thing we were wondering, as we do our first testing of the software: is it possible for the exported .blend files to also contain empties carrying the raw position and rotation data of each tracker, controller, and headset? Thanks for the great work!

Hi ReelCaptivation, thank you for reaching out and being so kind. And welcome to BlenderArtists! :slightly_smiling_face:

I like your idea of letting users record and export raw controller, tracker, or HMD data (as simple props) for Blender. I’ve just added it to my to-do list, and it should not be very difficult to implement. Thanks for the suggestion!

I am currently adding many new features planned for the next release; I apologize if it takes a few weeks to get to this, but I will attempt to add this feature soon.


That’s great to hear! And no need to apologize, development takes time. Thanks for being receptive to new feature requests.

@Blended_Blue

Dude, I had to make an account to thank you for your work. I made a silly video using your software, and although I still have a lot to learn about it, I had a blast using it. Here’s a link to the video and you are free to use it however you want (though I do curse a lot in it):


Hello Blended_Blue,
Thanks for making the software.
I have a problem with importing a custom avatar - in this case a Y Bot. I’ve tried several times, thinking I was screwing something up, but the imports all seem too short judging by my hand position vs. head position (e.g. I have to reach way above my head to touch the neck of my imported avatar), and I have to max out the slider in APS to get it remotely close. I’ve also tried scaling the avatar up/down before converting, but that doesn’t seem to have any effect. Do you or anyone else have any ideas? Heck, I’d even take the Y Bot you created with the top hat, in the hopes I can figure out what I did wrong.

Also, I have a few suggestions:

  1. Controller position/rotation offset. It’d be nice if I could adjust the position of the avatar’s hands relative to the controller, allowing “palm to palm” contact between hands without the controllers getting in the way. This would also allow rotational corrections to the direction of the index finger, since with Index controllers the index finger in real life and in APS currently do not match up.

  2. BVH file export naming. I just take the automatically named humanoid_x.bvh file, import it into Blender directly, and export it with other BVH files as an FBX file. So it’d be nice if the BVH files were named according to what they’re saved as in APS.

  3. Max height adjustment? If I can’t fix the custom-avatar import issue, it’d be nice to have an even greater max height value, so I can keep the positions of my head and controllers relatively accurate.

My personal project involves sign language, so finger/hand positions (especially in relation to the hand/body/head) are essential.

Hey, I am sorry that you are having trouble with avatars.

At the moment we are redesigning the builders into a single APK (as an importable Unity asset), which includes many fixes and improvements. I hope that release addresses the issue you are experiencing.

Pardon me, how do we install this? I didn’t see it described in the original post. Also, does this support Valve’s Index controllers?

To install the APS runtime (the VR mocap recorder), first download AnimationPrepStudio_Lite.zip and extract it, then run AnimationPrepStudio_Lite.exe. Windows only.

Also SteamVR must be installed and running.

And yes, Index controllers (Knuckles) are supported. I have also added bindings for Vive controllers, Vive trackers, and Oculus Touch controllers.

Please note: if you are using any Vive trackers, be sure to set every tracker’s role to “Disabled” (located in SteamVR’s Manage Trackers menu):


If you need assistance installing the asset or avatar builders, I would be glad to help as well.

Does the prop builder support props with moving parts? For example, can you grab a part and move it while holding the prop, and can you hold one prop with more than one hand?

Because the prop builder is a simple Unity project, you may add Animation components to include motion on the prop. Simple Unity animations should work, particle systems should work, and hinges and joints might work; adding rigid bodies is not recommended.

However, only the prop’s position and rotation are recorded, so none of the additional animations would be exported to Blender.

And yes, dual-hand grabbing is supported on all props.

I see. Is there any way I could add it so that additional movements (say, moving the magazine, if it were a bone that is part of the prop) would get exported to Blender? I’m thinking of using this to make animations for an FPS game.

Accomplishing that would require two props, so both can be recorded as individual animations: one prop attached to the other, with recording beginning once it is grabbed or detached. It sounds like a nice idea for a feature; it is not currently supported, but I could possibly add similar functionality after the next update…