Affordable motion capture system for less than $1,000 by Enflux

Hey everyone,

I have been working on a side project of mine for a while now with UE4 and Blender. For me, animation is hard, as I am mainly a modeler. I was looking at getting a Perception Neuron, but then I saw Enflux.

What is it?

It is a mocap system that runs about $499 US. It looks really solid.
I have not tried it yet, but I am looking at ordering one.

Obviously the Perception Neuron has hand tracking, but it costs $1,000 to $1,500 total depending on what you want. So any form of cheaper mocap solution is welcome.

My question is: has anyone here tried it?

You get what you pay for. Suits worth having start at around $10,000 to $20,000.

Moved from “Latest News” to “Blender and CG Discussions”

Blender’s completely free, but it’s far from worthless.

It looks like it gives better results than iPi Soft and two Kinects, which costs more than 500 bucks all together, IIRC. The results look super messy and slidey, but you could definitely set yourself up with some good base poses from this.

@OP, your link is dead, by the way.

Fixed the link. :slight_smile:

And it’s terrible. :slight_smile:

Eventually we’ll see a good option, but after my experience with Perception Neuron I’m pretty skeptical.


OK, my two cents, since I was recently researching cheap mocap:
Enflux results look terrible for animation; however, it seems better suited as a controller for VR, where it might be great for enhancing the experience.

This suit delivers somewhat better results: https://www.rokoko.com/en/shop

iPi Soft seems to deliver reasonable quality, but I didn't manage to calibrate my two PrimeSense Carmine 1.08 cameras properly, and the results from the two together were actually worse than from one. But that's just my test.

Perception Neuron isn't flawless either, but it seems to be the best solution for the price.

There are two very simple solutions I'd recommend, and am considering myself:

Import the Kinect's 3D stream into Blender (I remember there was a plugin or something; you can also set up a displacement object and import the stream as a video texture), then use the stream as reference for your animation. It gives you enough information about space-time relations, and you save the time you'd spend cleaning up capture data or trying to retarget it.

Use purchased or downloaded mocap data; there's a huge amount of it, and you only have to animate the parts you can't find in a library.
This data is usually already cleaned, so it saves you a lot of time, especially since somebody may have already recorded the motions you want.
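As a rough illustration of the first option above, here is a minimal sketch of how you might preprocess a captured depth frame for Blender. It assumes (hypothetically; no specific plugin or export tool is implied by this thread) that you have already dumped raw 16-bit little-endian depth frames from the Kinect to disk, and it converts one frame into a Wavefront OBJ point grid that Blender can import as reference geometry for blocking out poses:

```python
import struct

def depth_frame_to_obj(depth_bytes, width, height, scale=0.001, step=4):
    """Convert one raw 16-bit little-endian depth frame into an OBJ
    point grid. `scale` maps raw depth units (e.g. millimetres) to
    scene units; `step` subsamples the grid to keep the mesh light.
    All of this is a hypothetical preprocessing step, not a real
    Kinect/Blender plugin API."""
    depths = struct.unpack("<%dH" % (width * height), depth_bytes)
    lines = []
    for y in range(0, height, step):
        for x in range(0, width, step):
            z = depths[y * width + x] * scale
            # OBJ vertex: image x/y become plane coords, depth becomes z
            lines.append("v %d %d %.4f" % (x, -y, z))
    return "\n".join(lines) + "\n"
```

The resulting `.obj` can be imported via Blender's standard OBJ importer and used purely as a spatial reference while you keyframe by hand.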

It looks like a floating feet simulator.

The Rokoko Smartsuit Pro is an alternative that seems to give better results than this. Results are generally very stable. It's more expensive, but still very cheap relative to traditional mocap setups.

https://www.rokoko.com/en/shop

How does it compare to Perception Neuron?