MocapNET - motion capture for everyone!

I’ve been looking for motion capture software for a while now, and I’ve finally found one: MocapNET https://github.com/FORTH-ModelBasedTracker/MocapNET/tree/mnet1
This program creates a BVH file from just a single video, which can then be imported into Blender, for example.
https://www.youtube.com/watch?v=Jgz1MRq-I-k
Download for Linux x64 r01:
https://www.mediafire.com/file/39ln3hy2ecxs6uq/MocapNet_BVH_Script_r01.zip/file (605.93 MB)
The archive contains a “demo.webm”, and the script is run from the terminal:
./make_bvh.sh demo.webm 375
“demo.webm” is the video file and “375” is the number of frames from which the BVH should be generated.
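If you don’t know how many frames your clip has, you can work out the second argument yourself. A small sketch (the ffprobe command in the comment counts frames exactly if FFmpeg is installed; the duration and frame rate below are made-up example values):

```shell
# Estimating the frame-count argument for make_bvh.sh.
# With FFmpeg installed, ffprobe can count the frames exactly:
#   ffprobe -v error -select_streams v:0 -count_frames \
#     -show_entries stream=nb_read_frames -of csv=p=0 demo.webm
# Without it, duration (seconds) times frame rate gives an estimate:
duration=15   # example: a 15-second clip
fps=25        # example: 25 frames per second
frames=$((duration * fps))
echo "$frames"   # 375
```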
During processing some windows will open and the terminal will print progress; the result is written to “out.bvh”.
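Once “out.bvh” exists, Blender can import it via File > Import > Motion Capture (.bvh), or headlessly from the terminal. A sketch, echoed here as a dry run (assumes Blender is on your PATH):

```shell
# Dry run: print a headless Blender command that imports out.bvh.
# bpy.ops.import_anim.bvh is Blender's built-in BVH importer.
blender_cmd="blender --background --python-expr \"import bpy; bpy.ops.import_anim.bvh(filepath='out.bvh')\""
echo "$blender_cmd"
```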
The demo video is actually not ideal, because the person rotates and moves very quickly, but it gives you an impression of the program.
It should also be possible to install Linux in “VirtualBox” and then run my MocapNET script. https://www.virtualbox.org/
Yeah - motion capture for everyone! :smiley:


I forgot to compile the dataset/gestures and am having problems building the MocapNET binaries with BVH output enabled.
Please do not download “r01”; it will not produce a good BVH.
I hope I can patch this soon.

I was able to patch it, and I also made a video showing the result I achieved. Pretty good for just a single video. :slight_smile:


Download for Linux x64 r02:
https://www.mediafire.com/file/8ogp98c8cva8pbi/MocapNet_BVH_Script_r02.zip/file (461.49 MB)
One more tip: stand in a T-pose for the first few seconds, then start the motion you want to capture.

The video you use should have 24 frames per second, not 50 or 60 FPS; with a high frame rate the result is noticeably worse.
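If your source clip has a higher frame rate, FFmpeg can re-encode it to 24 FPS first. A minimal sketch (file names are placeholders; the command is only echoed here as a dry run, paste it into a terminal once FFmpeg is installed):

```shell
# Build the FFmpeg command to convert a 50/60 FPS clip to 24 FPS.
# "clip60.webm"/"clip24.webm" are placeholder file names.
cmd="ffmpeg -i clip60.webm -filter:v fps=24 clip24.webm"
echo "$cmd"   # printed for review; run it to do the conversion
```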
The motion captures I made are usable but not perfect; I’m still experimenting with how to get the best quality.