I use Blender on a Windows 10 PC. I have modeled characters that are ready to be rigged and animated in Blender, but I need motion tracking. I have a Kinect for Windows V2, but I'm unable to use it,
because NI Mate doesn't support any Windows version newer than 7, and I can't find any other way to use my Kinect in Blender without NI Mate... Any suggestions? Please help, I'm in real trouble here... Are there any alternatives to NI Mate, or any other way to use my Blender models with the Kinect?
Thank you very much, LoboTommy!!! It's ridiculous that I've got only one comment here regarding my issue... Still lots of problems and question marks, but at least I saw a skeleton in Blender that kind of repeated my motions. In case you've already done some motion capture with this particular video tutorial and also used the rig made by the guy in the video, did it work for you on an actual model? I mean, the process is really complex, with retargeting the human meta-rig to that rig and so on, and I've read the comments where people complain about various issues. For example, if I have to match my rig to the rig created by this guy, does that mean I have to build my models around that particular rig? If I have a fat guy who's really short, how do I use that rig? And there's the flipping to the Y axis (people say it distorts the model). And also fingers: is it necessary to remove the fingers? I don't expect to capture finger motion, but I was hoping to use Blender pose libraries for hand poses: fist, loose hand, pointing finger and so on. Otherwise, how should I do this? I know I've got too many questions, but I really badly need to capture motion; if this won't work, I'll need to buy some expensive device, and that would really be a problem... Please help.
I haven't tried it live on a rigged model. I have retargeted .bvh files using MakeWalk before, though. It was a while ago, but after editing a definition file (.trg) in a text editor it was an easy process per .bvh. I don't think there were major size differences, but the bone count was mismatched. You'll have to experiment, search, or wait for someone more knowledgeable in these matters.
This shows some of MakeWalk and the effects of altering the bone armature (at ~10 minutes):
I've had success with an old pirated copy of the now-dead software Faceshift and a RealSense sensor.
But in modern terms it seems like a 4K BRIO webcam plus iClone is the cheapo way to go, or an iPhone X and some custom software (basically doing what Faceshift did, but somewhat better).
Hey @LoboTommy. I've followed your link and found that this MakeHuman retargeting stuff can be really useful. What I can't figure out yet is how to combine several BVH files into one. You take the armature that is parented to the object, press 'Load and Retarget' in the MakeWalk add-on panel, and the BVH is assigned to the armature + model. But if you then want to add another BVH, you can't load and retarget again, because it will overwrite the previous BVH action. I tried copying the keyframes from the dope sheet of one BVH animation and pasting them into another. It works, but the problem is that the positions are different: e.g. I took a run animation and pasted it into the walk cycle, and now the character walks in one place and then jumps to another place to run... How do I fix that? I mean, how do I make the location and position of the character and armature match up on the axes? A manual frame-by-frame correction of the position is really useless, as it makes the animation jitter...
Something similar could also be done via Excel, Python, or maybe bvhacker.
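For the jump between clips, here's a minimal Python sketch of what I mean: shift the root-position columns of the second clip's motion data so its first frame lands where the first clip ended. It assumes the root joint's first three channels are Xposition Yposition Zposition (the common BVH layout); check the CHANNELS line in your file, and note `shift_clip` is just a name I made up.

```python
# Hypothetical sketch: offset a BVH clip's root position so it continues
# from where a previous clip ended. Works on the raw motion lines
# (whitespace-separated floats, one frame per line).
# Assumes columns 0..2 are the root's Xposition Yposition Zposition.

def shift_clip(clip_frames, prev_last_frame):
    """Return clip_frames shifted so frame 0 matches prev_last_frame's root position."""
    last = [float(v) for v in prev_last_frame.split()]
    first = [float(v) for v in clip_frames[0].split()]
    # Offset only the three root-position channels; rotations stay untouched.
    dx, dy, dz = (last[i] - first[i] for i in range(3))
    shifted = []
    for line in clip_frames:
        vals = [float(v) for v in line.split()]
        vals[0] += dx
        vals[1] += dy
        vals[2] += dz
        shifted.append(" ".join(f"{v:.6f}" for v in vals))
    return shifted
```

This only fixes translation; if the second clip also faces a different direction, you'd need to rotate the root channels too, which is where doing it in the NLA instead gets attractive.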
I'm hazy on actions; a YouTube search is your friend there until the cavalry arrives.
Edit: This works for combining multiple BVH files via Action Strips.
Edit 2: You can also combine BVH files in a text editor; you'll just need to adjust the header data (the frame count). Check out bvhacker too; it's useful and is on GitHub now.
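The text-editor approach above can be scripted. Here's a minimal Python sketch, assuming both files were exported with the same skeleton (identical HIERARCHY and channel order) so the frame lines are compatible; `split_bvh` and `merge_bvh` are names I made up:

```python
# Hypothetical sketch of the text-editor merge: keep the first file's
# HIERARCHY section, append the second file's frame lines, and fix the
# "Frames:" count in the header. Assumes both BVH files share the same
# skeleton and channel layout.

def split_bvh(text):
    """Split BVH text into (header_lines, frame_lines) at the motion data."""
    lines = text.splitlines()
    # Frame data starts right after the "Frame Time:" line.
    ft = next(i for i, l in enumerate(lines) if l.lstrip().startswith("Frame Time:"))
    return lines[:ft + 1], [l for l in lines[ft + 1:] if l.strip()]

def merge_bvh(text_a, text_b):
    """Concatenate text_b's motion onto text_a, updating the frame count."""
    header, frames_a = split_bvh(text_a)
    _, frames_b = split_bvh(text_b)
    total = len(frames_a) + len(frames_b)
    header = [f"Frames: {total}" if l.lstrip().startswith("Frames:") else l
              for l in header]
    return "\n".join(header + frames_a + frames_b) + "\n"
```

Note this just concatenates the motion; you'd still want to align root positions between the two clips (or blend a few frames) to avoid a pop at the seam.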