The Art of Retargeting, or the Superpowers that Suck
As I've been working with motion capture for a long time, on triple-A games and commercial films, I was disappointed that Blender did not handle retargeting very well. The Motion Capture Tools add-on was a good attempt, but it was not supervised by post-animation users: the approach is too constrained by the old BVH file format. And the fact that we cannot import the animation directly from the mocap format onto an existing character in the scene is not pipeline-friendly.
There is a lot to say about this subject.
A long time ago I wrote a script (with my poor coding skills) to create an Art-Net server, and released it here a few years ago for the game engine. I have since ported it to the Blender 2.8 viewport for other projects.
The idea for the mocap pipeline, for me, would be to drive a structure of Empties (like in the attached file) over the network from any Python-friendly software (MotionBuilder/Blade/Vicon tools, etc.), transporting a few floats or vectors to feed them.
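As a rough sketch of what "a few floats or vectors" per Empty could look like on the wire, here is a minimal pack/unpack pair. The packet layout, the function names, and the 15-byte name field are my own assumptions for illustration; they are not taken from the actual Art-Net script.

```python
import struct

# Hypothetical payload layout (an assumption, not the real protocol):
# for each marker/Empty, a 15-byte null-padded name followed by
# three little-endian 32-bit floats (x, y, z).
MARKER_FORMAT = "<15s3f"
MARKER_SIZE = struct.calcsize(MARKER_FORMAT)  # 27 bytes per marker

def pack_markers(markers):
    """Serialize {name: (x, y, z)} into one UDP-sized payload."""
    payload = b""
    for name, (x, y, z) in markers.items():
        payload += struct.pack(MARKER_FORMAT, name.encode()[:15], x, y, z)
    return payload

def unpack_markers(payload):
    """Inverse of pack_markers: payload -> {name: (x, y, z)}."""
    markers = {}
    for offset in range(0, len(payload), MARKER_SIZE):
        raw_name, x, y, z = struct.unpack_from(MARKER_FORMAT, payload, offset)
        markers[raw_name.rstrip(b"\x00").decode()] = (x, y, z)
    return markers
```

On the Blender side, a modal operator or an `bpy.app.timers` callback could receive such a payload from a UDP socket and copy each vector onto the matching Empty, roughly `bpy.data.objects[name].location = (x, y, z)`; on the mocap side, any Python-friendly package (MotionBuilder, Blade, etc.) just packs and sends.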
In this file there is only a bunch of bones, driven by an FBX raw-quality set of markers (one of the several approaches we use before retargeting and post-animating in MotionBuilder). No network stuff at the moment.
It could be more polished, but it's a rough study of how to post-animate without hegemonic software.
Proto_Man_Skin_02_Retargeting.blend.zip (598.3 KB) Blender 2.81 Beta
The character is not meant to look nice; it is deliberately disproportioned to test the retargeting.
I created it very quickly with the Skin modifier.
There are a few driver tricks to keep the mesh from collapsing at the elbows.
The hands were post-animated by me, because there was no mocap for them in the file.
I tried to make the file's presentation a bit more attractive, with shading and lighting, to trap those who hate reading technical stuff.
Tentacle mutation is really a “superpower” that sucks!