I was doing some performance tests on my PC and noticed that it can handle static models with over 20 million polygons very easily. I can rotate and zoom the models in the viewport with apparently no loss of performance.
However, when it comes to character animation, it can barely handle a model with just 40k polys.
Why does animating a character have such a huge impact on the PC's performance? What's the difference between moving the model on screen by rotating, panning and zooming it, and moving the character on screen by moving its arms, legs, etc.?
Blender's armature modifier is very, very slow. If you compare armature deformation in Blender to armature deformation in any game engine, you're going to find that Blender comes up short. By a mile. Why? I don't know. Poor optimization? Unlimited numbers of vertex groups? Something to do with hardware acceleration? This isn't a Cycles-vs-rasterizer situation: game engines aren't taking any shortcuts with armature deformation. Weighting to a single bone shouldn't be any slower to calculate than bone parenting, but in Blender, it is.
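To make the difference concrete: armature deformation means linear blend skinning, where every vertex gets its own weighted blend of bone matrices, recomputed on the CPU each frame. This is just a toy NumPy sketch of that math, not Blender's actual implementation:

```python
import numpy as np

def skin(vertices, bone_matrices, weights):
    """Linear blend skinning: each vertex is moved by its own
    per-vertex weighted blend of the bone transforms.

    vertices:      (N, 3) rest-pose positions
    bone_matrices: (B, 4, 4) current bone transforms
    weights:       (N, B) normalized per-vertex bone weights
    """
    # Homogeneous coordinates: (N, 4)
    v = np.hstack([vertices, np.ones((len(vertices), 1))])
    # Blend a full 4x4 matrix for EVERY vertex: (N, 4, 4)
    blended = np.einsum('nb,bij->nij', weights, bone_matrices)
    # Apply each vertex's own matrix: (N, 4)
    out = np.einsum('nij,nj->ni', blended, v)
    return out[:, :3]

# Two bones: identity, and a translation of +1 along X
bones = np.array([np.eye(4), np.eye(4)])
bones[1, 0, 3] = 1.0
verts = np.zeros((3, 3))
w = np.array([[1.0, 0.0],   # fully bone 0 -> stays at the origin
              [0.5, 0.5],   # split 50/50  -> moves 0.5 along X
              [0.0, 1.0]])  # fully bone 1 -> moves 1.0 along X
moved = skin(verts, bones, w)  # X coords: 0.0, 0.5, 1.0
```

The key cost is that the blend happens per vertex: with a 40k-poly character the CPU redoes this for every vertex on every frame, so the work scales with polygon count in a way that simple viewport navigation never does.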
But this is mainly a problem with Blender's armature modifier. Its other modifiers run much faster (in fact, it can be faster to armature-deform a low-poly mesh and then use that low-poly mesh to mesh-deform a higher-poly model, which seems bizarre to me.)
And if you don't actually deform an object, but just translate/rotate/scale it (for example, by bone parenting it instead of using an armature modifier), Blender is very, very fast. Its performance with "static" (really, non-deforming) objects is really good. Super optimized.
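That also answers the original question about rotating/panning/zooming: a rigid transform is one matrix for the whole object, so the per-frame CPU work doesn't grow with polygon count at all (the per-vertex multiply is done on the GPU in the vertex shader). A rough sketch of that case, for contrast with the skinning math above:

```python
import numpy as np

def rigid_transform(vertices, matrix):
    """One 4x4 matrix shared by every vertex of the object.
    Per frame, only `matrix` changes -- the per-vertex multiply
    is what the GPU does in the vertex shader."""
    v = np.hstack([vertices, np.ones((len(vertices), 1))])
    return (v @ matrix.T)[:, :3]

# Translate the whole object +2 along X with a single matrix
M = np.eye(4)
M[0, 3] = 2.0
out = rigid_transform(np.zeros((2, 3)), M)  # both verts end up at x = 2.0
```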
I know, right? It's so slow for apparently no reason. I still have to test how the performance will be in Unity when I import my rigged character from Blender. I don't know much about it yet, but from what I've read, Unity has an animation system called Mecanim that's compatible with Blender's Rigify. Thank you for the answer.