EDIT: They don’t. They slow down Blender about as much as they ought to. It’s something else with what I’m doing.
I’m working on a model with a lot of constraints and a lot of bones, and it seems to cause serious slowdown that I can’t wrap my head around. I don’t have any loops. It’s particularly noticeable when undoing a bone transformation in Pose Mode.
I’ve implemented some of these constraints on a per-vertex basis in vertex shaders, so I know they’re cheap even per-vertex, much less per-bone. A good example is limit distance, which seems to create a lot of slowdown even though it ought to be the simplest computation in the world. I’m not familiar with IK algorithms, but my IKs aren’t responsible for the slowdown, at least not alone responsible; speed is fantastic with just the IKs. And given how these things ought to be evaluated (serially, each bone transformation feeding the next), I can’t see how constraints would interact in any way except additively (not multiplicatively, not exponentially).
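To put a number on how cheap this ought to be: a limit-distance clamp is just a vector subtraction, one square root, and a scale. A minimal sketch of the "Inside" clamp mode (function name and conventions are mine, not Blender's):

```python
import math

def limit_distance(pos, target, limit):
    """Clamp `pos` so it lies within `limit` of `target`
    (like the 'Inside' mode of a Limit Distance constraint)."""
    delta = [p - t for p, t in zip(pos, target)]
    dist = math.sqrt(sum(d * d for d in delta))
    if dist <= limit or dist == 0.0:
        return list(pos)  # already inside the sphere; nothing to do
    scale = limit / dist
    return [t + d * scale for t, d in zip(target, delta)]
```

A handful of multiplies and one sqrt per bone per evaluation, so on its own this could never account for a full core being pegged.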
What is Blender doing? Is it doing some permutation thing? Is it just trying to be too smart? Is there some way that I could just tell Blender what order to evaluate my bones so that it stops thinking so hard?
Try it with a simplified model to see whether the bottleneck really is the armature and not the actual mesh deformation. Then maybe try it with the new depsgraph? Who knows, it might bring improvement.
Well, it’s not the complexity of the mesh: the slowdown persists after deleting every mesh.
But thanks for making me test that, because maybe it’s not the armature either: when I copy it into a new file, it works fast, even when I copy the mesh into that new file as well.
But it’s not undo-buffer disk swapping either, because I’m not using that much RAM, just consistently maxing out a single CPU core (judging from the resource monitor), and because the problem persists through closing and reopening Blender.
The only weird thing left about this file is maybe all of the scraps in the image editor; I’ll have to try to clean that up. I don’t know why Blender keeps my images around long after the materials referencing them have been deleted… I wouldn’t mind suggestions on what I’m doing wrong there, if you have any.
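For what it's worth, Blender keeps datablocks until their user count drops to zero (and "fake users" keep them alive indefinitely). A hedged sketch of purging zero-user images from a file; the `find_orphans` helper is my own naming, and the `bpy` part only runs inside Blender:

```python
def find_orphans(datablocks):
    """Given (name, user_count) pairs, return the names with zero users."""
    return [name for name, users in datablocks if users == 0]

try:
    import bpy  # only available when run inside Blender
    orphans = [img for img in bpy.data.images if img.users == 0]
    for img in orphans:
        bpy.data.images.remove(img)  # drop the unreferenced image datablock
except ImportError:
    pass  # running outside Blender; only the helper above is usable
```

Saving and reopening the file also flushes zero-user datablocks, which may be why the scraps linger only within a session.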
Then maybe try it with the new depsgraph? Who knows, it might bring improvement.
I hadn’t heard of that. I just tried some quick googling and what I read didn’t make sense to me. Is it easily explainable?
Edit: but it takes no time at all for the fresh file to start slowing down in the same way…
That’s weird, maybe some memory freeing is failing? I can’t think of anything in Blender that would behave like that. File it as a bug?
In case you want to try anyway here’s the howto.
Thanks! I don’t think it’s related to what I’m doing so far, seeing as I don’t understand anything people are saying.
The problem with this file is apparently a hundred thousand drivers that I never consciously created. The lag disappears after clearing them, but it seems I have to do so every once in a while during editing. The constraints themselves are fine, speed-wise. I’ll learn more later.
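In case anyone hits the same thing, clearing drivers can be scripted rather than done by hand. A sketch, assuming the drivers live on objects' animation data; `summarize_paths` is my own helper for spotting which property they pile up on, and the `bpy` part only runs inside Blender:

```python
from collections import Counter

def summarize_paths(paths):
    """Count drivers per data path, to see where they accumulate."""
    return Counter(paths)

try:
    import bpy  # only available when run inside Blender
    removed_paths = []
    for obj in bpy.data.objects:
        anim = obj.animation_data
        if anim is None:
            continue
        # Copy to a list first: removing while iterating the collection is unsafe.
        for fcu in list(anim.drivers):
            removed_paths.append(fcu.data_path)
            anim.drivers.remove(fcu)
    print(summarize_paths(removed_paths))  # e.g. which property had 100k drivers
except ImportError:
    pass  # running outside Blender; only the helper above is usable
```

The printed summary might also hint at what keeps recreating them during editing.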