A way to make armature animation more efficient?

I’m wondering if there’s a way to do things more efficiently with animated characters? Like, if they’re moving to a door, is there a way to make the character detect the door, and trigger the stop and open door NLA tracks without me having to do it manually?

Basically, I’m looking for a way for the character to interact with the environment like he could in the old game engine, through detection of objects, and triggering animations?

Thank you. :slight_smile:

Hello !

By default this isn’t possible, but I would be surprised if something like that wasn’t possible with some Python…
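
If you do go the Python route, the trigger logic itself is simple. Here's a minimal sketch of the idea using plain Python stand-ins for Blender objects (the function name `choose_track`, the radius value, and the track names are all made up for illustration); inside Blender you would read positions from `bpy.data.objects[...].location` and play the matching NLA strip instead of returning a string:

```python
import math

def choose_track(character_pos, door_pos, trigger_radius=1.5):
    """Pick which NLA track to play based on proximity to the door.

    character_pos / door_pos are (x, y, z) tuples; in Blender these
    would come from object .location vectors.
    """
    if math.dist(character_pos, door_pos) <= trigger_radius:
        # Close enough: stop walking and play the door-opening track.
        return "open_door"
    # Otherwise keep the walk cycle going.
    return "walk"

print(choose_track((0.0, 0.0, 0.0), (5.0, 0.0, 0.0)))  # far away
print(choose_track((4.0, 0.0, 0.0), (5.0, 0.0, 0.0)))  # within radius
```

You would run a check like this every frame (for example from a frame-change handler) and switch tracks when the result changes.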

I’ve seen a few parametric animation add-ons that automatically build character walks or stair climbing…
Maybe the only flaw with what you describe is that there might be a lot of “concepts” to integrate. Walking to a door and opening it might be OK, but say you now want to do the same with a microwave, to put something into it… while that’s very close, it will probably require another set of pre-defined animations.

And the more acting you want to put into the animation, the more difficult it gets: “the character does a sleepy morning walk to the door” is going to be a different walk from “the character leaves the house in a hurry because they’re late”…

You’re applying a more “video game” logic to animation, but that approach is limited, since movie/pre-rendered animation is generally more refined and done manually for each shot…

Just my 2 cents of food for thought!


Thank you for the explanation. :slight_smile: When I used to work with the game engine, the logic brick system helped me create some pretty intricate “automated” character animations, and interactions between characters, pretty fast. Doing it manually seems to me like an inefficient character animation method when automation could make it go a little faster.

I love using Blender, I just wish there was a way to use something like nodes, or game logic, to drive character animation and interaction rather than having to do it myself. :slight_smile: Perhaps the newer versions of UPBGE will have rendering capabilities that will work with logic? I’ll have to look into that.

To clarify, I’m only referring to basic interactions, like door detection, stair detection, etc. To me, the biggest issue with character animation is having to do all the basic interactions myself. I have a feeling Python is going to have to be utilized if there’s no way to do it in the node system. I was just looking for a simpler method first.

Thank you again for the input, I really do appreciate it. :slight_smile:


UPBGE has the Eevee renderer, if I’m not wrong. To have this in normal Blender, you’d need to write a Python add-on (a normal text script won’t do it, since you need access to modal functions, I think, in order to do things on scene update). I considered something similar for a geonode-based LOD switching system (it had to track the viewport camera position), but I dropped it fast, because Python is too hard for me, and the add-on would need to be started manually every time.
The simulation capabilities of the new Geometry Nodes can probably do something like that. They can definitely “do X while another object is in proximity”. The only downside is that geonodes don’t work with armatures or object-level transforms. You have to either place your door instances on the vertices of some proxy mesh, or pass the transform of an empty into geonodes using the Object Info node.
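
On the “doing things on scene update” point: besides modal operators, Blender’s Python API also has application handlers (e.g. `bpy.app.handlers.frame_change_post` and `depsgraph_update_post`) that a registered add-on can use to run a callback on updates. The selection logic for something like LOD switching is just a distance check; here’s a minimal sketch using plain Python stand-ins (the function name `pick_lod` and the threshold values are made up for illustration; in Blender the positions would come from the camera and object `.location`):

```python
import math

def pick_lod(camera_pos, object_pos, thresholds=(5.0, 15.0)):
    """Return an LOD index based on camera distance.

    0 = full detail, higher index = coarser mesh. `thresholds` are
    the distance cutoffs between LOD levels, in ascending order.
    """
    d = math.dist(camera_pos, object_pos)
    for lod, cutoff in enumerate(thresholds):
        if d <= cutoff:
            return lod
    # Beyond the last cutoff: use the coarsest level.
    return len(thresholds)

print(pick_lod((0, 0, 0), (3, 0, 0)))   # close: full detail
print(pick_lod((0, 0, 0), (10, 0, 0)))  # mid-range
print(pick_lod((0, 0, 0), (30, 0, 0)))  # far: coarsest
```

A handler would call this on each update and swap the visible mesh only when the returned index changes.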


I’ll definitely look into UPBGE then and give it a shot. If I can utilize Eevee to render out a logic-based scene that would be wonderful. :slight_smile: