Changing an object’s instance based on its parent’s rotation (for 2D animation)

solved
(Ranstone) #1

So, I normally use Flash CS6 for 2D animation. Compared to Blender it’s quite superior for 2D work, but it also takes a HECKIN long time. I’m building some similar 2D puppets in Blender for speed, as well as a variety of other advantages.

Now, traditionally there are a variety of ways to keyframe 2D animation in Blender, but I’m interested in finding a way to make the process as autonomous as possible. I want a puppet composed of billboard planes with images mapped to them. These planes would be grouped into “Symbols”, similar to Flash CS6, and based on a symbol’s rotation, the individual objects within it would change their image.

Here’s a concept I made.

However, I also need to be able to manually override the rotation on specific frames and for specific objects. The head would be a separate symbol from the body, etc.

I know this is not what Blender was created for, but I think it could be a great asset and have several advantages over Flash, especially for background assets.

Additional notes:
1. No, I don’t animate ponies. It’s just an example.
2. I can’t use drivers, as I need multiple symbols/instances to rotate independently.

I’m looking not only for help on the best way to do this, but also for your input on whether it would actually be practical. Flash CS6 is powerful, and although it takes months to make a 5-second animation in it, it will be hard for Blender to compete in quite a few ways.

Tips? Hints? Tutorials? Thoughts?
Your time is appreciated!

(Jason van Gumster) #2

So what’s your input here? A sequence of image states (basically sprites)? Or do you intend to draw in Blender using Grease Pencil? Each approach would require a slightly different technique.

(Ranstone) #3

Grease Pencil for animation? That’s a new one to me, but no, I’m not interested at this point.
My interest is in changing individual cels or “sprites”.
I’m currently looking into shape keys, but I think that’s actually slower than Flash.

(Jason van Gumster) #4

That being the case, there are two possible ways I can think of to handle this, depending on the type of image input you’re using.

  • A single image sprite sheet: If you set up your texture correctly, with every sprite on a single image file, you can drive the offset of the texture so only the sprite you want shows up on the plane (see the driver sketch just after this list). This is very easy to set up, but it requires you to build a full sprite sheet in advance.
  • A sequence of image sprites: Set up as an image sequence texture, you can drive the frame offset for the image sequence to get the right sprite to show up. The image sequence is often easier to generate than a sprite sheet, but sometimes driving that frame offset value can get a little tricky.
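
For the sprite-sheet route, here’s a rough Python sketch of the kind of driver setup I mean. Every name in it (the material, the Mapping node, the symbol empty) is just a placeholder, and it assumes a Blender version where the Mapping node exposes Location as an input socket (older versions keep it as node properties instead):

```python
import bpy

# Placeholders: swap in your own material, node, and symbol object names.
mat = bpy.data.materials["SpriteSheetMat"]
mapping = mat.node_tree.nodes["Mapping"]

# Add a driver on the X component of the Mapping node's Location socket.
fcurve = mapping.inputs["Location"].driver_add("default_value", 0)
driver = fcurve.driver
driver.type = 'SCRIPTED'

# Read the symbol empty's world-space Z rotation into the variable "rotz".
var = driver.variables.new()
var.name = "rotz"
var.type = 'TRANSFORMS'
var.targets[0].id = bpy.data.objects["Symbol_Body"]
var.targets[0].transform_type = 'ROT_Z'
var.targets[0].transform_space = 'WORLD_SPACE'

# Example sheet: 8 sprites in a row, one per 45 degrees, each 1/8 wide.
driver.expression = "floor((degrees(rotz) % 360) / 45) / 8"
```

Each symbol would need its own material copy (or at least its own driver target), so every instance reads the rotation of its own empty rather than sharing one.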
(Ranstone) #5

Definitely more efficient than the shape keys I was trying, but where I stand now, it would still require frame-by-frame manipulation by the user.

I’m pretty bad with Python, but I suppose I could use the parent object’s rotation on the Z axis to define the UV offset, e.g. “if the rotation angle is more than 0 but less than 45, use sprite #1”, etc.
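
Something like this is the bucketing I have in mind, just in plain Python (assuming 8 views, one every 45 degrees):

```python
import math

SPRITES_PER_TURN = 8  # assumption: 8 views, one every 45 degrees

def sprite_index(z_rotation):
    """Map a Z rotation in radians to a sprite index: 0-45 deg -> 0, 45-90 deg -> 1, and so on."""
    angle = math.degrees(z_rotation) % 360.0
    return int(angle // (360.0 / SPRITES_PER_TURN))
```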

If I did this, how would I manually override the automated UV offsets to fine-tune the animation, e.g. turning the head, moving limbs, lip sync, etc.?

Thank you so far BTW.

(Jason van Gumster) #6

Rotation-wise, one thing that might be interesting to try would be to see if you could drive the texture choice based on normals relative to the camera, either in the shader or in the compositor. I’ve done a simple version of this for single planes using the Is Backfacing output from the Light Path node. It might be an interesting experiment to try using normals to get the same result. I might play with this today if I have a chance.

The sure-fire way to do this, though, is with drivers. And because you can have alpha transparency, you can overlay textures on top of your base for stuff like lip sync.
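
For the overlay part, in node terms it’s just a MixRGB whose factor is the overlay image’s own alpha. A quick sketch, with placeholder names:

```python
import bpy

mat = bpy.data.materials["PuppetMat"]    # placeholder material name
nodes, links = mat.node_tree.nodes, mat.node_tree.links

base = nodes.new("ShaderNodeTexImage")   # the base body/head sprite
mouth = nodes.new("ShaderNodeTexImage")  # lip-sync sprite with alpha

# Where the mouth image is opaque, show it; everywhere else, show the base.
overlay = nodes.new("ShaderNodeMixRGB")
links.new(mouth.outputs["Alpha"], overlay.inputs["Fac"])
links.new(base.outputs["Color"], overlay.inputs["Color1"])
links.new(mouth.outputs["Color"], overlay.inputs["Color2"])
# overlay.outputs["Color"] then feeds whatever shader the plane uses.
```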

(Jason van Gumster) #7

And it turns out that it is totally doable in the material, if that’s your thing. Have a look at the attached file. It’s a quick and dirty example, but it should get the idea across. Using just the X channel from the Geometry node’s Normal output, you can control which texture gets shown based on how far the plane has been rotated. The node network looks like this:

So your result looks something like this:

Here’s the attached .blend for you to review: angle-texture.blend (686.6 KB)
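
If you’d rather build it from a script than dig through the file, here’s a rough Python sketch of the same idea: two image textures chosen by the X component of the normal. It isn’t a line-for-line copy of the .blend (the Greater Than switch and the Emission shader are just one way to wire it up), but it shows the moving parts:

```python
import bpy

# Build a simple Cycles material that picks between two images
# based on the X component of the surface normal.
mat = bpy.data.materials.new("AngleTexture")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

geom = nodes.new("ShaderNodeNewGeometry")   # Geometry node
sep = nodes.new("ShaderNodeSeparateXYZ")    # pull out Normal X
links.new(geom.outputs["Normal"], sep.inputs["Vector"])

# Normal X > 0 gives a factor of 1, otherwise 0 (a crude left/right split).
gate = nodes.new("ShaderNodeMath")
gate.operation = 'GREATER_THAN'
gate.inputs[1].default_value = 0.0
links.new(sep.outputs["X"], gate.inputs[0])

# Placeholder image nodes: load one sprite into each.
tex_a = nodes.new("ShaderNodeTexImage")
tex_b = nodes.new("ShaderNodeTexImage")

mix = nodes.new("ShaderNodeMixRGB")
links.new(gate.outputs["Value"], mix.inputs["Fac"])
links.new(tex_a.outputs["Color"], mix.inputs["Color1"])
links.new(tex_b.outputs["Color"], mix.inputs["Color2"])

shader = nodes.new("ShaderNodeEmission")    # flat, unlit 2D look
links.new(mix.outputs["Color"], shader.inputs["Color"])

out = nodes.new("ShaderNodeOutputMaterial")
links.new(shader.outputs["Emission"], out.inputs["Surface"])
```

Run it from the Text Editor, assign the new material to your plane, and load a sprite into each of the two Image Texture nodes.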

(Ranstone) #8

… Oh my gosh… XD
This is why I love Blender. The community’s creativity is endless.
That’s brilliant, honestly. I was worried about drivers as I need multiple versions of the instance, but this can work.

I especially want to take the time to thank you for going out of your way to make the .blend. I’ve been around since before Cycles, but I’m still not great at Blender, and this is much appreciated. Thanks a heap!

(Jason van Gumster) #9

I’m glad I could help. :) If you have any trouble with the file or need help deciphering it (I admittedly rushed it together), please don’t hesitate to ask.

(Jason van Gumster) #10

I was a little dissatisfied with the example I put up for you yesterday, with the funky split from left side to right side. Here’s a new version that remaps the X value into the 0-to-1 range. It makes a bit more sense this way. Here’s the new node network:

And the new .blend file: angle-texture.blend (673.6 KB)
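
Script-wise, the only difference from the rough sketch earlier in the thread is remapping the normal’s X from the -1..1 range into 0..1 before it reaches the switch, which then flips at 0.5 instead of 0. Something along these lines (it assumes the material from that sketch and looks the nodes up by type):

```python
import bpy

# Assumes the "AngleTexture" material from the earlier sketch.
tree = bpy.data.materials["AngleTexture"].node_tree
nodes, links = tree.nodes, tree.links
sep = next(n for n in nodes if n.type == 'SEPXYZ')  # Normal split into X/Y/Z
gate = next(n for n in nodes if n.type == 'MATH')   # the Greater Than node

# Remap X from [-1, 1] to [0, 1]: (X + 1) * 0.5
add_one = nodes.new("ShaderNodeMath")
add_one.operation = 'ADD'
add_one.inputs[1].default_value = 1.0
links.new(sep.outputs["X"], add_one.inputs[0])

halve = nodes.new("ShaderNodeMath")
halve.operation = 'MULTIPLY'
halve.inputs[1].default_value = 0.5
links.new(add_one.outputs["Value"], halve.inputs[0])

# Feed the remapped value into the switch and move its threshold to 0.5.
links.new(halve.outputs["Value"], gate.inputs[0])
gate.inputs[1].default_value = 0.5
```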

(Ranstone) #11

This one is even better. Thanks again!
