Blender for motion graphics (After Effects)

I’ve used After Effects a lot in the past, for animating still images and for masking and post-processing work. It’s a nice program, but pricey. I think Blender, with the recent progress on compositing etc., could fill almost the same role. I’d much prefer Blender, and besides, I probably won’t have access to After Effects much longer.

The thing is, Blender is a big and complex 3D program, and the functions I want in these cases are primarily for 2D animation. Getting at the appropriate settings often involves a lot of clicking around, changing button pages, scrolling and so on.

So I’ve started writing a small script for a tedious task that I’ve done many a time. The script imports images as normalized planes (fit in window) and sets the UVs, texface and some material settings. Point it at an image file, and you get a correctly scaled version, with preview, in your viewport. It works, for what I use it for, but some options aren’t scripted yet, and I’ve begun thinking that maybe I should expand its scope.
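
To give an idea, a stripped-down sketch of that import step could look something like this (Blender 2.4x API; the scaling convention, the names and the file path are just examples, and the real thing needs more error checking):

import Blender
from Blender import Scene, Image, Mesh, Material, Texture, Object, Mathutils

def import_image_plane(filepath):
    img = Image.Load(filepath)
    w, h = img.getSize()
    # normalize so the longest side is one Blender unit
    s = 1.0 / max(w, h)
    hw, hh = w * s * 0.5, h * s * 0.5

    me = Mesh.New('ImgPlane')
    me.verts.extend([(-hw, -hh, 0), (hw, -hh, 0), (hw, hh, 0), (-hw, hh, 0)])
    me.faces.extend([[0, 1, 2, 3]])

    # texface: assign the image and UVs directly to the face for viewport preview
    me.faceUV = True
    f = me.faces[0]
    f.image = img
    f.uv = [Mathutils.Vector(0, 0), Mathutils.Vector(1, 0),
            Mathutils.Vector(1, 1), Mathutils.Vector(0, 1)]

    # basic UV-mapped material so it also shows up in renders
    mat = Material.New('ImgMat')
    tex = Texture.New('ImgTex')
    tex.setType('Image')
    tex.image = img
    mat.setTexture(0, tex, Texture.TexCo.UV, Texture.MapTo.COL)
    me.materials = [mat]

    scn = Scene.getCurrent()
    ob = Object.New('Mesh', 'ImgPlane')
    ob.link(me)
    scn.link(ob)
    Blender.Redraw()
    return ob

import_image_plane('/tmp/scan.png')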

See, I have a friend who’s not a Blender user but is missing an app for doing cut-out animation. Precisely what I want as well. I’m mostly doing just a few layers now, due to the hassle, but making simple cut-out figures could be fun.

I’m thinking about doing a swiss-army knife script of sorts, that would have shortcuts for the most widely used functions. Import, select layers / group, set passe-partout alpha (which I’m doing a lot), render preview. Probably some more, like showing material IPOs easily.
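
To be concrete, the shortcut functions I picture would be small things like these (Blender 2.4x API; the “Layer.” naming convention is made up, and I’m assuming the camera data exposes the passepartout alpha as an attribute called alpha - I’d have to double-check the exact name):

import Blender
from Blender import Scene

def set_passepartout_alpha(value=1.0):
    # darken everything outside the camera frame in the viewport
    scn = Scene.getCurrent()
    cam = scn.getCurrentCamera()
    if cam:
        cam.getData().alpha = value   # assumed attribute name for the passepartout alpha
        Blender.Redraw()

def select_layer(prefix='Layer.'):
    # select all image-plane "layers" whose object name starts with the prefix
    scn = Scene.getCurrent()
    for ob in scn.getChildren():
        ob.sel = ob.getName().startswith(prefix)
    Blender.Redraw()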

My question is this: What would you think of this? An alternate interface add-on that goes against the grain of the Blender thought process might be controversial. You could accuse it of being a crutch and discouraging users from learning things the proper way etc. On the other hand, it might be a gateway for people that are intimidated by the full Blender. What do you think?

What I’d do, if I had all the time in the world to work on it, would consist of a .blend file with a default layout. This layout would consist of a timeline, an ortho camera view, an IPO or NLA window, and two script windows. One for utility functions like the import, render preview etc. The other for “layers”, stacked on top of each other like in AE. These are really just shorthand for selecting the various images you import, without leaving the camera view to see what’s behind the top layer. Also for ordering them (the script would sort them, Z-wise). Maybe even groups of “layers”, so you could do cut-outs with parenting and pivots etc, and have them appear as one “layer”.
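
The Z-sorting part, for instance, could be as simple as something like this (assuming the ortho camera looks down the Z axis; the spacing and the example names are arbitrary):

import Blender
from Blender import Scene

def sort_layers(layer_names, spacing=0.01):
    # stack the named "layer" objects along Z in list order
    scn = Scene.getCurrent()
    for i, name in enumerate(layer_names):
        ob = Blender.Object.Get(name)
        ob.LocZ = i * spacing   # first name in the list ends up at the bottom
    Blender.Redraw()

# e.g. bottom-to-top stacking order:
sort_layers(['Background', 'Figure', 'Title'])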

One problem with this is that the IPO window is quite complicated. In After Effects, you can see all the keyframes in a layer in the same view, including opacity, masks and effects, which makes timing things really easy. I’ve not worked much with the NLA editor, but the keyframes do show up there, just not separated into rotation, location, etc. Is there a way to see all animated settings simultaneously, with grab-able keyframes?

Blender seems to be able to do most of what AE does, but it’s spread out between the sequence editor, composite nodes, and the animation system. That’s why I’ll spare you my thoughts on masking, effects, etc for now and just concentrate on animating static images. But I still think that a unified - and more user-friendly - approach to this would be great.

I’ll stop before this post gets too long (it already is). Any suggestions are appreciated.

I’ll add that I put this in here because it’s script-related, but in hindsight maybe it fits better in another subforum. Hmm…

I was trying to do this just the other day - I think the script would be a great idea.

You could also take a swing at the tutorials available for the NLA… not too much there.

Wasn’t there a fork of Blender once upon a time that was aimed at 2d graphics? I’ve tried to find it, but googling “Blender fork” comes up mostly in the culinary department…

RS

As a huge fan of both Blender and After Effects, I have to admit that yes, much of what can be done in AE can be done in Blender…and there are still a lot of things that can’t (or not nearly as easily). I have always wished Blender worked a bit more like AE in the keyframing department, and I think with time it probably will. What I wouldn’t give to be able to animate a rotating object with AE’s old revolutions-plus-degrees rotation value (the “2x+45°” style), allowing for a number of whole turns without having to manually tweak the IPO.

Also, access to keyframeable features in the NLA is something Blender is missing. It would be nice to see basic transform controls listed in a nested menu under each layer. I think these things would not only boost workflow, but help in bringing over AE people to the Blender world.

Maybe an option for you would be to create a split-off version, called “AfterBlender”, patterned to be more friendly to the Motion Graphics newcomer. Then people have the choice. They can ignore it, or try it out if they like. If I knew more programming, I’d offer to help. I think you have an excellent idea.

That’s definitely how I see it. There’s no reason I can see why the OpenGL output couldn’t be passed in real time to the compositor, and once the GLSL stuff is in place it’ll look even better!

I thought about writing a suggestion about this same thing, but a little more expanded. Round-tripping is a must for Blender to work as a mograph tool. By this I mean:

  1. Bidirectional link between viewport render and nodes (and not re-rendering manually)
     • Viewport output to node
     • Node output to texture or overlay image plane in screen space
  2. Other render modes (OGL, GLSL etc.), with appropriate shader presets

As you can see, the thing to do is a viewport overhaul.

Hi - about a year ago I struggled with the same problem. I started to write a set of scripts that could transfer camera motion between AfterFX and Blender. I gave up because I’m no longer using Blender (I’ve been on Maya for some time).

Here is the code. I don’t guarantee that it works - it’s ugly and inefficient, but sometimes it does the job. Maybe it’ll help in some way.

http://www.oceaneffects.com/afterblender/ae_exporter.jsx
http://www.oceaneffects.com/afterblender/ae_importer.jsx

http://www.oceaneffects.com/afterblender/bl_exporter.py
http://www.oceaneffects.com/afterblender/bl_importer.py

This would be really cool as something to add to the 2D sequencing functionality in Blender. It’s long been all but ignored in terms of functionality, but really is a very powerful tool that I would use in preference to any other video sequencer if I could only get the danged ffmpeg working for audio multiplexing!

hi kroopson, these scripts look interesting. I’ve used the Blender export py script to make a .aeb file - how do I then import that into After Effects? Do you have any instructions anywhere?

muchas gracias,

krupa (nice name by the way)

nice reel too!

First of all, those scripts are designed for After Effects 7.0 Pro, because only that version has JavaScript support. You have to enable JavaScript in the preferences. Second, they only work properly when you have something I’ve called an “after effects camera”.
It’s created with this script:

import Blender
from Blender import *
import math
from math import pi

scene = Scene.getCurrent()

# Two empties: one drives the camera's translation, the other is the
# point of interest that the rig tracks.
empty1 = Object.New('Empty', 'Empty1')
empty2 = Object.New('Empty', 'Target')
empty2.setLocation(3, 0, 0)

scene.link(empty1)
scene.link(empty2)

# The camera itself
camobj = Object.New('Camera', 'AECam')
camdata = Camera.New('persp')
camobj.link(camdata)
scene.link(camobj)
camobj.protectFlags = int('111000111', 2)   # lock some of the camera object's own transform channels
camobj.setEuler((90 * pi / 180), 0, 0)      # rotate 90 degrees around X so the camera looks along the Y axis

# Parent the camera to the translation empty...
empty1.makeParent([camobj])

# ...and make that empty track the point-of-interest empty
empty1.constraints.append(Constraint.Type.TRACKTO)
trackto = empty1.constraints[0]
trackto[Constraint.Settings.TARGET] = empty2

scene.update(1)
Redraw()

This camera has an Empty as its translation controller and a point of interest, and it can rotate independently of the point of interest. The problem I faced when writing those scripts was Euler rotations. I didn’t know how to change the rotation order (AfterFX uses a different rotation order than Blender, so rotations got screwed up a little :) )
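
Just to show why the order matters, here is a little illustration in plain Python (not the Blender API - the axis conventions here are only illustrative): the same three angles applied in two different orders give two different orientations.

from math import radians, sin, cos

def rot_x(a):
    return [[1, 0, 0],
            [0, cos(a), -sin(a)],
            [0, sin(a),  cos(a)]]

def rot_y(a):
    return [[ cos(a), 0, sin(a)],
            [ 0,      1, 0],
            [-sin(a), 0, cos(a)]]

def rot_z(a):
    return [[cos(a), -sin(a), 0],
            [sin(a),  cos(a), 0],
            [0,       0,      1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

x, y, z = radians(30), radians(45), radians(60)

# Same three angles, composed in two different orders:
x_first = matmul(rot_z(z), matmul(rot_y(y), rot_x(x)))   # X applied first
z_first = matmul(rot_x(x), matmul(rot_y(y), rot_z(z)))   # Z applied first

print x_first
print z_first   # different matrix, so the camera ends up oriented differently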

Blender has a serious flaw in its Video Sequencer at this time.

http://blenderartists.org/forum/showthread.php?t=108949

You cannot resize the length of the composition and then the footage without Blender crashing.

Until Blender has a way to draw masks, I will always have to go back to After Effects for that. I love the After Effects masking tools, not to mention tracking, cornerpin, text layout and all the keyboard shortcuts that I have grown so used to.

Heh, I don’t really use or understand the NLA myself, as I’ve not done much armature animation in Blender. I did play with it yesterday, though, and I actually found a way to get at the Loc/Rot/Scale IPOs as AE-like keyframes.

Select an object with IPOs, then go to the IPO editor and press the small action-button to the right of the curve menu. Then go to the Action editor - expand the Object and IPO Curves points. There they are, per channel - you can even get sliders there.

Whoah, creating a fork is way over my head. What I’m talking about is more of a hack or additional tool that makes certain things easier.

Blender lacks a lot of things, like the tight feedback loop, transparency in viewports, stable sequence editor. But I think it’s possible to make a utility script that will automate some of the stuff. As it is now, I think there are some big stumbling blocks, mainly having to do with how the viewport/renderer works - no non-alpha masked transparency in the viewport, no blending modes etc. That means that you’ll have to split animation from compositing, which is a huge drawback compared to AE. But with OGL viewports, combining them should be possible. (I see now that it’s possible to write your own OpenGL viewport in python! A bit difficult, though.)
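
For anyone curious, the starting point for that seems to be the Draw and BGL modules. A minimal sketch that registers a script-window GUI and draws something with OpenGL calls - nowhere near a real viewport, obviously - looks like this:

import Blender
from Blender import Draw, BGL

def gui():
    # clear the script window and draw an orange rectangle
    BGL.glClearColor(0.2, 0.2, 0.2, 1.0)
    BGL.glClear(BGL.GL_COLOR_BUFFER_BIT)
    BGL.glColor3f(1.0, 0.5, 0.0)
    BGL.glBegin(BGL.GL_QUADS)
    BGL.glVertex2i(10, 10)
    BGL.glVertex2i(110, 10)
    BGL.glVertex2i(110, 110)
    BGL.glVertex2i(10, 110)
    BGL.glEnd()

def event(evt, val):
    if evt == Draw.ESCKEY:
        Draw.Exit()

def button(evt):
    pass

Draw.Register(gui, event, button)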

Thanks, everyone, for the thoughts. I’ll be working a bit more on just the import script, and post it when it’s ready.

Just as an example, I imported a single scanned page with the half-done script and animated it. It’s something I’ve done in After Effects often enough, to make VJ footage out of a still image. Just using the standard animation tools in Blender, animating ten seconds took less than ten minutes, and suited me more than doing it in AE.
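
That keyframing step could eventually be scripted too. A rough sketch (Blender 2.4x API; the object name, frame range and motion are placeholders):

import Blender
from Blender import Object

def slide_plane(ob_name, start=1, end=250, distance=2.0):
    # key the plane's location at the start and end frames so it drifts sideways
    ob = Object.Get(ob_name)
    for frame, x in ((start, 0.0), (end, distance)):
        Blender.Set('curframe', frame)
        ob.LocX = x
        ob.insertIpoKey(Object.IpoKeyTypes.LOC)   # insert a Loc key at the current frame
    Blender.Redraw()

slide_plane('ImgPlane')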

Oh, and thanks for the scripts, Kroopson. I don’t have AE 7, but maybe I’ll take a look when it’s time to work on the script again.

And about animated masks, that’s a drawback in Blender. It could be done with the compositor and render layers, and I just found out that you can actually make shape keys for bezier curves. Maybe a second tool is in order, to make bezier shapes behave more like in AE, where you can set keys arbitrarily and not have to worry about shape IPOs, and also to set up the compositing nodes.

Whoops! I meant NLE, not NLA.

RS

Wasn’t there a fork of Blender once upon a time that was aimed at 2d graphics? I’ve tried to find it, but googling “Blender fork” comes up mostly in the culinary department…

Perhaps you’re thinking of:

http://positron.sourceforge.net/


I think Zanqdo wrote a script to export to Maya ASCII, which AE can import. I modified the script for Shake, tied it into my 3Delight script, and it actually works with both AE and Shake. I think it’s supported as far back as AE 5.5, possibly earlier. AE slows down a lot more than Shake does, though, after importing all the keys.

Export is slowish if you have a lot of rigged meshes too because the set() function updates all objects in the scene not just the camera.

I used it for a motion graphics style project recently and it worked out pretty well. I had some 3D elements rendered out with various passes in 3Delight on alpha and then I imported the tracking data into Shake and attached various effects and images in the 3D space in the compositor and they matched up pretty much exactly (once I got the right filmgate ratio).

I think doing it all in Blender would be great though as the compositors’ 3D environments are waaaaay slower than Blender’s and not particularly intuitive for moving stuff around or animating. If Blender’s nodes could be customized through some sort of API like Python, adding functionality similar to AE plugins shouldn’t be too difficult. Someone ported most of the AE plugin functionality to Shake through macros and they behave almost exactly the same way.

I have been using the other scripts, but I am getting odd discrepancies with comping later, though I would love to ditch AE as you suggest - Blender could have full feedback of the 3D environment through the compositor, with video overlays and stuff… if only I could get sound out on my mac…

rocketship, I’m not that good at the NLE either.

Say, anyone got an ETA on the next Blender version?

I’m asking because a 250 frame render of a single (large) image plane just took me an hour. This is an old computer, granted, but it plays the animation fine in the viewport, so OGL rendering looks pretty attractive.