Motion blur and hair

Hi guys,

We recently completed production on a three-minute animated short using Blender and Cycles for our rendering pipeline. Overall it worked great, but we hit a huge snag with 3D blur and hair, which forced us to use 2D blur to finish the production.

Basically, our render times skyrocketed on shots with a lot of 3D blur and hair. I know the Gooseberry team has also been facing this issue, and I remember them talking about their plans to address it, but I haven’t been able to find any info since.

So I was wondering if anyone knows whether the Gooseberry team managed to solve this issue, and if they didn’t, what kind of workarounds they came up with?


Yeah, hair and deformation motion blur is really a problem we haven’t found a solution for (yet). That’s why, due to insanely long render times, Cosmos is currently being rendered completely without motion blur.

The “motion blur” implementation, as far as I can see, is very inefficient because it renders several frames each time; it renders the same frame repeatedly instead of reusing it.

Nevertheless, I think that “blur is like Brylcreem: ‘a little dab’ll do ya.’” :yes:

Simple vector blur, taking the vector (say) from the head, will produce a very believable result and can be done “in post.” All that you really need, to carry the effect of smooth motion, is blur of some kind.

Another trick that I have used is to superimpose a non-blurred track of an object, at 75% opacity, upon a blurred track at 25%. The numbers, of course, total up to 100% when the two channels are combined with an Add node. The outcome is a mixture of blur and sharpness, done “computationally on-the-cheap” in the compositing stage.
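That 75/25 trick boils down to a per-pixel weighted sum once the opacities are applied and the channels are added. A minimal sketch in plain Python (the lists stand in for one channel of the two renders; the names and numbers are illustrative, not Blender API):

```python
def add_mix(sharp, blurred, sharp_weight=0.75, blur_weight=0.25):
    """Combine a sharp and a blurred render of the same track.

    Mimics scaling each channel's opacity and combining them with an
    Add node: the weights sum to 1.0, so brightness is preserved.
    """
    assert abs(sharp_weight + blur_weight - 1.0) < 1e-9
    return [s * sharp_weight + b * blur_weight
            for s, b in zip(sharp, blurred)]

# A hard edge (sharp) softened by its blurred copy:
sharp   = [0.0, 0.0, 1.0, 1.0]
blurred = [0.0, 0.5, 0.5, 1.0]
print(add_mix(sharp, blurred))  # [0.0, 0.125, 0.875, 1.0]
```

The edge stays mostly crisp while picking up a hint of smear from the blurred channel, which is exactly the mixed look described above.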

And … if someone’s mind is set on “luscious, individually-bouncy hair,” use a little more of that Brylcreem. The hair doesn’t have to bounce; it doesn’t even have to consist of particle strands. Give your star a nice hairdo that doesn’t bounce all over the place. Maybe drop some particle-hair action on a few extreme-close-up shots early on, to set the viewer’s expectations, then use a simpler strategy in most of the shots that are taken at any distance. You might notice the difference, but if your production and your story are compelling, the audience won’t care.

I haven’t looked into it, but my conjecture is that Cycles extrudes the bounding volumes of the hair primitives through time, which can cause extreme overlap for dense geometry, thereby destroying the benefit of a bounding volume hierarchy.

I would suggest that you try doing motion blur by temporal supersampling (like in Blender Internal). That is, you would render 10x or more (sub)frames within the same time interval and then composite them into a single frame. That’s a lot of frames, but you should also be able to get away with far fewer samples per frame.
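The compositing step in that suggestion is just an average of the subframe renders. A minimal sketch in plain Python (`render` is a made-up stand-in for rendering one subframe; in Blender you would render real subframes and combine them in the compositor):

```python
def render(time):
    """Stand-in for one subframe render: a 1x8 'image' with a bright
    dot at the object's position (it moves 1 pixel per frame)."""
    img = [0.0] * 8
    img[int(time) % 8] = 1.0
    return img

def temporal_supersample(frame, subframes=10, shutter=1.0):
    """Render `subframes` evenly spaced times across the shutter
    interval and average them into one motion-blurred frame."""
    times = [frame + shutter * i / (subframes - 1) for i in range(subframes)]
    images = [render(t) for t in times]
    return [sum(img[p] for img in images) / subframes
            for p in range(len(images[0]))]

print(temporal_supersample(frame=2))
# the dot smears across the pixels it crossed during the shutter time
```

Each subframe only needs enough samples to be usable after averaging, since the accumulation itself also averages away some noise.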

What you describe, sundialsvc4, sounds like Blender Internal. Cycles doesn’t render one frame several times.

Vector blur got the job done, but I HATE the look of vector blur. I hope the dev team can find a way to improve deformation blur at some point. I imagine it will be a problem for heavy render-time displacement as well, when/if that capability makes it into Cycles.

Agreed on vector blur kinda sucking. Same with defocus blur: it only looks good enough if you don’t look too closely.

It’s a bit of a bummer that motion blur is getting the axe. But I guess you have to compromise somewhere…

Would it make sense to leverage the vector pass and implement optical flow to interpolate a reasonably real “inter-frame” solution, like the old “BI Real Motion Blur”? You could just turn on the interpolation engine for as many frames as you need.

Can’t motion blur be disabled per object? If so, would disabling motion blur on the hair emitter disable motion blur on the hair?

Just curious.

Isn’t Appleseed quite efficient when rendering with deformation MB? It might be worth a look; having to disable MB completely for a movie sounds quite “scary.”

I think that would work, but it would look horrible. Additionally, you don’t want to mix 2D blur and 3D blur in the same shot if you don’t have to. That would also look horrible. :slight_smile:

I’ve been really interested in Appleseed, but I would love to see a bit more documentation and tutorials on how to use it. It seems rather incomplete at first glance, but very promising nonetheless.

After Effects and ReelSmart Motion Blur… like vector blur on steroids… it can take motion vectors as input and/or use motion estimation.

I’ve never gotten that to work with a vector pass generated from Cycles. Does it work out of the box for you? If you need to do some kind of conversion, can you share your workflow?

An optical flow pipeline would be awesome.

Better than that would be good 3d motion blur. :wink:

How about using the compositor to mix in just a touch of the previous frame to add a little blur on the moving bits?

I know this is probably not optimal. If I understand correctly, motion blur calculates the positions of an object during the shutter time set for the current frame, and then blurs them together.

I have only used it sparingly myself, and when I did, I set the shutter time much smaller than the default 0.50. Otherwise the moving object becomes a bit too transparent for my taste.
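That transparency falls straight out of the averaging: a fast-moving object covers any one pixel for only a fraction of the shutter interval, so its contribution there is scaled down by that fraction. A toy estimate (the function and numbers are illustrative, not Blender’s actual model):

```python
def pixel_coverage(speed_px_per_frame, shutter):
    """Fraction of the shutter interval a moving object spends on any
    one pixel -- roughly its apparent opacity in the blurred frame."""
    pixels_crossed = max(speed_px_per_frame * shutter, 1.0)
    return 1.0 / pixels_crossed

# Fast object, default-ish shutter: quite see-through.
print(pixel_coverage(speed_px_per_frame=8, shutter=0.5))  # 0.25
# Same object, much shorter shutter: fully opaque again.
print(pixel_coverage(speed_px_per_frame=8, shutter=0.1))  # 1.0
```

Shrinking the shutter time shortens the smear and raises the per-pixel coverage, which matches the observation that a smaller shutter keeps moving objects from looking washed out.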

Good 3D motion blur is definitely a plus, but optical flow could integrate into the clip tracker, allowing motion blur on video. It’s certainly not a replacement for solid 3D motion blur, though.

Appleseed has a Blender exporter that works for the most part. The renderer itself is still feature-incomplete, but it is solid and has a very nice motion blur implementation. With any kind of dense geometry it absolutely slaughters Cycles.