hair subframe motion blur

Hi all.
I can see that Blender’s cached hair blurs when motion blur is enabled, but it doesn’t seem to interpolate on a sub-frame level, i.e. its motion remains static through the motion blur. Is there a way to rectify this?

Glenn

use a higher frame rate

I don’t understand?

Glenn

Seems like a bug to me. You mean real MBlur, not the vector blur from the compositor?

Yep. Bake a simulated sequence and park on a frame where the hair really flicks about. Then, during motion blurring (and you’re right, I don’t mean vector blurring), the whole head of hair moves and blurs, but the strands don’t seem to be sub-frame sampled. I guess for fur it’s okay, but for long hair like ponytails it’s a problem.

If anyone else can try it out, I’d appreciate it.

Glenn

The position of anything is calculated on a per-frame basis when it is rendered. So it’s at some position at frame 1, and that is pictured in an image. Blender then sets the time to frame 2 and calculates position X. Then for frame 3 it uses the physics and IPO curves to determine position Y. For motion blur, it then uses an algorithm to figure out where the thing (hair, whatever) would have passed through while the shutter was open for that frame, and draws the appropriate pixels to make the blurred image.

There is no sub-frame sampling for motion blur, afaik. Blender assumes that between the start of frame 2 and the start of frame 3 the thing has moved in a continuous path through space and time to get from point X to point Y.

So, if you want Blender to calculate and show where something is at frame 2.5 at 30 fps (which is about 0.083 seconds in), you have to bump the frame rate up to 60, and then frame 5 will be its position about 0.083 seconds into the animation.
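That frame/time bookkeeping is easy to sketch in plain Python (nothing Blender-specific; the numbers are just the ones from the example above, assuming time = frame / fps):

```python
# Plain-Python sketch of the frame/time arithmetic described above.

def frame_to_seconds(frame, fps):
    return frame / fps

def remap_frame(frame, old_fps, new_fps):
    # The frame at new_fps that falls on the same moment in time.
    return frame * new_fps / old_fps

print(frame_to_seconds(2.5, 30))   # ~0.083 s
print(remap_frame(2.5, 30, 60))    # 5.0 -> render frame 5 of the 60 fps version
```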

…and still the hair will not blur correctly. I see your point, but just as Blender knows the location of a sphere at frames 1 and 2 and knows how to interpolate the arc for the blur, it should know the location of each knot or spline point of the hair and where their trajectories go. I wonder if it is in the caching of the dynamics? It seems a shortcoming to have to over-render a sequence and over-crank the motion blur (to compensate for the amount of blur once the shot has been varisped back to real time). Who in a production environment wants to render double (or in reality more like 6x) the number of frames to get a representative number of samples for a halfway decent blur?

Seems like a bug or an oversight if it’s true.

Glenn

There was a bug with tiny fluid pieces not being blurred, but I think that was fixed. I just ran a test and my hair blurs, so you will have to be much more specific about what you see as a bug. My test that works: I make a cube emit hair, key the cube’s location, and render with MBlur on. The hair strands are blurred.

As I mentioned earlier, “the whole head of hair moves and blurs but the strands don’t seem to be subframe sampling”, i.e. the hair blurs following the vector of the emitter, but the strands do not have their own sub-frame vectors. They do not interpolate between frames. If the head moves in a straight line and the hair is swinging around, the hairs blur straight, following the trajectory of the head, but do not blur their own swinging motion.

Glenn

papasmurf - if Blender weren’t doing sub-frames, what would MBlur be for? You could always just use vector blur. If MBlur isn’t actually rendering sub-frames, then it really makes no sense… I’m quite sure that IPOs can give back sub-frame positions, but maybe baked hair is baked on a frame basis with linear interpolation, and then even sub-frames don’t give a curved look, if that’s what we are talking about here :) An image from grsaaynoel could really clarify what the actual problem is.
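The “baked on a frame basis, then linearly interpolated” idea is easy to sketch in plain Python (this is just a guess at the mechanism, not code from Blender):

```python
# Hypothetical sketch: a hair knot cached once per whole frame, then sampled
# at sub-frame times by linear interpolation between the two cached frames.
# Whatever curve the knot really followed between frames is lost.

def lerp(a, b, t):
    return tuple(a_i + (b_i - a_i) * t for a_i, b_i in zip(a, b))

# Cached knot positions at whole frames (made-up numbers).
cache = {
    1: (0.0, 0.0, 0.0),
    2: (1.0, 0.0, 0.5),
}

def knot_at(frame):
    f0 = int(frame)
    t = frame - f0
    return lerp(cache[f0], cache[f0 + 1], t) if t else cache[f0]

# All sub-frame samples inside the shutter interval lie on a straight segment,
# so MBlur would smear the strand along a line even if the real swing was an arc.
for sub in (1.0, 1.25, 1.5, 1.75, 2.0):
    print(sub, knot_at(sub))
```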

Yeah, I’d have to refer to someone who knows the code to give an accurate answer. I remember a debate on this very topic maybe a year ago. I just remember that motion blur took into account the past, present and future, and weighted the sampling based on Bf. Glenn, have you tried your hair with Bf: 5.0? That is maximum sampling.

But again, my hair blurs with this test case: add a sphere and make it emit hair. Add a wind force and IPO its strength value. Enable MBlur, and the hair is blurred. You can see Blender calculating the different images and then combining them. The sphere itself does not move, only the hair, and it blurs as per its motion.
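For anyone who wants to rebuild that test case from a script, here is a rough modern-Blender (2.8+ bpy) approximation; the original thread predates this API, so treat it as an illustrative sketch rather than the poster’s actual setup:

```python
import bpy

scene = bpy.context.scene

# Emitter: a sphere with a hair particle system driven by hair dynamics.
bpy.ops.mesh.primitive_uv_sphere_add()
emitter = bpy.context.active_object
emitter.modifiers.new(name="HairPS", type='PARTICLE_SYSTEM')
psys = emitter.particle_systems[0]
psys.settings.type = 'HAIR'
psys.use_hair_dynamics = True          # let the wind actually move the strands

# Wind force field whose strength is keyed, so only the hair moves.
bpy.ops.object.effector_add(type='WIND')
wind = bpy.context.active_object
wind.field.strength = 0.0
wind.keyframe_insert(data_path="field.strength", frame=1)
wind.field.strength = 50.0
wind.keyframe_insert(data_path="field.strength", frame=10)

# Enable render-time motion blur, then play through / bake the hair dynamics
# and render a frame mid-motion to check whether the strands blur along their
# own swing or only with the emitter.
scene.render.use_motion_blur = True
```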

http://www.glennmelenhorst.com//misc/blender/hairs.blend

Render this and you’ll see no motion blur as the hairs settle. I have even cranked the MBlur to maximum. :)

Glenn

Try using both motion blur and vector blur.

Just a thought.

Unfortunately, that’s not how the old motion blur works.
I don’t know Blender’s vector blur code, but any vector starts at one point, goes in one direction, and has an amplitude (strength, force, etc.).
You can define a vector for point x by locating point y, using its location for the vector’s direction and its distance for the amplitude.
Things don’t always move in a straight ray between frames, so vector blur isn’t always desirable.
It is much faster, and I’ve always assumed that’s why it’s there.
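Roughly what that per-point description amounts to is sketched below (a toy plain-Python illustration, not Blender’s actual vector blur implementation):

```python
# Toy vector blur: smear a sample along a straight velocity vector.
# Direction comes from (end - start), amplitude from its length --
# which is exactly why curved motion between frames can't be represented.

def vector_blur_positions(start, end, samples=8):
    (x0, y0), (x1, y1) = start, end
    return [
        (x0 + (x1 - x0) * i / (samples - 1),
         y0 + (y1 - y0) * i / (samples - 1))
        for i in range(samples)
    ]

# A point that really travelled along an arc still gets smeared on the chord.
print(vector_blur_positions((0, 0), (4, 2), samples=5))
```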

The old motion blur system renders several “sub frames” based on the OSA and bf settings.
It will render the entire screen 5 times if OSA is set to 5, with interpolation being controlled by the “bf” slider.
This can lengthen the render time of a single frame drastically, but it’s the only method Blender has to achieve intra frame motion.
It’s not working with the particle system right now, since editable hair was added.
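Conceptually, that accumulation approach looks something like this; a plain-Python sketch of the idea, where `render_at`, the sample count, and the shutter fraction are made-up stand-ins rather than the renderer’s internals or the exact bf mapping:

```python
# Accumulation motion blur: render several sub-frame images inside the
# shutter interval and average them. `render_at(frame)` stands in for a
# full render at a fractional frame time.

def motion_blur(frame, render_at, samples=5, shutter=0.5):
    # `shutter` plays the role of the bf slider here: how much of the
    # frame-to-frame interval the sub-frame samples spread across.
    acc = None
    for i in range(samples):
        sub_frame = frame + shutter * i / (samples - 1)
        img = render_at(sub_frame)          # one full render per sample
        acc = img if acc is None else [a + p for a, p in zip(acc, img)]
    return [p / samples for p in acc]       # average the sub-frame renders

# Example with a fake 3-pixel "image" that just encodes the sample time.
print(motion_blur(2.0, lambda f: [f, f * 2, f * 3], samples=5))
```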

I can confirm the same problem with plain particles as well.
Weird things going on there, too.
It works with a plane set up as a particle emitter, until you change things like particle life.

You can ignore the following, since it’s not what I remembered.
That plugin does NOT work that way.
I’m leaving the info up, in case someone may know of another software that does that.

VirtualDub used to have a motion blur plug-in that basically did the same thing as Blender’s. (edit… not true. Must have been something else, but what?)
You would have to remap the animation.
Let’s say the plugin was set to 4 samples, and your ORIGINAL animation was 300 frames.
You would need to remap your animation to 1200 frames.
I haven’t used Virtual dub in ages, though.
*edit… forgot link
www.virtualdub.org

There is an advantage to doing it this way, but it’s negligible.
You have complete control over intra frame motion, but it’s extremely rare to need that level of precision.
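In plain-Python terms, that over-render-then-average workflow (the same “render 1200 frames to get 300” idea as the earlier 6x complaint) looks roughly like this; just a sketch of the workflow being described, not any actual VirtualDub plug-in code:

```python
# Over-render at N x the frame rate, then average each group of N frames
# back down to one output frame.

def average_groups(frames, samples=4):
    out = []
    for i in range(0, len(frames) - samples + 1, samples):
        group = frames[i:i + samples]
        out.append(sum(group) / samples)
    return out

over_rendered = list(range(1200))        # stand-ins for 1200 rendered frames
blurred = average_groups(over_rendered)  # 300 output frames
print(len(blurred))                      # 300
```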

That’s what I thought. Thanks for the heads up, I’m not going mad after all. :)

Glenn

Hi,

I think I had the same problem with MBlur not working correctly with baked softbodies. I wonder if it’s because the results are baked on a frame-by-frame basis but, as you’ve said, MBlur needs subframe information.