Fairly sure there’s no way to do this, because I can’t even begin to imagine how various ‘clumps’ of voxel data could be communicated as ‘hey, we’re a bit of fire and we’ve just moved…’ but I thought I’d ask anyway, to see if there’s maybe some way of doing it…
Although not in Blender, I imagine you could use an optical flow algorithm to determine motion vectors for a vector blur.
Yeah, I just tried to slow it down using optical flow in Premiere, but got a strange result, I think because of the speed of the flames and it being at 25 fps.
Oddly, rendering out now at 120 frames per second, my resulting video looks a lot more like real-world flames…
Actually, forget that: ‘Play Rendered Animation’ plays back at the rendered framerate.
Could you maybe create a mask for the flames, and use that to generate a motion blur pass that you can then apply to the flames in a comp?
Not sure how I’d go about that, but I got good results using the Pixel Motion Blur effect in After Effects.
The smoke simulator stores an additional voxel grid called “velocity” that records which way the cloud was moving through each voxel. But Cycles doesn’t do 3D motion blur based on this as far as I can tell. The grid itself can be read from a Cycles shader via an Attribute node, so maybe you could do an extra render layer with some material like this, then use it with vector blur? (Custom AOVs would be really nice for stuff like this…)
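Not actual Blender node code, but here’s a minimal sketch of what that extra render layer would be doing: remapping each voxel’s velocity vector into the 0–1 RGB range so a vector blur step can recover direction and speed. The `max_speed` scale and the “zero motion = mid-grey” convention are my own assumptions for illustration, not what Cycles does internally; in Blender you’d build the equivalent with an Attribute node feeding an Emission shader.

```python
# Sketch: encode a per-voxel velocity vector as an RGB colour,
# the way a velocity pass for vector blur might.
# max_speed is an assumed normalisation constant, not a Blender value.

def velocity_to_rgb(v, max_speed=10.0):
    """Map a velocity vector (units/frame) into 0..1 RGB.

    Each component is clamped to [-max_speed, max_speed] and shifted
    so zero motion lands at mid-grey (0.5, 0.5, 0.5), which preserves
    direction information (values below 0.5 mean negative motion).
    """
    def remap(c):
        c = max(-max_speed, min(max_speed, c))  # clamp
        return 0.5 + 0.5 * (c / max_speed)      # [-max, max] -> [0, 1]
    return tuple(remap(c) for c in v)

# A stationary voxel encodes as mid-grey:
print(velocity_to_rgb((0.0, 0.0, 0.0)))   # (0.5, 0.5, 0.5)
# Full speed along +X pushes the red channel to 1.0:
print(velocity_to_rgb((10.0, 0.0, 0.0)))  # (1.0, 0.5, 0.5)
```

A compositor’s vector blur node would then invert this mapping per pixel to get a 2D motion vector and smear the flame pass along it.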
Great, another undiscoverable “Attribute”; that makes Blender really approachable. Otherwise a very cool idea!