Image sequence playback

I render my animations to an image sequence and then bring the sequences back into Blender to render out the final video. But I've never been impressed with the smoothness of the videos coming out of Blender, and I haven't been much more impressed with other image-sequencing software either. Basically, shots with a smooth camera track still don't look as smooth as I think they could. I used a small amount of motion blur in the original render. Anyone have any ideas on how to get the smoothest video from an image sequence?

I don’t see how the program stitching an image sequence into a video has any effect on the smoothness of the output video.

Try exporting a viewport render as a video, and see how smooth that looks.
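Something along these lines (a rough sketch using Blender's Python API, with FFmpeg/H.264 output chosen just as an example and a hypothetical output path) will do a viewport render of the whole frame range straight to a video file:

```python
import bpy

scene = bpy.context.scene

# Write a video file instead of an image sequence (container/codec are just an example)
scene.render.image_settings.file_format = 'FFMPEG'
scene.render.ffmpeg.format = 'MPEG4'
scene.render.ffmpeg.codec = 'H264'
scene.render.fps = 24                       # keep identical to the original animation's frame rate
scene.render.filepath = '//viewport_test_'  # hypothetical output path

# OpenGL (viewport) render of the whole frame range
bpy.ops.render.opengl(animation=True)
```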

Thanks, I'll give that a try. I have noticed a difference, though, between exports from Blender, QuickTime, DaVinci, and the open-source djv. Exporting from QuickTime appears to give the smoothest result, followed by Blender, DaVinci, and then djv. Of course it all depends on your export settings as well.

I don’t know what you mean by smoothness. I doubt you’ve overlooked this, but could it just be a difference in framerate from the original footage to the render output?

Thanks zanzio,

Frame rates were consistent. What I'm referring to as smoothness is the translation of a tracking camera in an animated scene. It looks relatively smooth, but for some reason, when the image sequence is combined into a final MOV or MP4 or whatever container you'd use for a digital movie, the camera move looks ever so slightly strobe-like as it tracks a still object. And I do mean very, very slight. I have seen camera tracks that are perfectly smooth. Some software renders the final version very smoothly, while other software doesn't do as good a job with the same parameters.

More motion blur. Render a frame against the tracked background and try to match the amount of blur that you expect at the same distance from the camera.
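If you do re-render, this is roughly the idea in Blender's Python API (a sketch; property names assume a reasonably recent Blender with Cycles):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Render-time motion blur; a shutter of 0.5 frames is roughly a 180-degree shutter
scene.render.use_motion_blur = True
scene.render.motion_blur_shutter = 0.5  # raise this if the strobing is still visible
```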

3pointEdit,

Are you referring to re-rendering the shot or using the compositor to add more motion blur to the render layer?

Unless you have the motion vectors as a pass, you would have to re-render with motion blur. It looks better than the compositor's motion blur anyway. But strobing is often the result of objects appearing to have discontinuous movement between frames. A video camera can produce the same problem if the shutter angle is too narrow, i.e. there's less motion blur.
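For reference, enabling the vector pass for a future render looks roughly like this (a sketch assuming Blender 2.8+ naming, where passes live on the view layer):

```python
import bpy

# Passes are set per view layer in 2.8+ (per render layer in 2.7x)
view_layer = bpy.context.view_layer
view_layer.use_pass_vector = True  # motion vectors ("Speed") for the Vector Blur node
view_layer.use_pass_z = True       # depth, also used by Vector Blur
```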

I did have motion blur as a pass, but I don't think it was quite enough. I didn't know if there was an alternative after the fact instead of re-rendering.

Sorry, nope. Unless there are vectors to drive a Vector Blur in the compositor, there is no way in Blender to determine how much blur any pixel should get. An optical-flow system might be able to extrapolate this data from your render, but Blender doesn't have that technology.
You can buy a plugin for the Natron compositor: https://revisionfx.com/products/rsmb/natron/
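Inside Blender itself, the Vector Blur route is roughly this node setup (a sketch, assuming the render carries Vector and Depth passes; socket names can differ slightly between versions):

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

rl = tree.nodes.new('CompositorNodeRLayers')     # needs Vector and Z/Depth passes enabled
vblur = tree.nodes.new('CompositorNodeVecBlur')
comp = tree.nodes.new('CompositorNodeComposite')

tree.links.new(rl.outputs['Image'], vblur.inputs['Image'])
tree.links.new(rl.outputs['Depth'], vblur.inputs['Z'])      # socket is labelled 'Z' in older versions
tree.links.new(rl.outputs['Vector'], vblur.inputs['Speed'])
tree.links.new(vblur.outputs['Image'], comp.inputs['Image'])

vblur.samples = 32
vblur.factor = 1.0   # overall blur strength; tweak to taste
```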

You could set the Cycles samples to 1; it will still render perfectly good Vector and Depth passes, but the images themselves will look terrible. Fortunately, you already have an image sequence for that.
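Roughly, that trick looks like this (a sketch, assuming Cycles and 2.8+ naming; note that Cycles generally won't write a Vector pass when render-time motion blur is on):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.samples = 1              # the image will be noisy garbage, but the passes are fine
scene.render.use_motion_blur = False  # the Vector pass needs render-time motion blur off

view_layer = bpy.context.view_layer
view_layer.use_pass_vector = True
view_layer.use_pass_z = True

# Re-render the sequence just to harvest the Vector and Depth passes
bpy.ops.render.render(animation=True)
```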


Yeah, looks like I'll have to re-render the shot. Oh well, lessons learned.