Frame-rate to Motion-Blur from videos?

I have experience recording actions from Steam games with the Steam recorder and tools like that. I understand the footage can be slowed down to a fraction of the playback speed, say by a factor of 16.

Question - is there a way to take a non-Blender video, bring it into Blender, speed the video up, and use the extra frames as motion blur? It would be like rendering a scene several times per frame for full-sample motion blur, but with an external video.

If there is no plausible way to do this in Blender, what other programs could I get a similar effect from?

You need optical flow technology. Try After Effects.

No, I don't think you fully understand sampled motion blur.

What happens is it renders several sub-frames (a specified number of samples) and combines them into one image. That creates blur that looks really nice but is in fact not blurred at all; it's just a mashup of slightly different frames. What you mean is vector blur, and that is not really what I want.
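To put numbers on that combining step, here is a minimal sketch, assuming each sub-frame is just a NumPy array of pixel values (toy data, not Blender's actual renderer):

```python
import numpy as np

def sampled_motion_blur(subframes):
    """Average N sub-frame renders into one blurred output frame."""
    stack = np.stack([f.astype(np.float64) for f in subframes])
    return stack.mean(axis=0)

# Two toy 1x2 grayscale "sub-frames": a bright pixel moving one step right.
a = np.array([[255.0, 0.0]])
b = np.array([[0.0, 255.0]])
blurred = sampled_motion_blur([a, b])
print(blurred)  # [[127.5 127.5]] -- the pixel smears across both positions
```

With only two samples the "blur" is just two ghost positions; with many samples the stack approaches a smooth streak.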

Is there any way to achieve this in Blender or some other free program, with only a video file?

That is an echo effect in After Effects. Remember that you only have the recorded frames to sample, so you don't really get better movement, as you cannot interpolate between frames. If you slow it down by a factor of 16 that is seriously slow; you will still have a strobey sort of look. You could stack the same video on top of itself and mix between the iterations. There is also a speed effect in the VSE to slow clips down.

OK, I think the miscommunication is continuing, so I'll explain the entire thing.

In the programs I use, like the demo recorder in some games, motion keyframes are recorded 24 times a second. I can then review the footage in the game engine, since it does not actually record video. I can also slow it down (/16) so it interpolates between the keyframes and has motion in between them.

After I record the result, I will have a 24-60 fps file that is 16 times slower than the intended actions in the game, but still has smooth motion. Adding motion blur to that, at 16 samples, will bring it back to regular speed but give it really nice motion blur.
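The frame arithmetic works out like this (a toy sketch using the numbers above; the variable names are just for illustration):

```python
# Frame arithmetic for the slow-recording trick described above.
capture_fps = 24   # game records motion keyframes 24x per second
slowdown = 16      # played back at 1/16 speed inside the engine
samples = 16       # motion-blur samples used when speeding back up

# One second of original action becomes this many recorded frames:
recorded_frames = capture_fps * slowdown
print(recorded_frames)  # 384

# Combining every `samples` consecutive frames into one output frame
# restores the original speed, with 16-sample blur per frame:
output_frames = recorded_frames // samples
print(output_frames)    # 24 -- back to real time at the original fps
```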

I'm not looking for tricks like echo effects or ghosting. I'm asking whether you can use something like Blender's sampled motion blur, but instead of dividing the scene into sub-frames and combining them, it uses the existing frames of a video to achieve motion blur that is actually real and not based on tricks.

And to finish this explanation: I do not want to use something like After Effects, but that stacking of videos on top of each other might work, if I can condense the frame rate and still include all of the frames.

OK, I figured out an effective way to do it. Thanks 3pointEdit for your video-layering idea; I built upon it and got some nice results.

Any chance you could post a few seconds to demonstrate the outcome?

Yes, I'll post it on Vimeo within a day and put a link up here. While we are talking about this, you might as well check out the stuff I have already made on YouTube -

Kerog6 is talking about frame interpolation.
You have frame A and frame B at 30 fps.
You let the software calculate a frame between each A and B, and you end up with 60 frames, which play back as two seconds at 30 fps :wink:
It plays as slow motion, depending on the interpolated frames, and you'll have a blur effect because frame interpolation can't produce sharp images.
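A naive version of that interpolation, assuming frames are NumPy arrays and simply averaging each neighbouring pair (real tools use optical flow instead, which is why a plain blend like this looks soft):

```python
import numpy as np

def interpolate_midpoints(frames):
    """Insert the average of each neighbouring pair of frames.
    A crude stand-in for real frame interpolation."""
    out = [frames[0]]
    for prev, cur in zip(frames, frames[1:]):
        out.append((prev.astype(np.float64) + cur) / 2.0)  # blended tween
        out.append(cur)
    return out

a = np.array([[0.0, 0.0]])
b = np.array([[255.0, 255.0]])
doubled = interpolate_midpoints([a, b])
print(len(doubled))  # 3 frames: A, the A/B blend, B
```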

EDIT: Hmm, I've re-read it three times now, and each time I got less sure what he means :smiley:

arexma, you could not be more wrong. But fortunately, I have already found a solution.

(I had written an incredibly detailed response to this, effectively explaining the whole process, but I screwed up and pressed the 'reply to thread' button and lost it all. Will I try to repeat it? Maybe.)

So I start out with, say, 120 frames, as in my actual test. It just so happens to be a fifth of the speed I want it to be, so I must make it faster. I have a choice: an incredibly simple click to create a 'speed control' strip, or an incredibly complicated setup that adds some realistic motion blur as a side benefit. Which one will I choose?

You know which one.

I start by making five layers of the same video in the video editor, each one at 20% opacity (explained later). I then offset them by one frame for layer 2, two frames for layer 3, three for layer 4, and so on until I reach the factor I want to speed the footage up by (5).

Now, if I go to frame 5, I should see a combination of frames 1, 2, 3, 4, and 5. This is because five 20%-opacity layers combined make a 100%-opacity composite. I could then go to frame 10 and see a combination of frames 6, 7, 8, 9, and 10. You should, by now, get the idea.

Setting the step (skip frames) value to five (or probably 4), setting the end frame to 120, and rendering at 24 fps gives me a one-second video, sped up 5 times from the source, that also has 5 samples of motion blur.
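The whole layering-and-step setup boils down to averaging each run of 5 consecutive frames into one output frame. Here is a minimal sketch of that idea, using single-pixel NumPy "frames" in place of real video (the equal-weight average is an approximation of what the stacked 20%-opacity strips produce):

```python
import numpy as np

def speed_up_with_blur(frames, factor):
    """Average each run of `factor` consecutive frames into one output
    frame -- the numeric equivalent of stacking `factor` offset strips
    at 1/factor opacity and rendering with step = factor."""
    out = []
    for i in range(0, len(frames) - factor + 1, factor):
        group = np.stack([f.astype(np.float64) for f in frames[i:i + factor]])
        out.append(group.mean(axis=0))
    return out

# 120 dummy single-pixel "frames" whose brightness ramps 0..119.
frames = [np.array([[float(v)]]) for v in range(120)]
result = speed_up_with_blur(frames, 5)
print(len(result))  # 24 frames -- one second at 24 fps, 5x faster
print(result[0])    # [[2.]] -- the mean of frames 0..4
```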

So, in short, it's not frame interpolation. I don't need to interpolate, because I have an ACTUAL slow-motion video. That was intentional from the start.

It is actually the opposite: frame COMBINation. I think that sums it up well, don't you think?

Also, in a one-two punch to finalize my epic solution, I have the video set up on Vimeo. It's right here - - go crazy.

Remember - watch the video, subscribe to my YouTube channel, use Blender, ???, profit!

I don’t get it. Did you just change the frame rate? The clip still takes the same time to play, doesn’t it?

No, check out the Vimeo link. It plays faster but still has all the frames, at the original frame rate.

I think you are splitting hairs here.

Let’s take a look at frame #311 from your work.

To me it does not look like motion blur at all, just a bunch of stacked images. And some of the images precede the original, which is not how motion blur should work. As David Byrne said, “Just look where my hand was”.

When I want motion blur I want it to be smooth. I have to agree with 3pointEdit: After Effects would certainly do the trick and probably look better.

But as long as you’re happy :rolleyes:

I don’t understand your dislike of the result. The entire thing is an original image; it just happens to have exposure. That’s how motion blur works in real life.

The ‘original image’ you are referring to is probably that piece of gray in the middle of the cube, but that’s just where all 5 layers have the same color. Motion blur preceding the image is only a bad thing if it does not match up with the actual motion, which is far from what I’m doing here.

Once again, I feel I have explained it perfectly, but I don’t think you are thinking about the same effect as I am.

And to clear it up once again, for what, the seventh time: motion blur is just a bunch of stacked images. The only reason it looks bad in the example is that 1. I probably used the wrong step value when rendering it, and 2. I only have 5 samples, whereas professional studios and videos either use adaptive algorithms or hundreds of samples.

Atom, the shutter on a camera opens as well as closes, so you get blur preceding and following…
Kerog, why not just use Blender’s “sampled motion blur”?

For doing it in post, 3pointEdit is right that optical flow is a great solution for tweening… you don’t need masses of frames (the equivalent of shooting on high-speed film) because the generated motion vectors can be sampled as many times as you need, on a per-pixel basis, to get a super smooth result.

You’re talking about frame blending.
The “speed” control in Blender’s Video Sequence Editor offers frame blending as an option… great for speeding up slow footage, as long as you speed it up a lot… terrible for slowing stuff down, because you get that strobe effect where there simply aren’t enough (or any) tweens…

“And to clear it up once again, for what, the seventh time: motion blur is just a bunch of stacked images.”

No, that’s just a fake… real motion blur is a continuous artefact, as you capture light on the surface of film over time.

Hmm, a big chip on your shoulder… you shouldn’t blame people for trying to help you, especially when they are right!

If it plays faster, then the frame rate is not the same. So you are blending the original number of frames together at a new, faster frame rate.

Mike, I need to borrow some of your clarity and foresight.

To hopefully make a final comment: I needed this motion blur for a game recording made in Fraps, for a video I have since revised to be entirely in Blender. The Blender-made video was an unrelated example.

And while we are on the topic, thanks for the speed controller tip. You have now made the entire thread before your comment useless, but at least I now have a better understanding of post-processing.

As for 3pointEdit, I was probably thinking that he was trying to knock down my method for the frame-blending process, but I now see he was trying to correct my errors in the theory of the whole thing.

Just to be a troll, however, I have two things to add. 1. I think saying blur is stacked images is a more practical blur theory, but I do understand that the exposure artifact description is more accurate.

Oh, and 2. 'Artifact.