Linux program to join the frames

Any Linux program to join the frames into a movie format? I’m using QuickTime Pro on the Mac, but I want to do it directly in Linux.

Thanks in advance!

You can use MEncoder, which is part of MPlayer,

or ffmpeg
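As a minimal sketch of the ffmpeg route (the frame naming pattern, frame rate, and bitrate here are placeholders — adjust them to your project, and note that option spellings have varied between ffmpeg versions):

```shell
# Join a numbered PNG sequence into an MPEG-4 AVI at 25 fps.
# frame%04d.png matches frame0001.png, frame0002.png, ...
ffmpeg -framerate 25 -i frame%04d.png -c:v mpeg4 -b:v 4M output.avi
```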

And for gif animations you can use imagemagick

convert *.jpg anim.gif

That one has, by the way, ffmpeg “inside”.

I thought Blender on Linux didn’t have ffmpeg support yet?

ffmpeg was added to the Linux builds of Blender with version 2.42.

But even before that, you could always add image sequences to the Sequence Editor. Just go to the Sequence Editor, add images, and instead of selecting one image, select all of the images in a directory by pressing Alt+A, or by using the Shift/Ctrl keys when clicking on files.

Do NOT use any built-in method of joining frames into animations. Sheesh. Render your output to lossless RGBA frames, and do it externally. I mean, it’s nice that there’s ffmpeg integration for simple stuff built in, but to rely on it is just stupid. Render to frames, make animations from frames.

That said, just install MPlayer and whatever codecs you want. Then just mencoder “mf://*.tga” -mf fps=29.97 -o yourmovie.mpg4.avi -ovc lavc -lavcopts vcodec=mpeg4:vbitrate=#### etc… (mf means multiple files, it’ll add sequentially, so wildcarding works great.)

edit: Before I get yelled at for flaming, I mean “stupid” in the “Doh! Jackass!!” meaning… say you set the compression too high, or the framerate too slow/fast, or whatever… you have to re-render your entire animation to fix it, not just re-encode the video from the frames.

Thank you for the “stupid” thing.
Elephants Dream’s “simple stuff” was made with the
built-in method, I guess.
Can we watch your complicated stuff made with your simple command line?

According to this post, Elephants Dream was rendered to PNGs and then encoded to various movie formats.

In the case of Elephants Dream there was little choice, since it was rendered on a supercomputer with 224 nodes, so rendering the movie sequentially would just have slowed things down.

But I also recommend rendering to images when rendering on one computer, because you can tweak the compression settings of the final video without having to re-render.
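A sketch of that workflow (filenames and settings are assumed placeholders): with the frames on disk, only the cheap encoding step is ever repeated, never the render.

```shell
# Encode the same rendered PNG frames twice at different bitrates;
# changing your mind about compression costs minutes, not a re-render.
ffmpeg -framerate 25 -i frame%04d.png -c:v mpeg4 -b:v 2M preview.avi
ffmpeg -framerate 25 -i frame%04d.png -c:v mpeg4 -b:v 8M final.avi
```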

Well guys, thanks! This will solve my problem.

Of course you can use the built-in method.
It’s always good to render out into single frames, but you can then take those frames into Blender’s sequencer, edit the movie there, and then render it out straight into a multiplexed movie file…

I don’t see how it’s stupid to use ffmpeg from Blender instead of from the command line?

Using mencoder from the command line is nice and all, but it’s a lot of switches to remember (which I always forget), and it can sometimes be bitchy to work with. I used it a lot a couple of years ago, but lately the versions have been failing me, dropping frames and producing out-of-sync movies.


He already said “thanks!” so I would have considered the thread closed, but since you can’t seem to comprehend such a basic comment, and quote me directly, I shall kick the idea until it sticks in your thick head. :smiley:

If you render your animation into a lossy compressed movie format from inside Blender, you are locked into that output as your only copy of the animation. If you need to convert it, it will get even uglier. And editing becomes a chore when you have to break it apart into frames to rotoscope something in, just to re-re (re?) compress it. No thanks. Render to lossless single images.

Granted, it depends on what you do. If your idea of animation is a little 900 frame thing at 640x480 that only takes 45 minutes to render, hey, it’s only a couple hours lost having to render it fully, twice. But if you’re doing animation… complex scenes, long render times, large images… and you render to a lossy compression movie format directly…

Then yes, I stand by my statement. It’s stupid.

If you cannot remember command line options, no big deal, no one does, there’s no need to unless you’re working with it daily… just do mencoder --help or man mencoder like the rest of us. :smiley:

edit: obviously someone thought it would be worth the manpower to bother linking in libavcodec… but I don’t see the need. Guess that’s what OSS is about. See a need, fill a need, Rodney Copperbottom.