How to export low res video without blurring

Hi all,

I’m trying to export a video from Blender in low resolution (480x360) with anti-aliasing turned off for that nice low-res, jagged-edge look — like something from an old video game console. When rendering still frames it looks fine (I can zoom in and see every pixel). The problem is that when I export it as a video, the end result is blurry and compressed-looking, and I can no longer see clean lines and individual pixels.

Here are a couple of images to illustrate what I’m talking about.

^ The image on the left is from a still render of a single frame. My goal is for every frame of the video to look like this. Instead, each frame of the video looks more like the one on the right (this is a screenshot of the .avi).

What is going on here? How do I achieve the look I’m going for?

Thanks,
Bill

It could be video compression.

Try setting the output File Format to AVI Raw.

It could also be the video player upscaling. What video player are you using, and are you playing fullscreen or upscaling it somehow?

Thanks for the reply, Boder.

I’m getting the same result with whatever video format I try, including AVI Raw.

I’m using VLC to play back the video. Not sure about your upscaling question, but I’m getting the problem whether the window is fullscreen or only part of the screen. I do have to stretch the window for it to be big enough to see well.

I’ve also tried importing the still frames into Final Cut Pro, but it’s blurring them as well – in both the preview window and final output.

Most video players scale up the video with bicubic filtering. You either need to turn off that filtering in your player, or, if you want to put it online and keep that pixel look, upscale the video yourself via nearest neighbor.
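
A rough sketch of what that upscaling step could look like, using Python and Pillow (just one option, any tool with a nearest-neighbor resample should work equally well). The folder names and the 3x factor are placeholders, not anything Blender-specific:

```python
# Sketch: integer nearest-neighbor upscale of rendered frames using Pillow.
# Paths and scale factor are placeholders; adjust to your own setup.
from pathlib import Path
from PIL import Image

SRC = Path("frames")        # 480x360 frames rendered from Blender
DST = Path("frames_big")    # upscaled output
SCALE = 3                   # 480x360 -> 1440x1080

DST.mkdir(exist_ok=True)
for src in sorted(SRC.glob("*.png")):
    img = Image.open(src)
    big = img.resize((img.width * SCALE, img.height * SCALE), Image.NEAREST)
    big.save(DST / src.name)
```

Sticking to an integer scale factor keeps every source pixel the same size on screen, so the edges stay perfectly crisp.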

It sounds like upscaling via nearest neighbor is the way to go.

Does anyone know how to accomplish this on a Mac? I don’t have access to After Effects, and it doesn’t look like Final Cut can do it.

If anyone’s interested, I was able to get this working. My solution is to render the frames as images from Blender, and then use Apple’s Compressor to convert that image sequence to video (with “resize filter” set to “nearest pixel”). Thanks for the help.
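
For anyone without Compressor, a possible alternative (assuming you have ffmpeg installed) would be to do the nearest-neighbor upscale and the encode in one step. The frame rate, scale, and file names below are just examples:

```python
# Sketch: encode an image sequence with ffmpeg, upscaling via nearest neighbor
# so the pixels stay sharp. Assumes ffmpeg is installed and the frames are
# named frame0001.png, frame0002.png, ... (placeholder names).
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "24",                       # match the Blender frame rate
    "-i", "frames/frame%04d.png",             # rendered 480x360 frames
    "-vf", "scale=1440:1080:flags=neighbor",  # 3x upscale, nearest neighbor
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",
    "-crf", "18",                             # lower CRF = higher quality, bigger file
    "output.mp4",
], check=True)
```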