Video texture slow to render

This is with Blender 2.57 on Ubuntu Linux. I've set up a scene where four planes face the camera in a quad layout, each with a video texture playing a different video. Right now this is just a test using videos I had on my computer. I tried rendering a 1000-frame animation and it took more than an hour, which is slower than I expected. To speed things up, I turned off all lights, shadows, ray tracing, and as much other processing as I could think of. It still takes about 4.5 seconds to render each frame; I'd like to get that down closer to 2 seconds per frame.

It's not using much of the CPU (only one or two cores at a time), and the load stays around 0.3. I have the thread count set to auto-detect, and I've also tried fixed values of 4 and 8 threads. The machine is an Intel quad-core at 2.33 GHz with 4 GB of RAM and SATA II hard drives. I'd expect this to render faster and use more of the CPU, but something else seems to be the bottleneck. Can anyone think of ways to speed this up or make better use of the CPU?

Attached is a frame of the render so you can see what I'm doing. Maybe there's a faster way to do this using the compositor or something?

I tried it in the compositor, and it can render 4 images in 1.12 seconds; it's probably similar for videos.
In the compositor you can hook up multiple images and overlay them using Scale, Translate, and Mix (Screen) or Alpha Over nodes.
Start by making an image that fits the size of your output (render out a blank image) and add it to the compositor; this will be your base layer.
Next, add each movie as an image input, run it through a Scale and a Translate node, then Alpha Over it onto the base and adjust. Repeat until the desired number of movies has been composited. (A scripted sketch of this setup follows below.)
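Roughly, that node tree could also be built from Python. This is a minimal sketch, not a drop-in script: the CompositorNode* type names are from the current bpy API (2.5x builds used different identifiers), and the clip paths and offsets are placeholder assumptions for a 720x480 output.

    import bpy

    # Placeholder clip paths -- substitute your own files.
    VIDEOS = ["clip1.avi", "clip2.avi", "clip3.avi", "clip4.avi"]

    scene = bpy.context.scene
    scene.use_nodes = True          # enable the compositor node tree
    tree = scene.node_tree
    tree.nodes.clear()              # start from an empty tree

    composite = tree.nodes.new(type="CompositorNodeComposite")

    # Quadrant centers for a 720x480 canvas, with clips scaled to half size
    offsets = [(-180, 120), (180, 120), (-180, -120), (180, -120)]

    base = None  # running result of the Alpha Over chain
    for path, (dx, dy) in zip(VIDEOS, offsets):
        img = tree.nodes.new(type="CompositorNodeImage")
        img.image = bpy.data.images.load(path)
        # For movie files you would also set img.image.source = 'MOVIE'
        # and give the node a frame_duration.

        scale = tree.nodes.new(type="CompositorNodeScale")
        scale.inputs["X"].default_value = 0.5
        scale.inputs["Y"].default_value = 0.5
        tree.links.new(img.outputs["Image"], scale.inputs["Image"])

        translate = tree.nodes.new(type="CompositorNodeTranslate")
        translate.inputs["X"].default_value = dx
        translate.inputs["Y"].default_value = dy
        tree.links.new(scale.outputs["Image"], translate.inputs["Image"])

        if base is None:
            # Here the first clip doubles as the base layer instead of
            # a blank image, but the chaining is otherwise the same.
            base = translate
        else:
            over = tree.nodes.new(type="CompositorNodeAlphaOver")
            tree.links.new(base.outputs["Image"], over.inputs[1])       # background
            tree.links.new(translate.outputs["Image"], over.inputs[2])  # foreground
            base = over

    tree.links.new(base.outputs["Image"], composite.inputs["Image"])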

http://speedmodeling.org/smcfiles/amdbcg_send_multilayer.blend

Did you turn off OSA?

@AMDBCG: Thanks so much for your help. That blend file helped me understand what needed to be done; the compositor is awesome for stuff like this. It made all the difference, too: I now render frames in just under 1 second, which is much more acceptable. Since this is for NTSC video, I just had to scale each clip by 0.5 (720x480 -> 360x240) and then offset each one by half the scaled resolution (360x240 -> 180x120) in each direction, and I had perfectly aligned video.
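Spelled out, assuming Blender's 720x480 NTSC frame size (the exact dimensions are an assumption) and remembering that the Translate node measures from the canvas center:

    # Quadrant placement arithmetic for a 720x480 (NTSC) canvas.
    full_w, full_h = 720, 480
    clip_w, clip_h = full_w * 0.5, full_h * 0.5   # 360 x 240 after Scale
    dx, dy = clip_w / 2, clip_h / 2               # 180, 120
    # Translate offsets that center one clip in each quadrant:
    quadrants = [(-dx, dy), (dx, dy), (-dx, -dy), (dx, -dy)]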

What I'm looking to do now is write a Python script to handle the video loading and somehow pass the video file names as arguments on the command line, although I'm not sure if it's possible to have Blender take arguments like that on the command line and pass them into a Python script that it runs.
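For what it's worth, Blender does support this: anything after a bare "--" on the command line is ignored by Blender itself and left in sys.argv for a script run with -P to parse. A minimal sketch, where the .blend, script, and clip names are placeholders:

    # Invoked as, e.g.:
    #   blender -b quad.blend -P load_videos.py -- clip1.avi clip2.avi clip3.avi clip4.avi
    import sys

    # Blender leaves everything after "--" untouched for scripts to read.
    argv = sys.argv
    videos = argv[argv.index("--") + 1:] if "--" in argv else []

    print("videos to load:", videos)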

@Modron: I did have OSA turned off when I was rendering the scene before.