Okay, going batty here. I have a virtual television studio set with a big-screen TV in it. I'm trying to play the show's animated logo on the screen during a zoom-in, with Blender producing the animation as the camera moves around the set. I've only done video texturing once before.
I was using Blender 2.59, with a WMV video file (the show's animated logo) textured onto the television. It showed up and mapped correctly onto the "TV screen" (which is a cube), and it would render the first frame of the video. The texture's frames were set with "Match Movie Length," which Blender correctly recognized as 599 frames (the Blender animation itself was only going to be 550 frames).
When rendering the animation in Blender, it would not play the video; it would only paint the first frame of the video onto the TV screen.
I then rendered the logo in AVI format and tried again. Same thing.
I then went back to my original logo animation (produced years ago by a completely different program, but I still had all the still images) and imported them as an image sequence. Same thing: it would only display the very first image, none of the others.
I then downloaded Blender 2.61 and tried again. Same thing.
What on earth am I missing? The frame settings are correct in each case, and the texture shows up whether or not I load the image or video into the UV/Image editor.
Oh, and as I move the time slider around in the Dope Sheet, the texture image does change, showing that Blender does recognize which image or video frame needs to go where; it's just not outputting it at render time!
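For reference, here is how I understand the frame mapping is *supposed* to work, as a minimal Python sketch (not actual bpy code). The field names `frame_start`/`frame_offset` mirror the texture's Image panel settings, and the clamp/cyclic behavior is my assumption of what Blender does outside the video's range:

```python
def texture_frame(scene_frame, frame_start=1, frame_offset=0,
                  frame_duration=599, cyclic=False):
    """Return the 1-based video frame that should show at scene_frame.

    frame_duration=599 matches what "Match Movie Length" detected;
    the clamping/cyclic rules below are my assumption, not verified bpy behavior.
    """
    f = scene_frame - frame_start + 1 + frame_offset
    if cyclic:
        # "Cyclic" should wrap the video around instead of holding a frame.
        f = (f - 1) % frame_duration + 1
    # Outside the range, I'd expect Blender to hold the first/last frame.
    return max(1, min(frame_duration, f))

# My setup: 599-frame video on a 550-frame scene animation.
print(texture_frame(1))    # expect video frame 1 at scene frame 1
print(texture_frame(550))  # expect video frame 550 at scene frame 550
```

That mapping is exactly what the Dope Sheet preview appears to follow; the final render is what ignores it.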