Animating Textures

I’ve tried asking this in the textures forum, but am not having much luck. Maybe wrong forum, maybe poor question wording. Let me try again.

What I want to do is have an object with a texture image that I can change as an animation unfolds. I’m not really looking for a movie, more the ability to hold one image for an arbitrary number of frames, then use a different image, etc. I will likely go back and forth between images.

I have managed to get materials assigned to objects and I added an Image texture to the material, and am using an Image sequence for my source.

In the Image sequence buttons there is a parameter called “Offs” which controls which image in the sequence is used. I think I just need to create an IPO for this parameter, but I can’t see any way to do it. Pressing ‘I’ in the texture or material button windows gives a list of keys, but “Offs” is not among them. I don’t even see “Offs” in the list of values when I look at a Texture IPO block.

I have seen references to keying “Offs”, so I’m pretty sure I am on the right track.

Any help would be greatly appreciated.

I have seen references to keying “Offs”, so I’m pretty sure I am on the right track.

Or not.

Am I completely barking up the wrong tree here? The wiki seems to indicate that you need an image in your sequence for every frame; their stoplight example supposedly referenced over 100 images in order to place three distinct textures. That’s insane.

Is there a workaround or a scripting solution to this?
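One scripting idea I’ve been toying with is a FrameChanged script link that swaps the texture’s image each frame. This is only a sketch against the 2.4x Python API as I understand it; the texture and image names (‘AnimTex’, the .png names) and the switch frames are placeholders, and I haven’t verified it inside Blender:

```python
# Sketch of a FrameChanged script link for swapping a texture image.
# All names and frame numbers below are made up for illustration.

def pick_image(frame):
    # Hold image A until frame 50, image B until frame 100, then back to A.
    if frame < 50 or frame >= 100:
        return 'frame_a.png'
    return 'frame_b.png'

try:
    import Blender  # only available when running inside Blender 2.4x
    frame = Blender.Get('curframe')
    tex = Blender.Texture.Get('AnimTex')          # placeholder texture name
    tex.image = Blender.Image.Get(pick_image(frame))  # images loaded beforehand
except ImportError:
    pass  # running outside Blender; the pick_image logic still works standalone
```

The script link would be attached to the scene with the FrameChanged event, so it fires every time the current frame changes.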

One solution that I have dug up is to load my different images into texture channels, and then key the Col and Var so that the channel for the image I want has those at 1.0, and all other channels are at 0.0. I could probably cook up an IPO Driver to select between the images as well.

Tedious doesn’t even begin to describe that one.

I suppose that an alternative would be to attach the textures to empties and then just snap in the empty that I want at a particular time (and hide the others).

This seems slightly less tedious, but still an awful lot of work for what is really a very simple operation.

Is there a better way to approach this problem?
Am I explaining what I am trying to accomplish clearly?

Well Andy, so you aren’t having this whole conversation with yourself, I will pop in with what little I could find. It seems the wiki, and any other references I have seen, would require your animated texture’s movie file or image sequence to have the same number of frames as your overall animation. So if the texture needs to stay the same for the first 50 frames and then change to a new texture at frame 51, your movie file would need the same image on the first 50 frames and then the new image starting at frame 51. You should be able to set this up quickly using the VSE to repeat the images over however many frames you need.
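To put numbers on it, the repeated strips just expand your schedule into one image per frame. A plain-Python sketch of the bookkeeping (made-up image names and frame counts, not Blender code):

```python
# Expand (image, hold_count) pairs into one image name per frame,
# which is effectively what the repeated VSE strips give you.
def expand(holds):
    frames = []
    for image, count in holds:
        frames.extend([image] * count)
    return frames

# Made-up example: the same image for 50 frames, then a new one for 10.
frames = expand([("old.png", 50), ("new.png", 10)])
print(len(frames))   # 60
print(frames[0])     # old.png
print(frames[50])    # new.png
```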

Thank you, I was starting to feel a little Sixth Sense.

By VSE, you mean the sequence editor, correct?

Can the output of the VSE feed a texture, or do I need to blow the VSE out to a set of images, then use those in my main animation?

Thanks for suggesting the VSE, I probably wouldn’t have thought of it for that.

Yep, I meant the Video Sequence Editor in Blender. You can easily load a sequence of images and then stretch them across however many frames you need. I don’t see a way to load a strip directly from the VSE into a texture channel. My guess is that you would need to render out the sequence and then load the rendered video as your texture.


That is odd, I simply assumed that the VSE could feed back into the texture area. Seems like texture needs another selection in the dropdown like “VSE Strip” or something…?

The VSE works on the whole image render, and does not have any knowledge of an individual object’s texture.

There are a few ways I can think of, with advantages/disadvantages:

  1. Multiple texture channels, and Ipo the Col to make each one appear.
    1a. After making the channels and Ipo, add a driver to that Ipo to drive the textures.
  2. Make a movie, use it as the Image texture, and build the movie so it shows the images you want at the right times.
  3. Use the Object and UV-Project node in the Compositor, with Image nodes feeding Mix nodes to swap in images when you want them.
  4. Duplicate the object to multiple layers, texture each one differently, and then make them jump from hidden layers to visible when you want them to swap out.
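To make method 1 concrete: the Col Ipo curves are just complementary step functions, so exactly one channel has influence at any frame. A plain-Python sketch of that logic (the frame numbers are made up, and this is not Blender API code):

```python
# Method 1 in miniature: each texture channel's Col influence is a
# step function of the frame; exactly one channel is at 1.0 at a time.
def col_curves(frame, switch_frames):
    """switch_frames[i] is the frame at which channel i takes over.
    Returns the Col value for every channel at the given frame."""
    active = 0
    for i, start in enumerate(switch_frames):
        if frame >= start:
            active = i
    return [1.0 if i == active else 0.0
            for i in range(len(switch_frames))]

# Three channels switching at frames 1, 50, and 100 (illustrative only).
print(col_curves(10, [1, 50, 100]))   # [1.0, 0.0, 0.0]
print(col_curves(75, [1, 50, 100]))   # [0.0, 1.0, 0.0]
```

Keying the Var the same way keeps the inactive channels from contributing anything at all.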

As far as your tedium goes, Blender cannot read your mind. The Texture channel type in the Ipo window is the easiest way to say when you want a texture to exert influence.

Thanks for the replies. I think I’ll probably work with door #2. Although at some point I should try and wrap my head around what’s behind #3.

I don’t expect Blender to read my mind, but I was surprised to find that it has no direct support for an IPO that selects the particular image in a sequence that you want to use for a texture. The tedium refers to the workarounds required to fake the feature. It’s obviously not a mainstream requirement; if it were, it would be there with knobs and levers I had no idea I even needed.

One further question. Given that I’m going to render my texture to a movie, I poked around for a lossless CODEC, and found the Lagarith and CorePNG CODECs.

With each CODEC I am able to create a movie with the sequence editor that can be loaded into Vegas. However, I cannot seem to get Blender to recognize the movie. The Lagarith reports:

[avi @ 6A5B1640]Could not find codec parameters (Video: LAGS / 0x5347414C, 640x480)
not an anim; f:\Happy Father’s Day\Textures\Andy-Laga0001_0128.avi
[avi @ 6A5B1640]Could not find codec parameters (Video: LAGS / 0x5347414C, 640x480)
not an anim; f:\Happy Father’s Day\Textures\Andy-Laga0001_0128.avi

The CorePNG just reports:
not an anim; f:\Happy Father’s Day\Textures\Andy-CorePNG-0001_0119.avi

Anyone run across these before? I thought I might be getting Vista’d, but as I said, the files are valid, and other applications on the system are capable of opening them. It is an RGBA animation, if that matters.

I think the issue is that the Lagarith CODEC is not recognized by the FFMPEG library, which is what I believe Blender uses to import a movie.

At least I think that Lagarith is not supported in FFMPEG, there’s a fair amount of contradictory information to be found.

I have seen others mention using Lagarith with Blender. Do you have the actual Lagarith Codec installed on your Computer? If so, maybe you need to turn on the “Enable all codecs” option in the Blender User preferences under System and OpenGL.

Yes, it is installed. Blender will use it to export a texture movie from the VSE. It’s fast as blazes, and it has specific support for recognizing that one frame is identical to the previous frame and putting a null frame in there. 360 frames of 1K images produce a <4MB avi file (because I’m just jumping back and forth between images).

However, the VSE will not load the movie it just made and gives an FFMPEG related error message. And, of course, you can’t use it as a texture.

I don’t know if there’s a way to get Blender to use a VFW CODEC to load a texture movie. I do have the Enable all codecs option turned on, it hasn’t helped.

Here’s a final note that describes the conclusions I have come to; perhaps it will be useful to some future searcher:

  • Blender has no direct support for controlling the displayed frame of a texture animation.

  • I think that the least tedious way to control your texture changes is to create a movie in the sequence editor that has the texture changes that you want. If you use a separate scene for the texture sequence, you can keep your rendering options all set up.

  • I chose to export a sequence of PNG images for my texture due to a few things that conspired to cause me grief:

  1. It appears (I’m 99% certain of this) that importing texture movies is done with the FFMPEG library, so if you use a video CODEC, it has to be recognized by that library.
  2. I got fair compression with the recognized FFV1 CODEC, but I could not get it to encode my Alpha channel data. I am on Windows, and I used the ffdshow CODEC package.
  3. The best performing CODEC (by far) was Lagarith (mainly because in this application, most of the images end up being the same). Unfortunately, Lagarith is not recognized by FFMPEG.
  4. PNG correctly encodes the Alpha channel, and Blender handles an image sequence for a texture very well.

In the administrivia department, I put full-size textures into the sequence editor, and used the sequence editor to generate smaller textures that were appropriate to what I was working on.

You know, we need someone to compile all these little hard-learned gems into a single annotated, searchable list.