I’m not sure I’m in the right forum, but my problem seems to fit here.
My problem is that I am doing some rendering with big QuickTime 2D animated textures. Everything works well (Windows, Linux, and macOS) as long as the texture resolution stays under 2000x2000 pixels. For some unknown reason, somewhere between 2000x2000 and 2100x2100 pixels, the texture simply disappears under Linux.
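For what it is worth, I notice the failure window (somewhere between 2000 and 2100 pixels) brackets 2048, which is a common GPU maximum texture size on older hardware and drivers. I have not confirmed this is the cause; the snippet below just illustrates the arithmetic of my guess, with the 2048 limit assumed rather than queried from the hardware:

```python
# Hypothetical check: does a texture fit under an assumed driver cap?
# 2048 is a common GL_MAX_TEXTURE_SIZE value; this value is an assumption,
# not something queried from my actual Linux machines.
MAX_TEXTURE_SIZE = 2048

for side in (2000, 2100, 2700):
    fits = side <= MAX_TEXTURE_SIZE
    print(f"{side}x{side}: {'OK' if fits else 'exceeds limit'}")
```

If the renderer uses OpenGL under the hood, something like `glxinfo -l | grep -i GL_MAX_TEXTURE_SIZE` on the Linux boxes should report the real cap, assuming `glxinfo` (from mesa-utils) is installed.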
The trouble is that I absolutely need a 2700x2700-pixel texture. That is the smallest size that avoids visible pixelization during some close-up shots.
I work on a Mac, and on that machine I have no problem with the texture. But because this is a really big project, I set up a small 64-bit Linux render farm (using Loki) to render my frames. On Linux, even without Loki, the texture disappears as I described. I just tested on a 64-bit Windows 7 computer and it works fine there.
One last note that might be important: to render my other shots with smaller textures under Linux, I had to split the texture into two QuickTime files, one with the RGB and a second with the alpha. On my Mac (and on Windows) I can use a single file with the video and alpha encoded together. I don't know whether that is a useful hint.
I hope someone has a clue about what is happening, because sadly I cannot render on my Mac: the size of the video makes the render times impractical. That is why I built my render farm on Linux in the first place.