Blender Crash During Compositing ... Low Memory Survival Guide Needed

Hello.
Thanks for reading.

version 2.58 (sub 1), revision 38019, Release
build date: 1, 21:51:25
platform: Windows 64-bit
RAM: 4 GB
Vertices 704,000 … Faces 680,000 … Objects 390 …
Memory 138M (7.38)
(Does Blender output this data as text somewhere, so I don't have to clumsily retype it?)
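For what it's worth, a short script in the Text Editor can print most of it. A sketch, assuming the 2.5x Python API (where faces are mesh.faces rather than the later mesh.polygons):

    # Print version and scene statistics to the system console,
    # ready to paste into a forum post.
    import bpy

    scene = bpy.context.scene
    verts = sum(len(m.vertices) for m in bpy.data.meshes)
    faces = sum(len(m.faces) for m in bpy.data.meshes)  # mesh.polygons in 2.63+

    print("Blender %d.%d (sub %d), revision %s"
          % (bpy.app.version + (bpy.app.build_revision,)))
    print("Objects: %d  Vertices: %d  Faces: %d"
          % (len(scene.objects), verts, faces))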

During compositing it appears that Blender:

  1. May use a stale Render Layer image, with no warning.
  2. May crash outright, as reported by the OS.

I get a crash on the second frame, and worse, a crash after about 170 frames.
In the case of frame 170 that is much more of a waste of time.
The target type is movie … AVI codec.
I think I got quicker failures with an image sequence.

Several related questions:
I assume I have low-memory problems, from casual observation of Task Manager.
Does Blender warn of low-memory problems?
What are some techniques to remedy low-memory problems in Blender?
Your ideas, comments, and criticisms are welcome.

Here are some things I have tried.

Standard good review and practice of some nodes (perhaps).
Render Performance panel … Free Unused Nodes (see the script sketch after this list).
Include less pass information (Object ID, Depth) if possible.
Use particle systems for duplication of objects if possible.
Shut down other apps on the machine if possible.
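Most of those panel options can also be set from a script, so they are not forgotten between sessions. A sketch, assuming the 2.5x property names (check the tooltips/API docs for your build):

    # Enable the memory-saving options from Render > Performance,
    # and drop the heavy passes. Property names per the 2.5x API.
    import bpy

    rd = bpy.context.scene.render
    rd.use_free_unused_nodes = True     # free compositor nodes not in use
    rd.use_free_image_textures = True   # free image textures after render
    rd.use_instances = True             # treat duplicates as instances

    rl = rd.layers[0]                   # first render layer
    rl.use_pass_object_index = False    # drop the Object ID pass
    rl.use_pass_z = False               # drop the Depth (Z) pass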

Temporary Triage
Reduce the render resolution percentage for draft versions … 100% → 25%.
Eventually I must go back to 100%.
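The triage can be scripted too, which makes the switch back harder to forget (resolution_percentage is a standard RenderSettings property):

    import bpy

    rd = bpy.context.scene.render
    rd.resolution_percentage = 25   # draft quality while hunting the crash
    # ...once the node tree survives a full run:
    # rd.resolution_percentage = 100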

You’re on the right track. Alt+D will instance objects, reducing memory usage. One of the problems with memory usage is OpenEXR: it converts everything to 32 bits per channel. E.g. an 8-bit-per-channel RGB .jpg will be converted to 32-bit-per-channel RGBA when imported into Blender. I asked a developer, several years ago, about adding an option to truncate this to 16 bits per channel in order to resolve some of these memory issues, since that is more than adequate for most compositing needs. I was told “no way”, since it would basically require a recode of nearly everything in Blender to choose between the 16- and 32-bit formats.
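The blow-up is easy to quantify; a back-of-the-envelope sketch for a single 1920×1080 frame (uncompressed buffer sizes; the .jpg on disk is smaller still):

    # Memory footprint of one HD frame inside the compositor.
    w, h = 1920, 1080

    rgb_8bit   = w * h * 3 * 1   # 8 bits/channel, 3 channels
    rgba_float = w * h * 4 * 4   # 32 bits/channel, 4 channels

    print("8-bit RGB buffer : %.1f MB" % (rgb_8bit / 2**20))    # ~5.9 MB
    print("float RGBA buffer: %.1f MB" % (rgba_float / 2**20))  # ~31.6 MB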

All of this can be particularly aggravating when trying to work with high-def images in a compressed format like .png at 5–10 MB each, chosen precisely to limit memory issues, only to find that Blender comes down like a house of cards when you try to stack up more than a few of them in the compositor.

I have no idea why Blender has such memory issues when other compositing/imaging packages have, for many years, used disk caching to work with hundreds of megabytes. I think After Effects can even work with images of up to 30,000 × 30,000 pixels. Try that in Blender and watch what happens.
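Run the same arithmetic at that size and the need for disk caching is obvious:

    w = h = 30000
    rgba_float = w * h * 4 * 4                 # 32-bit float RGBA
    print("%.1f GB" % (rgba_float / 2**30))    # ~13.4 GB, versus 4 GB of RAM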

This is why Blender doesn’t have an IMAX preset.

I wonder if the compositor recode addresses this issue? Blender has always had memory-caching issues, with different tools using different techniques.

The target type is movie … AVI codec.
Perhaps this is the culprit? Have you tried just rendering to an image sequence? What is your output resolution? Some AVI codecs only work with certain output resolutions.

If you have a dimension mismatch (the codec spits the dummy), the render will generally hang immediately.
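One quick check along those lines: many AVI codecs want frame dimensions divisible by 4, some by 16. A sketch that snaps the output down to a safe multiple (the divisor is an assumption; check your codec):

    import bpy

    rd = bpy.context.scene.render
    mult = 16                                 # assumed codec alignment
    rd.resolution_x -= rd.resolution_x % mult
    rd.resolution_y -= rd.resolution_y % mult
    # note: resolution_percentage scales these dimensions further
    print("Output size:", rd.resolution_x, "x", rd.resolution_y)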

Atom
I did try an image sequence at one point … those also crashed.

Thanks for all the responses.
I kept the main effect nodes.
Just to get a render, I removed many nodes which had subtle stylistic effects.
Thus I reduced the node count.
Since I do NOT see

  1. an obvious explicit comment capability,
  2. compile-time execution-path selection (an “if”), or
  3. a “keep dead node as comment” feature,

I did lose some information in that particular node set. (Though see the muting sketch below for a partial workaround.)
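On points 1 and 3: nodes can be muted (M over the node) rather than deleted, which keeps them in the tree as documentation, and node labels can carry a short comment. A sketch, assuming the nodes to “comment out” are selected first (node.label may not exist in very old builds):

    # Mute the selected compositor nodes instead of deleting them.
    import bpy

    tree = bpy.context.scene.node_tree   # the scene's compositing tree
    for node in tree.nodes:
        if node.select:
            node.mute = True             # disabled, but kept in the file
            node.label = "disabled: subtle stylistic pass"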

I wonder what would happen if you routed successive node scenes through each other? That is, duplicate the original large node group into another scene, then delete half the nodes from each. The only thing is, I’m not sure you can get render results from one scene into another? (See the sketch below.)

But this experiment might indicate how memory is used within a scene or node window.
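For what it’s worth, the Render Layers node does have a scene selector, so one scene’s compositor should be able to pull in another scene’s render result. A sketch (scene names are hypothetical; very old builds spell the node type as 'R_LAYERS' rather than the bl_idname):

    import bpy

    comp = bpy.data.scenes["Composite"]       # hypothetical target scene
    comp.use_nodes = True

    rl = comp.node_tree.nodes.new("CompositorNodeRLayers")
    rl.scene = bpy.data.scenes["HeavyNodes"]  # hypothetical source scene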

Your workflow might have to involve many intermediate steps, each of which reads from one set of MultiLayer files and outputs to another set. Yes, sometimes it can be awkward. (Being a computer jock, I use Unix makefiles.)
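A sketch of that staged pipeline driven from Python instead of make (file names and the frame range are placeholders; -b/-s/-e/-a are Blender’s standard background-render flags):

    # Render each compositing stage as a separate background process,
    # so all memory is returned to the OS between stages. Each .blend
    # reads the previous stage's MultiLayer EXRs and writes its own.
    import subprocess

    STAGES = ["stage1_keys.blend", "stage2_grade.blend", "stage3_final.blend"]
    START, END = 1, 170

    for blend in STAGES:
        subprocess.check_call([
            "blender", "-b", blend,   # -b: run without a UI
            "-s", str(START),         # start frame
            "-e", str(END),           # end frame
            "-a",                     # render the animation
        ])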