Memory limits?

Are there limits to the amount of memory Blender can allocate for calculating data? I mean, assuming your local RAM and swap space were effectively unlimited, so there is no bottleneck there, is there an overall value beyond which Blender actually crashes, or is it free to go as high as you want?

It’s 1.5 GB, so it would be good to have 4 GB if you’re working with Vista.

On a 64-bit Windows machine I can look it up if you need it, but it’s around 20 GB if I remember well.

But in any case, on a 32-bit machine you’re also limited for the render to 10,000 x 10,000 pixels, and with a script this can go up to 20,000 (rough numbers on why are below).
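
To see why resolution and the memory ceiling are tied together, here is a rough back-of-the-envelope calculation (my own numbers for a plain float RGBA buffer, not an official Blender figure):

```python
# Approximate size of a single float RGBA render buffer.
# This ignores tiles, render passes, compositing buffers and all scene data,
# so real peak usage is higher.
def buffer_gib(width, height, channels=4, bytes_per_channel=4):
    return width * height * channels * bytes_per_channel / 2**30

print(round(buffer_gib(10_000, 10_000), 2))  # ~1.49 GiB -- already near the 1.5 GB ceiling
print(round(buffer_gib(20_000, 20_000), 2))  # ~5.96 GiB -- only realistic on 64-bit
```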

Salutations

mmh… I don’t understand the “1.5 GB, so it would be good to have 4 GB if working with Vista” part, but anyway:

Windows —> 1.5 GB

Linux —> 20 GB… really? Can you confirm that? If so, I think I should definitely switch to Linux.

The links below might give you more details.

For Vista: Vista itself needs around 1 GB, plus 1.5 GB for Blender, so that’s the reason to get 4 GB unless you want to have lots of memory swapping to disk!

http://www.blendernation.com/2007/10/13/blenders-1.5-gb-ram-limitation/
http://mpan3.homeip.net/blendermemory
http://en.wiki.mcneel.com/default.aspx/McNeel/LargeAddressAware.html

good luck

Salutations

I just had to render a 9684x3780 image for a roll-up banner with composite nodes…
The 32-bit version crashed…
The 64-bit version used a peak of 15 GiB of RAM.
And it sucks that octree building, scene prep and compositing aren’t using all of my cores :smiley:

And it’s easy… for the ~100th time:
32-bit system: can address 2^32 bytes, i.e. 4 GiB. There is a memory hole just below 4 GiB where other devices are mapped. E.g. if you have a graphics card with 500 MiB of VRAM, the RAM from about 3.5 to 4 GiB cannot be addressed even though it’s there, because that address range is used for the graphics card.
64-bit system: can address 2^64 bytes, which is about 16 exabytes; in practice current boards top out at something around 128 GiB, but you can’t afford those RAM modules anyway ^^
And 32-bit Windows also caps the address space a single program gets (about 2 GB by default, which is why Blender tops out around 1.5 GB)… simplified… explaining it properly would get into multithreading architecture, process structure and memory management.
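
If it helps, here is the address-space arithmetic in one tiny sketch (illustrative numbers only, nothing Blender-specific):

```python
# Address-space arithmetic for 32-bit vs 64-bit pointers.
GiB = 2**30

addr_32 = 2**32  # bytes addressable with a 32-bit pointer
addr_64 = 2**64  # bytes addressable with a 64-bit pointer

print(addr_32 / GiB)    # 4.0 GiB total on 32-bit
print(addr_64 / 2**60)  # 16.0 EiB theoretical on 64-bit

# With ~500 MiB of graphics memory mapped below the 4 GiB line,
# the RAM a 32-bit system can actually reach shrinks to roughly:
print((addr_32 - 500 * 2**20) / GiB)  # ~3.51 GiB
```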

You can use PAE or other tricks to extend the addressable range of 32-bit systems, but that tends to cause instability.

For me one thing is almost certain: the next PC I buy will be 64-bit, with something like 16 or 20 GB of memory.

And hopefully it will be a bit faster, maybe in the 10 GHz range, in about 2 years I guess.

So it should be more fun and able to deal with more complicated scenes with lots of objects.
Hey, the future is bright for Blender!

Salutations

It’s actually quite hard to still find a 32-bit processor anywhere nowadays.
And current boards support 8 GiB of memory, some support 16 GiB… if you want 20 GiB you have to look to the high-end server corner of the IT market…

My computer currently has a total of about 11.2 GHz summed across its cores.
According to Moore’s law with a 12-month doubling period, in 2 years that would be roughly 45 GHz; with the 18-month doubling that I think is more realistic nowadays, it would be more like 28 GHz, and that leaves aside the fact that it will surely scale even more due to many-core processors.
I guess in 2 years the average desktop PC will have 70-100 GHz of aggregate power, not including the FPU power of “GPU-architecture-supported CPUs”.
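
And just for fun, the doubling arithmetic spelled out (assuming “GHz” here means clock speed summed over all cores, which is my reading, and that the doubling actually continues):

```python
# Extrapolate aggregate clock speed under an assumed doubling period.
def extrapolate(current_ghz, months_ahead, doubling_months):
    return current_ghz * 2 ** (months_ahead / doubling_months)

current = 11.2  # GHz summed over all cores (assumption)

print(round(extrapolate(current, 24, 12), 1))  # ~44.8 GHz with 12-month doubling
print(round(extrapolate(current, 24, 18), 1))  # ~28.2 GHz with 18-month doubling
```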