64 GB RAM as system memory

Has anyone ever considered building a system with this much RAM before?

If you have, share your experience.

Which parts of Blender are most likely to use this much RAM? Imagine the massive water and smoke simulations we could do. Or is Blender far more efficient in using our systems' resources?

Is Blender optimized to use all the RAM in the system?

While on the subject, what's happened to implementing…

Is it about time for Blender to increase its memory limit from 8 GB to maybe 16 GB or beyond (unless this is already done or being worked on), in order to match today's memory capacities? Should this implementation come with the new sculpting branches?

I have 48 GB, and a 64-bit build of Blender has no problem at all using all that memory. I have tested scenes that take up 38 GB and more. Copying 8 GB of mesh data takes a couple of seconds (understandably), and you have to be careful how you set up the view when dealing with scenes of 100 million polys or more (the OpenGL viewport may crash in wireframe/solid mode).
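A rough back-of-envelope calculation shows why scenes like that eat memory so fast. The per-vertex cost and copy count below are hypothetical illustration figures, not Blender's actual internal numbers (real meshes also store normals, edges, faces, UVs, and so on):

```python
# Rough estimate of memory for a 100-million-vertex mesh.
# All figures here are assumptions for illustration only.
verts = 100_000_000
bytes_per_vert = 3 * 4   # three 32-bit floats for the position alone
copies = 3               # assumed: base mesh + undo copy + viewport buffers
gib = verts * bytes_per_vert * copies / 2**30
print(f"~{gib:.1f} GiB just for vertex positions")
```

And that is before modifiers, textures, or simulation caches come into it, so tens of GB disappear quickly.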

And I do not understand your comment about Blender's “memory limit” of 8 GB? There is no such limit, as far as I know?

I’ve definitely considered getting a lot of memory in the new system I am going to build. Maybe not quite that much.

@Herbert123: How did you manage to get your i7-920 to play with 48 GB? Intel's website says it can only address 24 GB. (I realize that's not always the truth, but it does seem heavily dependent on the memory vendor, etc.)

There is no 8 GB limit. I have 12 GB here, and it can definitely all be used.

Sort of an accident: I planned for 24 GB, and purchased 24 GB in 3×8 GB modules (while I should have gotten 6×4 GB). Although “officially” 8 GB modules were not supposed to work on my main board, they actually did.

Next, I purchased a second batch, just for testing purposes and out of curiosity: an obscure reference by a Japanese guy mentioned that 48 GB seemed to work on an Asus P6T Deluxe SAS board. And it does! :) I read it might depend on the individual X58 board, and some people only got 32 GB to work.

Note from another thread on HardForum:

Some dual-1366 boards officially support 288 GB of memory, and I doubt there is that much difference in memory controller design between different 1366 processors, other than some crippling being set differently (e.g. disabling support for registered memory and ECC).

I suspect the reason it isn't officially supported is that 1366 was being phased out just as 8 GB desktop modules were coming onto the market, so Intel and their partners didn't think it worth officially testing and qualifying the configuration.

32-bit Windows programs, I think, have a 2 GB limit, which usually makes Blender crawl or stop at 1.5–1.8 GB. I had a workstation a few years ago that continually crashed Blender on large scenes. But that is all old news: 64-bit OSes have none of these issues.
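The 32-bit vs 64-bit difference is easy to check from Python (and since Blender embeds Python, this also works in its console). A sketch, assuming a CPython interpreter:

```python
import struct
import sys

# The size of a pointer ("P") tells you whether this process is 32- or 64-bit.
bits = struct.calcsize("P") * 8
print(f"{bits}-bit Python, sys.maxsize = {sys.maxsize}")

if bits == 32:
    # A 32-bit process has a 4 GiB address space in total, and classic
    # 32-bit Windows reserved 2 GiB of that for the kernel by default --
    # hence Blender stalling around 1.5-1.8 GB.
    print("user address space limited to ~2 GiB on default 32-bit Windows")
```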

That would be awesome, but the only problem is you couldn't have a six double-core i7.

Hi, 64 GB would definitely be best for simulations, and of some use for rendering. Chances are, though, that you won't be doing renders complex enough to take full advantage of the 64 GB.

Slightly off topic: has anyone tried zram on Linux? Curious whether you could do much larger sculpting/simulations with it enabled.
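For anyone who wants to try, a minimal zram swap setup on Linux looks roughly like this (run as root; the size is illustrative and availability depends on your kernel having the zram module):

```shell
# Load the zram module and create a compressed in-RAM swap device.
modprobe zram
zramctl --find --size 16G         # allocates and prints a device, e.g. /dev/zram0
mkswap /dev/zram0
swapon --priority 100 /dev/zram0  # higher priority than disk swap
swapon --show                     # verify the new swap device is active
```

It trades CPU time for compression, so whether a huge sim gets faster or just less crash-prone depends on how compressible the data is.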

That actually surprised me the other day: the fact that CPUs actually have a limit for RAM. I should know such things, hehe, having worked a lot with hardware over the years. I always thought it was the chipsets that were the brake when it came to RAM… Weird, the things you miss. ;D

Well, as Herbert123 clearly proved, it's not a hard limit. You do have to be lucky when it comes to your particular hardware components, though. :(

Well, there are chipsets that allow more than 16 GB RAM on socket 1366, but there aren't any on socket 1156, which I run in this box. So for me, I'd have to switch motherboard + CPU to run more. But I don't panic; I did 3D before we had even 1 GB in our machines, and there are always workarounds, hehe… ;D

What optimizations do you use, or is this down to OpenGL limitations? My viewport slows down. The most my Blender (2.61 from blender.org) uses is 7 GB, after which the Blender window blacked out; luckily I had the Blender console running.
PS: I'm using 64-bit Windows 7 Home Premium.
I used to compile my own Blender on Ubuntu Linux, until PyOpenCL was installed, which in turn broke my Ubuntu system.

Did you do this to get custom patches, or just to keep up with SVN?

If it was the latter, I would like to suggest this PPA for Ubuntu : https://launchpad.net/~cheleb/+archive/blender-svn

Pretty much nightly builds of Blender SVN.

Are these builds optimized?

I honestly don’t know which options they were compiled with, but I did an optimized compile a while back and didn’t see any noticeable difference between my build and the one from this PPA. Of course, that could have changed since. The only thing is this PPA doesn’t have CUDA support built in, only OpenCL, as far as I can tell.

How can I determine whether my version of Blender is statically or dynamically linked, or is this no longer the case?
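For what it's worth, on Linux you can check with `file` and `ldd`. Using `/bin/ls` as a stand-in path below; point them at your actual blender binary:

```shell
# 'file' reports "statically linked" or "dynamically linked";
# 'ldd' lists the shared libraries a dynamic binary pulls in.
BIN=/bin/ls    # substitute the path to your blender executable
file "$BIN"
ldd "$BIN" | head -n 5
```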

jemalloc is used for our official Linux releases (FreeBSD has it by default);
someone still needs to get this working for Windows and OS X.
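On Linux you can also try jemalloc with a build that wasn't linked against it, via `LD_PRELOAD`. The library path below is a typical Debian/Ubuntu location and is an assumption; adjust it for your distro:

```shell
# Preload jemalloc for one Blender session without rebuilding.
# Path is distro-dependent -- locate yours with: ldconfig -p | grep jemalloc
LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libjemalloc.so.2 blender
```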

This is off topic, but would a solid-state drive speed Blender up any?

64 GB of RAM would allow lifelike water simulations.
I have 32 GB of RAM, with which I did a water sim in Blender; however, it didn't use all of it.
I totally forgot how much was used, but it may have been over 10 GB.
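If you want a number instead of guessing afterwards, the standard `resource` module can report a process's peak resident memory on Linux (where the figure is in kilobytes). A small sketch; the list allocation stands in for simulation data:

```python
import resource

def peak_rss_mib():
    # ru_maxrss is in KiB on Linux (bytes on macOS).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024

before = peak_rss_mib()
data = [0.0] * 10_000_000   # ~80 MB of pointers, stand-in for sim data
print(f"peak RSS: {peak_rss_mib():.0f} MiB (was {before:.0f} MiB)")
```

Run inside Blender's Python console, it tells you the real high-water mark of a bake rather than a half-remembered figure.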

Here was the result; I rendered it in Octane:


Do you use Blender on Linux? If you do, do you notice any difference in the memory usage? Since which version of Blender has this been the case?

@everyone on this post: are you all using Linux? Are you compiling your own, or just downloading the latest release from GraphicAll?
If you're compiling your own, which settings in SCons/CMake are you using?
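For comparison, my own CMake invocation looks roughly like this. It's a sketch: the option names are assumptions that vary between Blender versions, so check `CMakeCache.txt` in your build directory for the real list:

```shell
# Out-of-source CMake build of Blender (adjust paths and options).
mkdir -p build && cd build
cmake ../blender \
    -DCMAKE_BUILD_TYPE=Release \
    -DWITH_CYCLES_CUDA_BINARIES=ON   # bake CUDA kernels into the build
make -j"$(nproc)"
```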