Blender not using my memory?

This is my machine:
Corsair CMPSU-550VX 550W

Intel Core i7 920 2.66GHz Socket 1366 Box

Fractal Design Define R2 (Black)

Western Digital Caviar Black WD5001AALS 32MB 500GB

ASRock X58 Extreme

OCZ Platinum XTC LV DDR3 PC12800/1600MHz CL7 4x2GB

Leadtek GeForce WinFast GTX 260 (216SP) Extreme+ Dual-DVI 896MB

It's awesome and all that, but Blender seems to struggle with high-poly scenes: as soon as I go over a million faces it gets laggy.
The render process is really fast, except for the part where it says "preparing scene data", which takes forever. Even so, watching the resource manager, I found that Blender NEVER takes up more than 2-3 gigs of RAM. Considering I bought 8GB purely for making high-poly stuff, that makes me really bummed. Is Blender restricted in how much RAM it can use?

You're not running a 32-bit build/operating system, are you? Those usually cap memory usage at about 3GB.

Are you using 2.49 or 2.53? Which operating system?

I assume it is 64 bit?

When you say Blender gets laggy, do you mean the 3D viewport?
If you are using 2.53 try some different Window Draw Methods.
User Preferences -> System -> Window Draw Method.
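If you prefer to script it, the same preference can be flipped from the Python console. This is just a sketch assuming the 2.5x API (where the property is called window_draw_method), so double-check the name and the available options on your build:

    import bpy

    # Assumed 2.5x Python API: the draw method lives on the system preferences.
    # Typical options include 'AUTOMATIC', 'TRIPLE_BUFFER', 'OVERLAP' and 'FULL'.
    bpy.context.user_preferences.system.window_draw_method = 'TRIPLE_BUFFER'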

As for the 'low' RAM usage, I don't think Blender can use more (or less) than it needs to render the scene. I can render two million polys on 2GB of RAM.

Yes, I'm on a 64-bit system.
Thanks for the tip about the draw method; I think I got it running a bit better.
About the renders, though: the problem is that while "preparing scene data" is displayed, the RAM fills up really, really slowly, stops at 3GB, and then nothing seems to happen for quite some time before the scene finally renders.
I was wondering if there is some kind of limit that can be changed?

If you have a 64-bit OS and run 64-bit Blender, there is nothing to complain about: you can use all the memory in your machine.
Be glad the scene only uses 3GiB; you can start to worry when Blender crashes during rendering because it has run out of memory =)

My record was rendering a scene for a 3 by 6 meter poster at 150dpi, using ~12GB of memory at peak.
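Just to show why a print that size eats memory, here is a rough pixel-count calculation (plain Python, approximate numbers):

    # Rough pixel count for a 3 m x 6 m poster at 150 dpi (1 inch = 2.54 cm)
    width_px  = round(300 / 2.54 * 150)   # about 17,717 px
    height_px = round(600 / 2.54 * 150)   # about 35,433 px
    print(width_px, "x", height_px, "=", round(width_px * height_px / 1e6), "megapixels")  # ~628 MP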

And for high-poly modeling you don't really need lots of memory; you need a powerful graphics card for the realtime viewport rendering.

OK then, I had a feeling I'd gotten something wrong. I guess I'll just have to wait for the renders; after all, even if my rig cost a big buck, it ain't no bigass render farm. Thanks, peeps!

Try baking a fluid simulation at a resolution of 350 or 400.
That eats a ton of RAM, but it's what you need for really realistic results.
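If you want to set that up from a script instead of the UI, something along these lines should work on a 2.5x build; the identifiers here (FLUID_SIMULATION, settings.resolution) are my guess at the Python API, so treat it as a sketch:

    import bpy

    # Assumed 2.5x Python API: add a fluid sim modifier to the active object,
    # make it the domain, and raise the final bake resolution (this is what eats RAM).
    obj = bpy.context.active_object
    bpy.ops.object.modifier_add(type='FLUID_SIMULATION')
    fluid = obj.modifiers[-1]          # the modifier that was just added
    fluid.settings.type = 'DOMAIN'
    fluid.settings.resolution = 400    # try 350-400; 512 may run the machine out of memory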

Wow, yes, that filled up all the memory quite fast; actually, it was instant.
It couldn't handle a resolution of 512 for the smoke sim, though, it died.
But 400 was OK; bake time was a hoe and render time was a hoe.
By hoe I mean this:
http://thebsreport.files.wordpress.com/2009/07/hoe-1272.jpg
The result was ugly so I won't post it, but it proves that Blender can and will use all of my RAM as needed. Thanks, people.

There are various settings, such as “octree resolution,” which you may need to investigate. These control the building of various data-structures which Blender uses to, for example, locate the objects that a particular ray might intersect.
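For example, on the Blender Internal renderer the octree resolution can be raised from the render settings. This snippet assumes the 2.5x Python property octree_resolution (an enum of power-of-two strings), which may differ on your build:

    import bpy

    # Assumed 2.5x Blender Internal API: a finer octree speeds up raytracing on
    # large scenes but costs more memory and a longer "preparing scene data" phase.
    bpy.context.scene.render.octree_resolution = '256'   # typically one of '64', '128', '256', '512'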

Nevertheless… what is euphemistically called "cheating" is in fact a critical survival skill. There are lots of ways to set up a problem that might look stunning if you had an unlimited amount of computer time to throw at it, "again and again and again." But if you break it down, composite the hell out of it, and shoot just for "what looks good enough," you can be rewarded with dramatically faster results.

Yea,
you have 8 gigs of RAM, which is what I have.
About 350-400 is the max you'll get on a fluid sim,
less if you have a lot of particles,
but it makes a really pretty picture :yes:

Edit: BTW, get a build from graphicall.org that has OpenMP enabled.
It will give you at least some degree of parallelism in the bake process, which will speed things up a lot with your multi-core CPU.