memory usage..??...!$#%*

(deathguppie) #1

I have run into a problem where Blender seems to either fail to render or crash, and it seems to come down to memory usage.

For instance, this is an image I rendered in Blender 2.37a. Render time was 4.13 minutes at 1280x1024, and memory usage was a full 1500M. Yes, the grass is rendered. If I try to add any more to this scene, Blender will die a horrible death.

Here is another model I have been working on. It was going to be a robot, but this render took 1200M of memory, and it isn’t even halfway done!

Even if it starts using swap space, I only have an extra 500M there, so it’s going to die whether I like it or not.

I did notice that if I switch out of X during the render it seems to hang on a little longer, but I don’t know if that’s how it should work.

Does anyone else have these problems??

My system is an AMD64 3200 with 1500M of RAM, running Linux in 64-bit mode.

(z3r0 d) #2

are you using subsurf 6 or something?

applications, even on Linux, have a hard limit on how much memory they can allocate. Your limit might be 2 GB; mine is 1 GB [on Windows]. A high polygon count can cause issues like this.
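If you want to see what per-process limits your shell is actually imposing, `ulimit` (a standard bash builtin) will show them; the exact caps depend on your distro and kernel, and a 32-bit process is limited to a few GB of address space regardless of these settings:

```shell
# Show the per-process virtual memory limit for this shell,
# in kilobytes (or "unlimited" if no cap is set)
ulimit -v

# Show all resource limits at once
ulimit -a
```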

(deathguppie) #3

I am using subsurf 4, octree is 512, and OSA is set to 16. I haven’t meddled with the subsurf because that seems optimal, and changing the OSA and octree doesn’t make much of a difference; lowering the resolution makes a slight difference.

One thing I will say is that my render times are nothing like what people talk about. Renders that take other people hours take me only a few minutes. The only problem is that if I run out of memory the whole thing shuts down.


There are thousands of polygons in those renders… so?

Is there a way to get rid of them?

I thought that was how this worked?

Oh… and I’m using the Blender internal renderer; there doesn’t seem to be any advantage to YafRay, and its render times are way too long.

(z3r0 d) #4

subsurf 4 on thousands of polygons results in 4^4 × thousands == hundreds of thousands to millions of faces

… in other words that isn’t much of an issue
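As a quick sanity check on that arithmetic (the 5000-face base mesh here is just a made-up example): each subsurf level splits every quad into four, so level n multiplies the face count by 4^n.

```shell
base_faces=5000   # hypothetical base mesh size
level=4           # subsurf level from the posts above
final=$(( base_faces * 4**level ))   # 4^4 = 256x more faces
echo "$base_faces faces at subsurf $level -> $final faces"
# prints: 5000 faces at subsurf 4 -> 1280000 faces
```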

it appears you are using static particles … that could be an issue… how many are you using?

the octree setting is for ray lamps/reflections; turning it down will use less memory but may render slower. [it isn’t a huge difference in memory usually… at least compared to subsurf]

… pretty much the only things that eat memory are polygons and high subsurf, as far as I know; I don’t play with particles much

(deathguppie) #5

Thanks for your time, z3r0 d, but it seems I have found a solution.

If I render from the command line there is no problem. I don’t know exactly what it is, but something is killing the GUI during the render and crashing Blender when memory usage rises.

It may just be an issue with 64-bit systems, but I would still rather render from the command line, moving larger chunks of memory around, than go back to 32-bit and wait forever for renders.
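For anyone wanting to try the same workaround, a background render looks something like this (`scene.blend` and the output path are placeholders for your own files; `-b`, `-o`, `-F`, and `-f` are standard Blender command-line flags, and the options must come after `-b` and before `-f`):

```shell
# Render frame 1 of scene.blend without opening the GUI.
#   -b  run in background (no interface)
#   -o  output path ("//" means relative to the .blend file)
#   -F  output image format
#   -f  frame number to render
blender -b scene.blend -o //render_ -F PNG -f 1
```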

Actually, if I keep on this route I will probably invest in a dual Opteron. Registered memory, server-style big chunks of data being moved symmetrically… that’s the shiznit for render times…

(deathguppie) #6

Just in case anyone is reading this: the problem I was having was polygon madness…

I was creating UVSpheres, tubes, and the like at whatever size I wanted, then subdividing carelessly… then creating huge .blend files and wondering why my system was sucking up all of its memory…

The robot never got finished… but the last render I did (it was a lot more finished than the pic posted) used 1.5G of RAM and 1.3G of swap, so almost 3G of virtual memory. That’s bad, mkay!

(Marty_D) #7

heh, there’s a reason for that smooth button there in the edit panel. :slight_smile: