Maximum number of faces at render? 20,000,000 causing crash!

I’ve been trying to render a scene that has 20,000,000 faces, but Blender crashes when I try to render. (2.37a Win XP 64 bit 4GB ram dual Opteron 244’s.)

I tried with 10,000,000 faces but it still crashes.

Is there an upper limit?



Typically, Blender will crash if you don’t have enough RAM. But I tried to work out how many faces it would take to use the amount of RAM you have, and in theory you should be able to render it with no problem. So there is probably some kind of limit in Blender on the amount of faces you can have, rather than simply a RAM limitation.

However, I haven’t taken into account material properties, textures, or any of the weird stuff that goes into rendering calculations and needs RAM. So even though the amount of vertex memory needed is less than 1 GB, there’s still everything else required to render all those faces. Check Blender’s memory usage when you try to render and see if that’s the problem.

Why you want to render a scene with 20 million faces is beyond me.

Blender’s memory usage in the main window is 340 MB; no idea what it is at render, as the render window opens but then crashes before any info is displayed.

Scene has no materials, no textures, one sun lamp with ray shadows & no AO. Samples at 5.

Just lots of dupliverted Suzannes.


It’s purely a system benchmark test. I’m trying to see what my upper limits for modelling and rendering are.

Was quite surprised that it handles 20,000,000 faces, tbh, but there’s only marginal fps slowdown in the 3D windows whilst panning, with the dupliverted objects shown as mesh. No slowdown at all when displayed as bounding boxes.


Well, how big is that scene, dimensions-wise? I’ve had issues with massive scenes in the past; anything past 10,000 units from the centre and Blender slows way down.

There is a 2 GB limit for Blender; dunno if it applies to you in 64-bit. If you have a lot of objects, that may explain the bug though; try joining all the objects.

Fonix, I think I had to extend the viewport clipping to 1000, so you might be right. I’ll do another test today and see.

Gabio, what does the 2 GB limit mean? I’d need to convert all duplis to real to join them up. I’ll have another test with this method.

Thanks, any more suggestions, advice?


Actually I think you are running out of ram.

What is happening is that the render code is really “dumb” right now (i.e. it doesn’t do any fancy tricks to reduce memory usage; not meant pejoratively) and is making every face real prior to rendering (discovered via experiment).

Also the renderer setup takes 1.17 times the memory of the real mesh.

So the combined memory requirement is 2.17× the memory requirement of the objects if they were real mesh.

Which comes out to roughly 4 GB in the 10,000,000-face case.
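To put numbers on that, here’s a quick sketch; the bytes-per-face figure is purely an assumption, chosen so that 10,000,000 faces land near the ~4 GB reported, and the 2.17× overhead is the figure from above:

```python
# Rough render-memory estimate, assuming the renderer makes every face
# real and its setup costs 1.17x the mesh, i.e. 2.17x total.
# BYTES_PER_FACE is a hypothetical figure picked to match the thread's
# ~4 GB for 10,000,000 faces; it is not a measured Blender value.
BYTES_PER_FACE = 198       # assumed base-mesh cost per face
RENDER_OVERHEAD = 2.17     # 1.0 (real mesh) + 1.17 (renderer setup)

def render_memory_gb(faces):
    """Estimated peak render memory in GB for a given face count."""
    return faces * BYTES_PER_FACE * RENDER_OVERHEAD / (1024 ** 3)

print(round(render_memory_gb(10_000_000), 2))  # prints 4.0
```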


In Windows XP (32-bit) one process can take a maximum of 2 GB of RAM. This doesn’t apply to Win XP 64-bit, which has no limit (or a 4 GB limit… I don’t remember :P).
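For what it’s worth, the 2 GB figure falls out of the 32-bit address-space split, which you can sanity-check in one line:

```python
# Of the 4 GB a 32-bit pointer can address, Windows by default reserves
# half for the kernel, leaving 2 GB of user address space per process.
user_space_gb = 2 ** 32 / 2 / 1024 ** 3
print(user_space_gb)  # prints 2.0
```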

Just a thought you might consider. I had big troubles with Blender crashing, and later found out that it was a hardware memory error that was not picked up by the BIOS memory test or other applications (because they mostly never used that much RAM). I finally downloaded a DOS-based boot disk and ran a memory test, to find I had a dud chip. It took me a while to cotton on to the problem, because only Blender was crashing, so I wrongly assumed it was Blender that was the problem.



Your problem could also be the fact that you’re using WinXP. It does memory management strangely.

Might want to, if you haven’t already, try setting your swapfile to a fixed size (generally double to triple the available RAM, or 8–12 GB for you – don’t let Windows manage the swapfile size). I’d also recommend putting the swapfile on a drive other than your C drive; this does make a big difference sometimes.

The other thing to check would be whether Blender still crashes under Linux on your machine; you could test that easily with a boot disc from the Knoppix distro.

Thank you all for your help.

I checked the scene size today, and it’s 4 times as large as an unmodified cube primitive, so not huge.

I decimated Suzanne to 1.4 of the faces I had and managed to render. There were only 200,000 faces though. When I then applied level 1 Sub Surf, it crashed at render.

Letterip, I think you’ve hit the nail on the head. Can I render from the command line, to force Blender to use the hard drive as well as the ram?

For Maxwell render there’s a memory bypass switch which only uses the HD to render. (-hd instead of -d)

In theory, I should be able to render approx 8-9,000,000 faces before Blender crashes at render, if 10,000,000 gives a memory requirement of 4GB. What do you think?
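Under letterip’s figures, the back-of-the-envelope limit comes out about where you say. A sketch; the 4 GB-per-10,000,000-faces ratio is from this thread, and the 0.5 GB headroom for the OS and Blender’s UI is my own assumption:

```python
# If 10,000,000 faces need ~4 GB at render time, estimate the largest
# face count that fits in 4 GB of physical RAM, leaving some headroom
# for the OS and Blender itself (the 0.5 GB figure is an assumption).
GB_PER_10M_FACES = 4.0
bytes_per_face = GB_PER_10M_FACES * 1024 ** 3 / 10_000_000

ram_gb = 4.0
headroom_gb = 0.5
budget = (ram_gb - headroom_gb) * 1024 ** 3

max_faces = int(budget / bytes_per_face)
print(f"{max_faces:,}")  # ~8,750,000 faces
```

Which lands in the 8–9 million range suggested above.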



[quote]I decimated Suzanne to 1.4 of the faces I had and managed to render. There were only 200,000 faces though. When I then applied level 1 Sub Surf, it crashed at render.[/quote]

How many Suzanne duplicates? Each subdivided one will give about 800,000 faces as quads, or if it’s tris, then about 600,000 faces per Suzanne. Confirmed with Ton that it is in fact converting everything to polys; see the bug tracker for details. I left him links on bucket-based rendering, which, if implemented, would allow billions of polys.
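The face counts grow so fast because Catmull-Clark subdivision roughly quadruples the quad count per level. A sketch, taking Suzanne’s base count as roughly 500 faces (an approximate figure):

```python
# Catmull-Clark subdivision roughly multiplies the quad count by 4 at
# each level, so counts explode quickly.  Suzanne's ~500 base faces is
# an approximate figure used for illustration.
def subdivided_faces(base_faces, levels):
    """Approximate quad count after `levels` of subdivision."""
    return base_faces * 4 ** levels

for level in range(6):
    print(level, subdivided_faces(500, level))
```

By level 5 a single ~500-face Suzanne is already past half a million quads, in the ballpark of the figures above.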

I agree that 8 million faces would probably be okay (depending on how much memory textures are taking). Are you sure it is crashing? Or is it just swapping forever (and thus more of a ‘freeze’)?



[quote]In Windows XP (32-bit) one process can take a maximum of 2 GB of RAM. This doesn’t apply to Win XP 64-bit, which has no limit (or a 4 GB limit… I don’t remember :P).[/quote]
I think in XP 64-bit the limit is 16 GB of RAM total, with 8 GB per process, but that’s only with the special 64-bit version (you have that, right?). So I don’t think that this is a RAM issue. Just another reason we need a render core rewrite!

Without having the scene open in front of me, I think the plane I used for the dupliverts had either 16k faces or 66k faces. I think it would be 16k though.

Saying that, 16,000 Suzannes is a lot of faces: 16,000 × 500 = 8,000,000.

I had a Suzanne with 2,000 faces, as I subdivided once instead of using Sub Surf.

So with that, 16,000 × 2,000 = 32,000,000 faces at render. Hmm, that’s rather more than the 20,000,000 I had.

OK, so it’s going to be 20,000,000 / 2,000 = 10,000 Suzannes, which means my dupli plane has 10,000 faces.
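The arithmetic above, as a quick sanity check (just the thread’s own figures: 2,000 faces per once-subdivided Suzanne, 20,000,000 total):

```python
# Working back from the total face count: with dupliverts, the number
# of Suzanne copies matches the dupli plane, so
#   total faces = Suzanne count x faces per Suzanne.
faces_per_suzanne = 2_000     # Suzanne subdivided once (figure from above)
total_faces = 20_000_000

suzanne_count = total_faces // faces_per_suzanne
print(suzanne_count)  # prints 10000 -> a 10,000-face dupli plane
```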

I’ve not used any texture data yet, so that’s not an issue yet.

It crashes like this: I press render and the render window opens. Before the first sample level is calculated, I get a Windows program-crash message, with the Bill Gates send/don’t send option. Either option closes Blender.

I’ll copy out the error report next time I try it, if that would be of help.


This is why I went straight to 64-bit. I can bolt another 4 GB of RAM into the system without any issues. Max I can do on my current board is two 4 GB banks, one bank for each CPU.


What would be best is to run a debug build under a debugger; then you can get a backtrace.


I don’t have one of those, mate.



Do you Render by parts?

For my own part, I have a computer with 1.5 GB and two others with 2 GB of RAM, but when I render scenes requiring about 700 MB of RAM (about 3,000,000 faces) I start to render by parts, in order to keep the memory needed under 700–1000 MB (half of my available memory).

This said, I have rarely passed a face count of 5,000,000 faces!
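The reason parts help is that the render buffers only need to hold one tile at a time. A rough sketch; the assumption that buffer memory scales with 1/(xparts × yparts), and the 16 bytes/pixel float-RGBA figure, are mine for illustration (geometry memory is unaffected by parts):

```python
# Rough sketch of why rendering by parts reduces peak memory: the render
# buffers hold one tile at a time, so their memory scales with tile area.
# The 1/(xparts * yparts) scaling and 16 bytes/pixel are assumptions.
def buffer_memory_mb(width, height, bytes_per_pixel, xparts, yparts):
    """Approximate per-tile render-buffer memory in MB."""
    tile_pixels = (width / xparts) * (height / yparts)
    return tile_pixels * bytes_per_pixel / (1024 ** 2)

# Whole frame vs 8x8 parts at 1920x1080, 16 bytes/pixel (float RGBA):
print(buffer_memory_mb(1920, 1080, 16, 1, 1))  # ~31.6 MB whole frame
print(buffer_memory_mb(1920, 1080, 16, 8, 8))  # ~0.5 MB per tile
```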

I have done the latest animation renderings with 2.40 alpha 2 and have encountered crashes at random after a variable number of frames, but this may be another problem…



Indeed, render by parts seems to greatly increase the number of polys you can render.

Just tested 20,000,000 faces on my puny-RAM laptop with 8 × 8 parts and it worked fine (still swapped a little bit…).