How much RAM for 30M tris?

I wanted to do some operations on a 30M-tri object with 16 GB of RAM, and it fills up fast.
Operations like import/export, applying decimation with modifiers, and such.
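
For reference, by "applying decimation" I mean the usual modifier workflow, something like this bpy sketch (the ratio is just an example value):

```python
import bpy

# Add a Decimate modifier to the active object and apply it.
# The 0.5 ratio is only an example value.
obj = bpy.context.active_object

dec = obj.modifiers.new(name="Decimate", type='DECIMATE')
dec.ratio = 0.5  # keep roughly half of the faces

# Applying evaluates the full mesh, which is where RAM fills up.
bpy.ops.object.modifier_apply(modifier=dec.name)
```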

My question is for people with 32 GB or 64 GB of RAM: does 30M tris fill those up as well, or are they a safe bet to invest in?

If your system can accommodate 64 GB of memory and you can afford it, max it out.

Just to be sure, I made a quick cube-derived model, subdivided it to 80k faces, then multires'd it to 21M quads (or twice that if we count the tris) and applied the modifier. Sculpt mode and object mode were still reasonable.

But as soon as I tabbed into edit mode, sure enough it shot to the moon: it used up all 24 GB of memory I have and still didn't have enough.
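
Roughly this, as a bpy sketch (the cut and level counts are approximations I picked to land near those numbers):

```python
import bpy

# Build the test model: a cube subdivided to roughly 80-90k quads, then
# pushed to ~21-22M quads with four Multires levels (each level
# quadruples the face count).
bpy.ops.mesh.primitive_cube_add()
obj = bpy.context.active_object

bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.subdivide(number_cuts=10)  # 6 * 11^2 = 726 faces
bpy.ops.mesh.subdivide(number_cuts=10)  # 726 * 11^2 ≈ 88k faces
bpy.ops.object.mode_set(mode='OBJECT')

multires = obj.modifiers.new(name="Multires", type='MULTIRES')
for _ in range(4):  # 88k * 4^4 ≈ 22M quads
    bpy.ops.object.multires_subdivide(modifier=multires.name, mode='CATMULL_CLARK')
```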


Did some tests with my LTS build (2.93.4): RAM still fills up with 8M verts.

BUT this is not true in the Blender 3.0 alpha. It leaves about 2 GB of breathing room (on 16 GB of RAM).
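
If anyone wants to repeat this on their own build, Blender can report its own memory use. This sketch prints the same statistics line the status bar shows, so numbers can be compared between versions:

```python
import bpy

# Print Blender's statistics string (vert/face counts and memory use)
# after each test step.
stats = bpy.context.scene.statistics(bpy.context.view_layer)
print(stats)
```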

@Felix_Kutt Thanks for the test and advice. :+1: I think I'll replace my 2x8 GB of RAM with 2x16 GB, so in the future I could easily upgrade to 4x16 GB. I think that is a safe bet.

Still, I would love to see some tests done on 32 and 64 GB of RAM.

I have 64 GB. In my current project I have a few objects that are over 10 million polys in sculpt mode (maybe close to 30 million), and it uses under 50%. I'll double-check this evening.

Rendering all of that is a nightmare though.

No need to replace them with the 2x16. Since only 2 of your 4 slots are currently used, buying 2x16 and putting them next to the existing 2x8 is just fine. That gives you 48 GB until you do your second upgrade later.


Doesn’t he need to make sure the timings between the RAM sticks are compatible?

No, that's not needed to make it work. You might not get the few-percent performance gain of the new sticks' tighter timings, but the additional 32 GB will be much more helpful.


I have noticed something else too: when the objects are selected, things grind to a halt, though when deselected they are manageable. It seems strange that selection makes such a huge difference.

Also, when you are in the camera view things become very heavy. This too remains a question.

And all of this on 32 GB of RAM. Isn't that strange?

There is a problem in Blender regarding heavy meshes. It is a problem that must be prioritized by the developers.


I was talking about stability. You can have stability issues if you run RAM out of spec.

I did not know that, thanks.

@Chuk_Chuk I can underclock all the RAM in my setup to match the weakest stick, so that should be fine?

So you suggest I should hide the mesh and then apply modifiers? Is that even possible? I'll have to test that.
Edit: Hiding it did not change much, but disabling overlays helped a bit before applying decimation on 12M tris.
Disabling the viewport option doesn't work for adding modifiers, but it does let you apply them, and there was less choppiness.
Thanks for the tip.
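
In script form, what I tried looks roughly like this (a hedged sketch; depending on the Blender version, applying with the display still off may be rejected, so I switch it back on first):

```python
import bpy

# Keep a heavy modifier's viewport display off while setting it up, so
# the viewport never evaluates the result, then re-enable it to apply.
obj = bpy.context.active_object

dec = obj.modifiers.new(name="Decimate", type='DECIMATE')
dec.show_viewport = False  # no viewport evaluation while tweaking
dec.ratio = 0.1

dec.show_viewport = True   # some versions refuse to apply while disabled
bpy.ops.object.modifier_apply(modifier=dec.name)
```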

It is both RAM speed and CAS timings that need to match up.

I can't guarantee that just running at a slower speed is sufficient. This is more of an FYI, in case you do notice stability issues. Worst comes to worst, you pull out the older sticks and just run with the new ones.


Thanks, I'll keep that in mind.

Don't expect too much of a problem here. Yes, for completeness's sake, you might theoretically run into such issues, but it's not that likely. And if it did happen, you would still have some options to tweak things; for now, it's just a matter of buying the RAM and testing it together.

Don't forget to check your mainboard's manual for supported RAM types.


The lower speed and/or timings between the two sets are used automatically. Especially on Intel systems this should be no significant issue, and even on AMD the difference will be fairly marginal.

Edit: I wanted to clarify that for workloads such as rendering, capacity is more important; speed and timings aren't as much.


So I just checked with my sculpt. Upon loading it, Blender uses around 13 GB of RAM. The RAM requirement does rise as I sculpt, due to saving information for the undo history; it got up to 17 GB in my short test.

For reference, my model has 7 objects, and three of them are above 30 million polygons.
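
If the undo growth becomes a problem, the undo history can be capped in Preferences (System > Memory & Limits), or via a sketch like this; the values here are arbitrary examples:

```python
import bpy

# Cap how much RAM the undo history may consume while sculpting.
prefs = bpy.context.preferences.edit
prefs.undo_steps = 8            # keep fewer undo steps
prefs.undo_memory_limit = 4096  # limit in megabytes (0 = unlimited)
```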


From my experience, dealing with extremely big meshes can be very different depending on what you are actually doing with them. I have two examples for you, @Weekend.

I have one scene with a single 3.7M-poly mesh with 3 levels of multires on top of it. After opening, the scene takes around 36 GB of RAM whether or not the modifier is enabled in the viewport. After enabling the viewport preview for the second level of multires, the counter shows around 19.1M faces. Unfortunately I can't tell how much the third level takes, because Blender can't handle it and crashes. If I want to render this scene, I need at least 128 GB of RAM and another 128 GB of swap. During render preprocessing, the RAM usage spikes substantially above the physical RAM. This scene takes 3.8 GB of disk space with compression; without it, it's more like 7-8 GB.
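
For anyone trying the same per-level test, the preview is just the multires modifier's viewport level; a sketch, assuming the modifier is named "Multires":

```python
import bpy

# Step the Multires preview level in the viewport while keeping the full
# level for rendering. "Multires" is a placeholder modifier name.
mod = bpy.context.active_object.modifiers["Multires"]
mod.levels = 2         # viewport preview level
mod.render_levels = 3  # level used at render time
```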

Second example: a scene that takes only 30 MB of disk space but, after opening, 14 GB of RAM. In this scene I have two big meshes. The first is a plane with a heavily modified Ocean modifier; this mesh has around 7M polys with the modifier on. The second mesh is only 1M polys, but with 4 levels of subdivision on top (the subdiv is used for shader displacement). Together, this scene can take pretty much the same amount of RAM during rendering as the previous one.
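
To check how many polys a modifier actually generates (the Ocean plane, for example), you can count the faces on the evaluated mesh; a minimal sketch:

```python
import bpy

# Count polygons on the evaluated (post-modifier) mesh of the active object.
depsgraph = bpy.context.evaluated_depsgraph_get()
eval_obj = bpy.context.active_object.evaluated_get(depsgraph)
print(len(eval_obj.data.polygons), "evaluated polygons")
```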

These are just examples showing that even with a relatively low-poly mesh you can hit the memory limit pretty fast. I'd second the advice about maxing out your RAM.
