68 GiB RAM for hair

I believe my model has about one million hairs; I don't actually know how to find out how many particles a model has. I painted them in and added some children, so I'd guess 1–2 million hairs? When I try to render him in Cycles my computer runs out of memory, because the render needs 68 GB. I have 6 GB of VRAM and 32 GB of RAM. It renders fine in Eevee, but Cycles is impossible.
This is him:


Is there a way to cut the needed RAM?

Does it help if you change the device to CPU?

No, because I only have 32 GB of RAM. The closer I get with the camera, the less memory it needs, but a whole body shot is impossible.

Are you using (hair) children, or are all of them plain hair strands?


As a test, I just rendered 2,000,000 hairs with no children in Cycles on my 12 GB of RAM, and it went fine. If you cannot share the .blend file, it would help to break your particle system into several, e.g. three hair systems: one for short, one for medium, and one for long hair. That way you can investigate the problem you are having. It is always difficult to help without a .blend file.

Cycles hair rendering could use a much-needed efficiency upgrade. I hate it when a client requests something with hair or fur. The render times quickly go berserk once you have a certain number of strands and children. :neutral_face:

How do you know it needs 68GB?

Something to consider: if you have 1 million parent hairs and then add child hairs, each of those 1 million hairs will have children. I think the default is around 100 child hairs, isn't it? So multiply 1 million by 100 and that's how many hairs you actually have. Generally you'd use fewer parent hairs and let the child hairs fill in.
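The multiplication above is easy to sanity-check yourself. A minimal sketch (all numbers are hypothetical; check your real parent count and the Children > Render Amount on each particle system):

```python
# Rough strand count: parents x children per parent.
# Both values below are illustrative, not taken from any real scene.
parent_hairs = 1_000_000        # painted parent strands
children_per_parent = 100       # assumed render amount for children

total_strands = parent_hairs * children_per_parent
print(f"{total_strands:,} rendered strands")  # 100,000,000 rendered strands
```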

But generally speaking, rendering hair in Cycles is a very easy way to eat VRAM. I'm surprised it renders well in Eevee, though. If that's viewport rendering, it's likely because child hairs don't all display in the viewport by default (10% by default, I think).

I use children as stated above.

@erickBlender I split it up into 3 particle systems. The .blend is 50 MB. I tried to share the .blend file but it isn't working. It stays at 50 MB.
@Magnavis I know all that. What I don't know is how many hairs I have. Each parent has 20 children. I painted the hair in by hand. It renders in Eevee because Eevee doesn't calculate light bounces.
@kkar I use children, as stated above.

I tried to share the .blend file but it isn't working.

What is not working? If the problem is the file size, you can share the file from your Google Drive.

If you share a .blend file, people here can help you better.
If you don't want to share your model, you could add a sphere to the scene and add the particle systems you used to the sphere. Then delete the objects you don't want to share, and carefully save the .blend under a different name so you don't overwrite your work. It may also be convenient to compress the .blend file, for example with 7-Zip.

Another thing to check is that the Spatial Splits and Hair BVH checkboxes are turned off. Turning off the specialized BVH means a slower render, but it saves some memory.


They are turned off.

The link does not work.

Well, what do you expect? I've never done this before. That's the link Google Drive gave me. What do you do to share .blend files over Google Drive?

I do not use Google Drive.
How big is the file? If it is very big, try compressing it to .rar or .7z and then tell me here what the size of the compressed file is.

Try again.

For each particle system, try reducing the "Steps" value in the "Render" > "Path" section (Particles tab) as much as possible.

I'm not sure why the number of particles is shown as 0. What is the history of your model? Was it made with a previous version of Blender? You are probably using a very high number of particles, and perhaps you should research good practices for getting by with fewer particles. Also, you did not pack the textures into the .blend file. People usually use this kind of texture on the model when they use hair particles:
https://www.google.com/search?q=fur+tileable+texture&source=lnms&tbm=isch&sa=X

The same kind of thing happened to me (at a smaller scale). I managed to cut down most of the RAM usage by reducing the subdivisions in the particle settings, as Yafu mentioned. It works like Subsurf: each increment doubles the strand divisions, and that can add up really quickly.
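That doubling can be sketched with some rough arithmetic. A minimal illustration, assuming each Steps increment doubles the divisions per strand; the strand count and per-point byte cost here are made-up constants, not real Cycles figures:

```python
# Rough memory model: hair geometry grows roughly as 2**steps,
# because each Steps increment doubles the divisions per strand.
strands = 20_000_000            # hypothetical total rendered strands
bytes_per_point = 48            # illustrative guess; real per-point cost varies

for steps in range(3, 8):
    points_per_strand = 2**steps + 1       # divisions double each step
    gib = strands * points_per_strand * bytes_per_point / 2**30
    print(f"steps={steps}: ~{gib:.1f} GiB")
```

The exact numbers do not matter; the point is that going from Steps 5 to Steps 7 roughly quadruples the hair memory, so lowering Steps is one of the cheapest savings available.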