Optimizing EEVEE GI

hi

Do you believe that triangulating a mesh improves the performance of the EEVEE GI viewport in Blender 4.2?
If not, are there other measures that help in this regard?

What about procedural maps in materials? Is it worth baking them to image textures to optimize EEVEE GI performance?

Is there any improvement if we delete the back faces of an object, since they won’t appear in the image? Do textures still work correctly that way, or could I run into problems?

Thanks

If there is a difference, I would imagine it would be a small one.

Baking procedural materials will speed up any renderer, not just Eevee. This can make a big difference if the procedural material was complex, but it will take more memory to render, as you need to store the image.
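As a rough sketch of that memory tradeoff (the resolution and channel count here are illustrative assumptions, not values from the thread):

```python
def baked_texture_bytes(width, height, channels=4, bytes_per_channel=1):
    """Approximate uncompressed in-memory size of a baked image texture."""
    return width * height * channels * bytes_per_channel

# A single 4K RGBA texture at 8 bits per channel already costs 64 MB,
# traded against not evaluating the procedural node tree every sample.
size_mb = baked_texture_bytes(4096, 4096) / (1024 ** 2)
print(f"{size_mb:.0f} MB")  # 64 MB
```

So a scene with many baked 4K materials can add up quickly, which is why baking helps render speed but costs memory.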

It would reduce the memory load of the scene, but make little difference to render time (back faces are already ignored thanks to the backface culling feature).


Thank you for your attention

I have more questions:

When baking procedural maps on an open mesh (with the back removed), does the texture area available to the visible parts increase?

Thanks again

The available texture area does increase, but only if you redo the UVs and make them fill the newly liberated texture space after deleting the faces.

Do open objects also include the back in the bake?

Let me show you what I mean with some images.

Here, I start with a monkey. The object has UVs, which cover the texture space.

Now, I delete half the model. This leaves large holes in the UV space. Those holes are wasted texture space that is unused and won’t be baked.

I can pack the remaining UVs to better fit the space. This will allow the remaining parts of the model to cover more area of the texture, so they get better resolution and this will reduce the wasted texture space.
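As a back-of-the-envelope illustration (the coverage percentages here are made-up numbers, not measured from the monkey example): if the islands covered about 45% of a 2K texture after deleting the faces, and repacking brings that up to about 90%, each remaining face gets roughly twice the texel area:

```python
def used_texels(resolution, uv_coverage):
    """Texels actually occupied by the UV islands (coverage in 0..1)."""
    return resolution * resolution * uv_coverage

before = used_texels(2048, 0.45)  # holes left by the deleted half
after = used_texels(2048, 0.90)   # after repacking the remaining islands
print(after / before)  # roughly 2x the effective resolution per face
```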


I understood this part.

My question is:

When the object is open (missing one of its faces), is the inside included when baking the material’s procedural maps?
Can it be discarded from the render in EEVEE GI?

Thanks again

The inside of faces isn’t used in the baking process. In fact, if you wanted to bake or texture the inside of an object in any way (like, you are making the walls inside a room), you would need to invert the normals (the direction of the faces) and make sure they point inside the room.
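The “direction of a face” comes from its vertex winding order, and flipping the normals is equivalent to reversing that order. A minimal sketch with plain vectors, no Blender API involved:

```python
def triangle_normal(a, b, c):
    """Unnormalized face normal: cross product of two edge vectors."""
    ux, uy, uz = b[0] - a[0], b[1] - a[1], b[2] - a[2]
    vx, vy, vz = c[0] - a[0], c[1] - a[1], c[2] - a[2]
    return (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)

a, b, c = (0, 0, 0), (1, 0, 0), (0, 1, 0)
print(triangle_normal(a, b, c))  # (0, 0, 1): the face points up (+Z)
print(triangle_normal(a, c, b))  # (0, 0, -1): reversed winding points down
```

In Blender, "Mesh > Normals > Flip" does this reversal for you; baking and backface culling both follow the resulting normal direction.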

If you have an object that has a texture, the back of a face will show the same thing as the front, but inverted.

Yes, it’s the “backface culling” I mentioned earlier. You can find it in the material settings. It will discard the back faces, making them invisible. The direction of your object’s normals becomes important if you use this, because if the surface is inverted, the object will look inside out.


Is there any trick to using extremely dense meshes for use with EEVEE GI?

Is there any resource or addon that has the same function as Unreal Engine’s NANITE?

Thanks

There isn’t, and cannot really be, a direct Nanite equivalent in Blender. Blender needs to support modifying meshes at any time. There is a reason Nanite works on static meshes only: it relies on caching and pre-processing, so Blender couldn’t and shouldn’t do the same, as you would have to wait for that processing every time you change a mesh.

If you want performance in Eevee, you should plan your objects and scene so they have a reasonable polygon count.

  • Be careful not to set subdivision modifiers higher than you need. Subdivision is exponential and multiplies the polygon count by 4x for each level.

  • Don’t give small, background objects more details than they need if your project is going to be rendered inside Blender. Blender isn’t a game engine, so an image or movie project will have objects you will never see up close.

  • If you are making objects for a game engine however, your file will probably have lots of separate objects that aren’t arranged into a scene. In that case, you can use the collection system to hide the objects you aren’t working on.

  • If you are using dense objects like sculpts or photoscans, you should either decimate or remesh them. Bake their details if necessary.
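To see why the subdivision advice above matters, here is the face-count growth for a hypothetical 1,000-quad mesh (Catmull-Clark subdivision roughly quadruples the face count per level):

```python
def subdivided_faces(base_faces, levels):
    """Face count after subdivision: roughly 4x per level."""
    return base_faces * 4 ** levels

for level in range(5):
    print(f"level {level}: {subdivided_faces(1000, level):>9,} faces")
# by level 4, 1,000 faces have become 256,000
```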

Keep in mind that Eevee isn’t designed to be a true realtime renderer like Unreal Engine and will never get the same performance. It’s more of a fast offline renderer that can also be used as a viewport if your settings and lighting setup aren’t too crazy.


To render the new EEVEE GI from Blender 4.2 in real time:

What should we keep modest for real-time performance to work well? Small textures? Low-density meshes? Lighting with few lights? Anything else?

thanks

Blender has the “material preview” mode. This mode actually uses Eevee and is set by default to have good performance. It achieves this by lighting the scene entirely with a preview HDRI, not using any light source that casts shadows.


Also, if you have lots of polygons in the scene (multiple millions) or very complicated procedural materials, you will get slowdowns that will make Eevee barely usable as a viewport.


Thanks for the help

But you didn’t understand the question:

It’s about the GI rendering of the new EEVEE. How should I manage my creations to make good use of the GPU and CPU?

Thanks

If you are talking specifically about the new global illumination, it’s all about understanding what its settings do.

The new GI is actually composed of 2 different methods: screen tracing and Fast GI.


Screen tracing is the higher-quality but more expensive method (the denoiser is there for this method, as it’s noisy). Fast GI is a faster, usually less noisy method that’s a bit less realistic looking.

Which of the 2 methods is used is controlled by the “max roughness” slider. Any material that’s less rough than that value uses screen tracing, and any material that’s rougher uses Fast GI. That’s because screen tracing really benefits reflections, but tends to be noisier on rough surfaces. Fast GI works best on rough surfaces, so it can take care of those.
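The threshold behaviour of that slider can be sketched as a simple check (illustrative pseudologic only, not actual Eevee source code):

```python
def gi_method(material_roughness, max_roughness):
    """Pick the GI method per material, as the 'max roughness' slider does."""
    if material_roughness < max_roughness:
        return "screen tracing"  # smoother materials: higher quality, noisier
    return "Fast GI"             # rougher materials: faster, less noise

print(gi_method(0.1, 0.5))  # screen tracing (e.g. a glossy floor)
print(gi_method(0.9, 0.5))  # Fast GI (e.g. a matte wall)
```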

If you put the max roughness at 1, you will use only screen tracing, getting the high quality method everywhere, but your image will become noisy and take many samples to render, almost like a mini Cycles.

If you put the max roughness at 0, you will use only Fast GI. In that case, your image will render very fast and with little noise, but you lose a bit of detail, especially in the reflections.

Fast GI itself has 2 modes you can choose: Global Illumination and Ambient Occlusion. This allows you to turn the global illumination off and get a faster render, similar to old Eevee. If you do this, you would need another way to do bounce light: either use the “volume” light probe, or fake it manually with lights.


The exact settings you need will depend on what’s in your scene. An exterior scene with bright daylight will behave very differently from a dark scene with lots of small emissive objects (the GI really doesn’t like those and makes lots of noise).

Honestly, the default settings are pretty well optimized for most scenes. Most changes you could make would trade speed for quality (like turning off denoising, increasing precision, or using 1:1 resolution).

The main thing to know is to set the fast GI to ambient occlusion to get a faster render that’s more like old Eevee, especially in scenes with lots of emissive materials that would otherwise make noise.


Do you believe that the real-time performance of Unreal Engine is much better than the EEVEE GI of Blender 4.2 in the viewport?

Thanks

Certainly.

Epic is a multimillion-dollar company with a massive amount of resources.

If you want the best real-time renderer in the world, go with Unreal.

If you actually want to make stuff to go into Unreal, then a DCC like Blender is probably a better choice.


Also, the new version of Eevee is very new and is surely not the final product it will one day be.

But, keep in mind it isn’t designed to run at 60 FPS like a game engine. You can use it in the viewport, but it’s more for preview purposes than for realtime use. Many effects in Eevee are meant to use multiple samples to completely render and have their noise cleared.

In my opinion, this is fine. I can work with a slightly slower than realtime performance for preview purposes. Do you have a use case where you really need Eevee to be realtime?

My video card is an RTX 2060 with 6 GB of VRAM.

I don’t want to do animation or any other kind of movement.

I make static scenes with the camera always still, and I want to edit materials using only procedural maps, with the EEVEE viewport running in real time the whole time. I also want to use very dense meshes so I can apply very fine displacement in Geometry Nodes to give the meshes surface texture.

Do you think the viewport will be too heavy?

The image was rendered in Unreal Engine with Lumen.

Thank you


What you just described here is about the worst-case scenario for viewport speed. Just calculating the complex procedural textures alone will be slow; it’s not even a matter of renderer or render settings. And we’re not even talking about the displacement.

My suggestions for improving performance would be in your texturing workflow, not in the render settings.

  • Bake any material that’s done to image textures as soon as possible so you don’t have multiple procedural setups being rendered at the same time in the scene. If you have a single very heavy material that can’t even display on its own, you might even benefit from baking parts of the material that are done and keep building the material on top of that, or replace parts of the shader with image textures if possible.

  • I have to question the use of displacement here. Most materials in this image look like I wouldn’t notice the difference between bump and true displacement. If you really do need it, maybe test the material on a smaller section of mesh with fewer polygons? Or make the material using bump (rather than true displacement) on a lower-resolution mesh; you can always try the same material with displacement later once your bump looks good.


hello

Does this white space in the bake, where there is no object geometry, affect EEVEE GI processing?

Thanks