EEVEE Development updates (EEVEE-Next)

Are we finally getting good displacement in Eevee? I didn’t realize that was on the to-do list. Sweet!


The realtime viewport compositor landed in master today.


Realtime compositor is a very nice feature.
I think it is a shame that there is only one compositing node tree when it becomes so useful.
I would like to be able to switch between multiple composite node trees on the timeline, like in the VSE.

Now EEVEE Next can light them up.

No shadows yet.

Also fun with realtime compositor.

To be honest I didn’t use Blender compositor that much but compositing directly on the viewport is very fun and I will frequently use this from now on.


If there is screen space GI, are indirect light probes in Eevee next?

Are there any improvements made to the lighting?

You can literally read about it yourself:


For this commit specifically.

Compared to the previous implementation this has a limit of 65536 lights
per scene. Lights exceeding this limit will be ignored.

This also introduces fine-grained GPU light culling, making rendering
many lights in a scene more efficient as long as they don’t overlap much.

For reference, the current EEVEE has a limit of 128 lights.
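To make the culling the commit describes more concrete, here is a minimal tile-based sketch in Python (all names and the 16-pixel tile size are hypothetical; the actual EEVEE-Next implementation runs in GPU compute shaders):

```python
# Hedged sketch of tile-based light culling: each light is registered only in
# the screen tiles its bounding circle touches, so shading a pixel iterates
# over that tile's short light list instead of every light in the scene.

TILE = 16  # tile size in pixels (assumed for illustration)

def cull_lights(lights, width, height):
    """lights: list of (cx, cy, radius) circles in screen space.
    Returns a dict mapping (tile_x, tile_y) -> list of light indices."""
    tiles = {}
    for i, (cx, cy, r) in enumerate(lights):
        # Tile range overlapped by the light's bounding circle, clamped to screen.
        x0 = max(int((cx - r) // TILE), 0)
        x1 = min(int((cx + r) // TILE), (width - 1) // TILE)
        y0 = max(int((cy - r) // TILE), 0)
        y1 = min(int((cy + r) // TILE), (height - 1) // TILE)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                tiles.setdefault((tx, ty), []).append(i)
    return tiles

# A small light in one corner never appears in distant tiles:
lights = [(8, 8, 10), (100, 100, 5)]
tiles = cull_lights(lights, 128, 128)
```

This is why non-overlapping lights stay cheap: the per-tile lists stay short, and only tiles where many light volumes overlap pay for iterating over many lights.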


Filter, directional blur, bilateral blur, despeckle, and bokeh image (NOT bokeh blur) nodes are now supported in the realtime compositor.

Made this in the mood of upcoming September…


Realtime cryptomatte?! Yes please!


As I was reading it, I noticed:

I also had to disable jittered Depth of Field for the viewport. This is because it is too unstable and incompatible with TAA, which is what the viewport uses. The tool-tip should reflect that once EEVEE-Next replaces EEVEE.

I hope that somehow it will return. I, for one, absolutely love the over-blur effect of Eevee’s DOF. I’d certainly miss it :frowning:


Jittered DOF is relatively useless though, as none of the render passes include the over-blur.


Longer term planning

  • Currently we have 3 bigger projects running that are related to each other.
    • Viewport compositing needs the Metal backend to run on Apple devices, as both projects rely on compute shaders, which aren’t supported on Apple/OpenGL.
    • EEVEE-Next needs viewport compositing for the Bloom effect.
  • For these reasons they will be released at the same time. We don’t expect EEVEE-Next and the Metal backend to be ready for Blender 3.4.
  • The viewport compositor could be an experimental feature in Blender 3.4 on supported platforms.

Awww :frowning: oh well, at least they’re taking the time to get it right :slight_smile:


Lukas is working on improving the library for calculating tangents, which among other things will speed up EEVEE.
Let’s hope the patch lands soon:


And it is in now.
rB6951e8890ae3

To note, the primary way it speeds up Eevee is by optimizing how normals are initialized and then generated (which will be noticeable when activating Eevee, as well as when modeling or authoring shaders). It also means a slightly faster startup time for Cycles when the rendered view is selected or F12 is pressed.
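For context on what a tangent library computes, here is a minimal sketch of the standard per-triangle tangent formula such a library evaluates for every face (the function name and degenerate-UV fallback are my own; this is not the patch’s code, which optimizes how this work is organized rather than the math itself):

```python
# Hedged sketch: the unnormalized tangent of one triangle, derived from its
# position edges and UV edges. T points along the direction of increasing U.
def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    e1 = [p1[i] - p0[i] for i in range(3)]        # position edge 1
    e2 = [p2[i] - p0[i] for i in range(3)]        # position edge 2
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]   # UV edge 1
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]   # UV edge 2
    det = du1 * dv2 - du2 * dv1
    if det == 0:
        return [1.0, 0.0, 0.0]  # degenerate UVs: fall back to an arbitrary axis
    r = 1.0 / det
    return [(e1[i] * dv2 - e2[i] * dv1) * r for i in range(3)]

# A triangle with axis-aligned UVs yields a tangent along +X:
t = triangle_tangent((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0), (1, 0), (0, 1))
```

Evaluating this (plus normalization and vertex averaging) for every face of a dense mesh is exactly the kind of per-mesh setup work that shows up when Eevee first activates.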


Does this optimization make this faster normal map node setup obsolete?


What’s stopping you from testing it?



78k skeletal mesh (on RTX2060 and i7-9750)
2.93 solid view - 40 fps
2.93 eevee - 6.5 fps
2.93 eevee gpu normal - 38 fps
3.4 solid view - 40 fps
3.4 eevee - 22 fps
3.4 eevee gpu normal - 38 fps

It’s significantly faster, but it’s still not possible to animate at 60 fps, and the optimized normal-map setup is still faster.
For comparison, the same exported (baked) animation with the same model runs at 1300 fps in Unity.
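For the Eevee numbers above, the improvement is easiest to see as a ratio and in frame times, since fps doesn’t add linearly:

```python
# Speedup arithmetic for the benchmark above (6.5 fps in 2.93 vs 22 fps in 3.4).
fps_293, fps_34 = 6.5, 22.0
speedup = fps_34 / fps_293       # ~3.4x faster Eevee viewport
ms_293 = 1000.0 / fps_293        # ~154 ms per frame in 2.93
ms_34 = 1000.0 / fps_34          # ~45 ms per frame in 3.4
```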


For the 10,000th time it must be said: Comparing any DCC to a game engine in such matters is apples and oranges. Different tools, feature requirements and goals. Blender’s viewport is way faster than c4d’s viewport, and I suspect competitive to other DCCs.