Developer Meeting Notes

You still need the probes to mask some of SSGI’s problems, so it’s still going to be slow, awkward to set up, and prone to a number of new artifacts in addition to the old ones.

Look, I’m not saying it’s going to be entirely useless, and I guess it has the saving grace of being easy and fast to get meh results. But there’s no path forward with SSGI. You can’t invest more computer time or even artist time for better results. The problems with it won’t be solved in the future. It’s a dead end.

None of them are perfect, but almost all are better than SSGI, so pick your poison:

In addition, there are some quite old techniques, like virtual point lights or irradiance caching, that aren’t really realtime but would still fit the bill for Eevee. Well, they wouldn’t be great, but still better than SSGI.


From the eevees-future article

Hardware Ray-Tracing

Supporting SSGI brings ray-tracing support a step closer. The new architecture will make the addition of hardware ray-tracing much easier in the future. Hardware ray-tracing will fix most limitations of screen space techniques, improve refraction, shadows…

This seems to be the plan: using SSGI as a first step and enhancing it with hardware ray-tracing.


But there’s no path forward with SSGI. You can’t invest more computer time or even artist time for better results. The problems with it won’t be solved in the future. It’s a dead end.

Not really interested in getting into a discussion here, but for example:


This is not SSGI though. It still uses world space where screen space fails. This is a fine technique but it is not what is on the table here. Moreover, the screen tracing here is just for performance. Eevee doesn’t need that much performance.

It literally is though? You can start a ray in screen space and continue with whatever fallback you want; in most hybrid renderers that currently means tracing against simplified scene representations. So the hits that land in screen space and pass a conservative enough thickness test against the depth buffer give correct results, and the ones that don’t can be continued with raytracing, for example against a simplified scene. The starting point, and the overall architecture that fits it, is the same.
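To make that hybrid idea concrete, here is a minimal, purely illustrative sketch (a toy 1-D depth buffer and made-up function names, not any renderer’s actual code): the ray first marches in screen space with a conservative thickness test, and only falls back to a simplified world-space representation when the screen trace fails or leaves the screen.

```python
def screen_space_march(x, z, dx, dz, depth, thickness, max_steps):
    """March a ray against a toy 1-D depth buffer (screen-space trace)."""
    for _ in range(max_steps):
        x += dx
        z += dz
        i = int(round(x))
        if i < 0 or i >= len(depth):
            return None  # ray left the screen: no information here
        # conservative thickness test: accept only hits just behind the surface
        if depth[i] <= z <= depth[i] + thickness:
            return ("screen", i, depth[i])
    return None

def world_space_fallback(x, z, dx, dz, segments, max_steps=256):
    """Continue the ray against a simplified scene (list of (x0, x1, depth) slabs).
    For simplicity this toy restarts from the ray origin; a real renderer
    would continue from the point where the screen trace gave up."""
    for _ in range(max_steps):
        x += dx
        z += dz
        for x0, x1, d in segments:
            if x0 <= x <= x1 and z >= d:
                return ("world", x, d)
    return None

def trace_hybrid(x, z, dx, dz, depth, segments, thickness=0.25, max_steps=32):
    hit = screen_space_march(x, z, dx, dz, depth, thickness, max_steps)
    if hit is not None:
        return hit  # screen-space hit passed the thickness test
    return world_space_fallback(x, z, dx, dz, segments)
```

A ray that stays on-screen resolves entirely in screen space; one that exits the depth buffer gets handed to the world-space fallback, which is exactly the architectural point being made above.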

Same thing but with ray traced reflections:

(Also a lot of the new changes are similar to this paper)


I’d rather not get into an argument over semantics… This is proper world-space raytracing that is supplemented by SSGI for performance. It exhibits very few of SSGI’s problems. You could remove the SSGI entirely here and still get a proper result, albeit after a slightly longer wait. Too long for a game, but fine for Eevee.

That said, I’m all for this if this is what gets in Eevee.

I’m not interested in a semantics debate either. I’m just pointing out that starting whatever tracing against a g-buffer is very common for hybrid renderers and it’s not restricted to stay in screen space.


The Eevee SSGI thread clearly shows there is a lot of demand for the feature, and examples do show that it is able to bring good results where the current GI is insufficient.

Now if Eevee were capable of generating SSGI from virtual camera angles and stitching the result into the rendered view (an imaginary 360-degree camera, for instance), or could utilize any kind of volume probe illumination as an assistive tool, then things could become interesting. I do not know if such a thing would have to wait for Vulkan first.


That’s what I asked about some time ago: render the indirect diffuse at 360° from the camera and reproject it to the correct final view. Not all, but many of SSGI’s limitations would be cleared, at almost full SSGI speed.
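The reprojection step being proposed can be sketched very roughly. This is a hypothetical toy (made-up function names, a tiny equirectangular map standing in for the 360° render): each final-view direction is mapped to a (u, v) coordinate in the panoramic GI map and the indirect light is looked up there.

```python
import math

def direction_to_equirect_uv(d):
    """Map a unit view direction to equirectangular (u, v) in [0, 1]."""
    x, y, z = d
    u = math.atan2(x, -z) / (2.0 * math.pi) + 0.5
    v = math.asin(max(-1.0, min(1.0, y))) / math.pi + 0.5
    return u, v

def reproject_gi(view_dirs, gi_map):
    """Look up indirect light for each final-view direction in the 360° map."""
    h, w = len(gi_map), len(gi_map[0])
    out = []
    for d in view_dirs:
        u, v = direction_to_equirect_uv(d)
        px = min(int(u * w), w - 1)
        py = min(int(v * h), h - 1)
        out.append(gi_map[py][px])
    return out
```

A real implementation would reproject using depth and world positions rather than directions alone (otherwise parallax is wrong), but the lookup structure is the same.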


There’s no mention of SDFGI (as in Godot 4) in the list.
And I’m also curious about this unspecified technique:


I believe Godot’s SDFGI is something Juan cooked up by himself; I don’t think he ever published a paper on it, so of course it wouldn’t be in the list. Also, that list isn’t exactly new.

Godot is also MIT licensed, so the devs can look at the algorithm and implement something similar tailored to Eevee’s requirements (i.e. scaled up, because it does not need to run at 60 frames per second).

The sticking point is that it would require Eevee to move to Vulkan first, so it might be a while.


I guess that move is what’s implied between the lines of the code.blender Eevee post, even if it’s strange that the word ‘Vulkan’ is never mentioned…


I don’t know, but AFAIK an SDF-based GI was/is in Unreal 4, deprecated since the introduction of realtime raytracing.

Maybe there was, but I don’t believe Juan’s implementation is based on that? Not sure. Though the principle may be similar.


As far as I understood it (take this with many grains of salt, because I didn’t investigate too much), in Godot SDFs are used to produce a “simplified” world that light probes use to light the scene, and also to provide (coarse) glossy reflections; in Lumen, the SDF world representation gets raytraced, for a more accurate (but costly) result. Also, in Lumen this approach is combined with two other techniques: one for small-scale details (SSGI) and one for large distances (voxel or height-field GI).
The main difference, as far as SDFs are concerned, seems to be that Godot relies on probes while Lumen relies on RTX technology. Both have their reasons, and from a Blender PoV both could be good. Maybe the perfect match would be to use Godot-like SDFGI for realtime (we already have probes in) and then “fall forward” to raytracing (not necessarily vendor-constrained; Embree, anyone?) for Eevee final renders.
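For readers unfamiliar with the SDF techniques discussed above, the common core is sphere tracing: a ray is marched along its direction, and at each step the scene’s signed distance field tells you how far you can safely jump without missing anything. A minimal, hypothetical sketch (toy two-sphere scene, made-up names, nothing from Godot or Lumen):

```python
import math

def sphere_sdf(p, center, radius):
    """Signed distance from point p to a sphere's surface."""
    return math.dist(p, center) - radius

def scene_sdf(p):
    """Toy 'simplified world': union of two spheres (min of their SDFs)."""
    return min(sphere_sdf(p, (0.0, 0.0, 5.0), 1.0),
               sphere_sdf(p, (3.0, 0.0, 5.0), 1.0))

def sphere_trace(origin, direction, max_steps=128, eps=1e-3, max_dist=100.0):
    """March along the ray, stepping by the SDF value (sphere tracing)."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = scene_sdf(p)
        if d < eps:
            return t      # close enough to the surface: hit
        t += d            # safe step: nothing in the scene is closer than d
        if t > max_dist:
            break
    return None           # missed the scene
```

Whether the SDF hits feed light probes (Godot) or a raytraced gather (Lumen), the marching step above is the shared building block.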

07 June 2021

Notes for weekly communication of ongoing projects and modules.


  • Christoph Lendenfeld commit access to work on Animation & Rigging.
  • Peter Kim grant recipient to work part time on OpenXR/VR.
  • Pratik Borhade grant recipient to work in the bug tracker.


Modules & Projects

Asset Browser

  • A four-day workshop took place (remotely) to work on the asset system design and plan the 3.0 targets.
    • A number of stakeholders with various backgrounds were involved (Blender Studio, BlenderKit, PyClone, sculpt module).
    • The workshop helped make significant progress in design & planning.
    • Outcomes will be presented soon.

New Features and Changes

Edit Mesh Performance

  • Use partial updates for normal and face tessellation (commit , commit ) (Campbell Barton)
  • Refactor extraction of mesh data for drawing (commit ) (Jeroen Bakker)
  • Remove checks for tessellating two-sided faces (commit ) (Campbell Barton)

Exact Boolean Performance

  • Improve boolean performance by parallelizing a plane calculation (commit ) (Erik Abrahamsson)
  • Speedup when there are many separate components (commit , commit ) (Howard Trickey)
  • Avoid some memory allocation and freeing (commit ) (Erik Abrahamsson)

Geometry Nodes

  • Add Delete Geometry Node (commit ) (Wannes Malfait)
    • Support curve data (commit ) (Hans Goudey)
  • Add Curve Length Node (commit ) (Johnny Matthews)
  • Transform Node: Improve performance by skipping mesh normal recalculation (commit ) (Hans Goudey)
  • Add Multiply Add to Vector Math nodes (commit ) (Charlie Jolly)

Grease Pencil

  • Add new operator to normalize stroke data (commit ) (Antonio Vazquez)
  • User interface tweaks
    • Show pressure curve widgets in the sidebar (commit ) (Philipp Oeser)
    • Tweaks to icons (commit , commit ) (Antonio Vazquez)

User Interface

  • Add an overlay for flashing with the transfer mode operator (commit ) (Pablo Dobarro)
  • Keymap tweaks
    • Only use the tablet drag threshold for mouse button events (commit ) (Campbell Barton)
    • Use D-Key for view-pie menu (commit ) (Campbell Barton)
    • Support running transfer mode in any object mode (commit ) (Campbell Barton)

Linux Wayland Support

  • Support window decorations (commit ) (Christian Rauch)
  • Adapt window and cursor surface scale to support HiDPI screens (commit) (Christian Rauch)
  • Retrieve system cursor settings (commit ) (Christian Rauch)


  • Optimize 3D viewport rendering with camera passepartout (commit ) (Brecht Van Lommel)


  • Initial support for full-frame based system for reduced memory usage (commit ) (Manuel Castilla)

Video Sequence Editor

  • Add “refresh all” operator to all sequencer regions (commit ) (Richard Antalik)
  • Update FFmpeg proxy settings (commit ) (Richard Antalik)
  • Display source video fps in the VSE (commit ) (Sebastian Parborg)
  • Remove JPEG reference from proxy panel (commit ) (Richard Antalik)


  • Update Camera presets (commit ) (Fynn Grotehans)
  • Library Overrides: Add override_hierarchy_create to ID’s RNA API. (commit ) (Bastien Montagne)
  • Limit Rotation Constraint
    • Add an Euler Order option (commit ) (Alexander Gavrilov)
    • Explicitly orthogonalize the matrix before processing to negate shear (commit ) (Alexander Gavrilov)
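The orthogonalization mentioned for the Limit Rotation constraint can be illustrated with classic Gram–Schmidt: shear means the basis axes are no longer perpendicular, so each axis is projected free of the previous ones and renormalized. This is a generic sketch, not necessarily the method the commit uses:

```python
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)

def normalize(a):
    n = dot(a, a) ** 0.5
    return scale(a, 1.0 / n)

def orthogonalize(x_axis, y_axis, z_axis):
    """Gram-Schmidt: keep X, strip X out of Y, strip X and Y out of Z."""
    x = normalize(x_axis)
    y = normalize(sub(y_axis, scale(x, dot(y_axis, x))))
    z = normalize(sub(sub(z_axis, scale(x, dot(z_axis, x))),
                      scale(y, dot(z_axis, y))))
    return x, y, z
```

Feeding in a sheared basis (e.g. a Y axis leaning toward X) returns a clean perpendicular frame, which is what lets the constraint extract a pure rotation.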

Weekly Reports


The latest Blender rendering meeting further confirms that 3.0 will have a longer release cycle:

With the Blender 3.0 release becoming a longer release cycle, there’s a chance Cycles X will be part of that instead of 3.1. That also means point cloud rendering will likely be committed to the cycles-x branch instead of master.


One of the big things to note for the next couple of weeks.

  • Cycles X: shader ray tracing and (incomplete) baking support were added back by Brecht. Next is initial homogeneous volume rendering support. Sergey is working on render passes, for shadow catcher compositing and denoising of multiple render passes.

This means I will be able to try out many of my scenes in Cycles X, which is a big step toward being able to use Cycles X for most work instead of Cycles master.

One open question is how many weeks it will be before the new render engineer starts helping Brecht and Sergey (which the meeting notes do not say).


My guess: however long it takes him to familiarize himself with the few thousand lines of code scattered across a few hundred Cycles files all over the Blender repo (unless he already did, in which case, idk…).