@Bruno path guiding helps to find more difficult light transport paths, which are unlikely to be explored by the normal sampling/exploration techniques of a path tracer.
Such light transport paths can be: complex indirect illumination, light contributions from longer paths, and yes, sometimes even caustics.
As I already mentioned, path guiding is not focused on caustics; being able to render caustics with it is a side effect.
Regarding caustics, you should also keep in mind that other algorithms such as light tracing, bi-directional path tracing (used by LuxCore), or VCM can still easily outperform a path tracer that uses path guiding in some cases.
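For context on what "guiding" means here: a common formulation (the general idea behind libraries like Open PGL, not Cycles' actual implementation) mixes ordinary BSDF sampling with a distribution learned from earlier samples, and stays unbiased by dividing by the pdf of the mixture (one-sample MIS). A toy 1D Python sketch of that idea; the function names and the 0.5 mixing weight are illustrative assumptions:

```python
import math
import random

def sample_guided(bsdf_sample, bsdf_pdf, guide_sample, guide_pdf, alpha=0.5):
    """One-sample MIS between BSDF sampling and a learned guiding distribution."""
    if random.random() < alpha:
        x = guide_sample()   # draw from the learned guiding distribution
    else:
        x = bsdf_sample()    # fall back to ordinary BSDF sampling
    # Divide by the pdf of the *mixture*: this keeps the estimator
    # unbiased no matter which branch produced the sample.
    pdf = alpha * guide_pdf(x) + (1.0 - alpha) * bsdf_pdf(x)
    return x, pdf

# Toy demo: integrate f(x) = x^2 on [0, 1] (exact value 1/3).
# "BSDF" sampling is uniform (pdf 1); the "guide" has learned pdf 2x,
# which favours the region where f is large.
random.seed(0)
n = 20000
total = 0.0
for _ in range(n):
    x, pdf = sample_guided(
        bsdf_sample=random.random,
        bsdf_pdf=lambda x: 1.0,
        guide_sample=lambda: math.sqrt(random.random()),
        guide_pdf=lambda x: 2.0 * x,
    )
    total += x * x / pdf
print(total / n)  # approaches 1/3 for large n
```

The real thing learns the guiding distribution spatially and directionally during rendering, but the unbiasedness argument is exactly this mixture-pdf division.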
I guess no light tracing is included in this path guiding patch? I have looked into all the PDFs from your SIGGRAPH link.
If no light tracing is in the patch, could light tracing improve the rendering even more?
Are there plans to implement further patches after PG, like light tracing, in Cycles?
Or do these methods (light tracing) not fit with a path tracer like Cycles?
Yes, but we need to wait until Open PGL is also part of the Linux dependency build.
If I understand @LazyDodo correctly he was only able to include it in the Windows build before he went on vacation.
@SteffenD I will update you here as soon as a Linux build is ready.
My understanding is light tracing / photon mapping would make sense to implement if Cycles were bi-directional. But making Cycles bi-directional would be a lot of work, and the flexibility tradeoff would not be acceptable (no more light path node, …).
This is a bit old, but I think it still applies. @lukasstockner97, correct me if I'm wrong.
It would still be interesting to know what the promising next steps for Cycles could be. Lukas, path guiding was actually in the closing remarks of your Cycles talk two years ago! We've come full circle.
@makizar
The last post pretty much summed it up.
If Cycles ever plans to implement any bi-directional method, it will probably come with the catch that many nice features artists rely on (filter glossy, shadow rays, shaders depending on previous light path expressions (Glossy/Diffuse), …) would have to go away to make it work.
@pixelgrip I know what the people who worked on that image had to do to make the combination of light tracing and path tracing work in their production renderer, and it was not easy.
I remember it being cited as the main reason it will not happen (by Brecht, IIRC). The flexibility in Cycles is a godsend; it's hard to consider ditching it. But if it worked with a subset of nodes and that were well communicated, I would certainly use it.
The current LuxCoreRender has various caustics solutions, which you can use by just installing the latest LuxCoreRender addon. Questions will be answered on the LuxCoreRender forum.
With this new path guiding branch, Cycles is closing the gap to LuxCoreRender regarding caustics and indirect lighting, but it cannot yet compete with bidirectional path tracing with an SDS cache or OptiX RTX rendering with an indirect cache.
But it is a very promising development that yields stunning improvements even at this initial stage, and it will hopefully make Cycles a more versatile solution for difficult rendering tasks in the future.
If this can be extended from the current diffuse-only support to glossy surfaces, it will get ahead of the competition for sure!
I just noticed that Parallax-Aware VMM might actually be the best one for long renders. I noticed a major weakness in the Directional Quad Tree algorithm: it causes a steady performance drain the longer the render goes (followed by a crash at some point once enough samples are on screen). It might not bring out caustics quite as quickly in some cases, but it seems to be the more reliable option.
I actually had this with PAVMM too, I think.
I can't say for sure: it was a hard crash and happened while I wasn't looking. All I observed was that Blender was suddenly closed.
It has happened three times so far, each time on an attempt at a longer render.
That said, this was all with the first build. The current build might not have this issue.
@kram10321 and @Ace_Dragon, could you share a few more details on the setup?
What is your definition of long renders (i.e., how many samples were you using)?
I'm sorry, I can't say much of value here.
I tried rendering a slightly modified version of the pool scene for several hours, but when I checked back on progress after leaving for a bit, Blender was closed. That's all I've got.
I can tell you in principle it was a few thousand samples, but I have no clue how early or late the crash happened, so that'd still be a huge range.
Additional info that might help somewhat:
I have 8 GB of RAM (though I didn't notice memory usage expanding much at all, so I don't think that was an issue).
My CPU is an i5-8400 with six cores and three caches:
L1: 384 KB
L2: 1.5 MB
L3: 9 MB
Maybe those caches fill up or something? If it's a memory issue, I'd suspect it lies there.
My graphics card, for good measure, though I doubt that's the issue since it's a CPU render:
NVIDIA GeForce GTX 1660 SUPER
6 GB dedicated, 4 GB shared memory, for a total of 10 GB
If I can determine something more accurate or consistent, I'll be sure to share. Whatever it is, it seems to happen rarely, though.
For some scenes it makes a dramatic difference, but for others the scene actually looks worse with it than without (e.g. the one below).
This is just straight path tracing with no guiding, but when I tried path guiding, the scene was actually noisier and slower to render for the same number of samples (~500). Unfortunately, I forgot to save the path-guided render before closing.
I actually haven't seen any improvement at all. I'm not talking about caustics, as I thought this was mainly supposed to improve long GI paths.
I've had darker GI in corners using guided path tracing than regular path tracing when I put an object in a box in a separate room from where the light is coming.
But I guess it's too soon to test and compare yet.
It is still very early indeed, since the path guiding branch is expected to be worked on for at least a few months before a patch for master is ready.
That does not mean scenes that perform badly shouldn't be submitted (as long as they are not too big and complex), as one goal is for the worst-case scenario to be at least equal in quality to having guiding off.
Yes, the algorithm in its early state can be somewhat hit and miss, but I have found that the speedup for some caustic details is already dramatic. I anticipate that quite a bit more will be captured quickly as the branch is worked on.