Maybe this is interesting as a way to push Cycles caustics rendering while maintaining rendering speed?
From the paper it sounds like it's an unbiased method. Would be good to have better caustic rendering in Cycles. Was always a shame the MLT patch didn't make the cut.
Yeah, I also still regret that.
Lukas Stockner was also not quite as experienced with rendering algorithms (and with the design of Cycles as a whole) as he is now.
As for this new form of Monte Carlo, it looks quite promising: it looks like it will work well with both OIDN and animation, since it does not appear to create low-frequency noise patterns in other areas. The same would be true for working well with adaptive sampling.
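For intuition about what "Langevin" sampling means here (judging by the demo URL, the paper builds on Langevin-style MCMC): the basic update nudges a sample along the gradient of the log-density and adds scaled Gaussian noise. A toy sketch in plain Python, sampling a 1D standard normal; this is just an illustration of the update rule, not the paper's actual rendering algorithm:

```python
import math
import random

def langevin_step(x, grad_log_p, step):
    # Unadjusted Langevin update: drift along the score, plus diffusion noise.
    # x' = x + step * grad(log p)(x) + sqrt(2 * step) * N(0, 1)
    return x + step * grad_log_p(x) + math.sqrt(2.0 * step) * random.gauss(0.0, 1.0)

# Target: standard normal, so grad log p(x) = -x.
random.seed(0)
x = 0.0
samples = []
for i in range(20000):
    x = langevin_step(x, lambda t: -t, 0.05)
    if i >= 1000:  # discard burn-in
        samples.append(x)

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # both should be close to the target's 0 and 1
```

Because consecutive samples are correlated (each new sample is a small move from the last), you can see why noise from such methods tends to be less independent per pixel than plain path tracing.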
Rather than being specific to caustics rendering, this paper looks like a generally better sampling algorithm, doesn't it?
Yes. Although I can't judge the technical side of the paper, the overall impression of rendering efficiency is promising. I hope Brecht / Lukas see possibilities for a Cycles implementation.
Caustics aside, the glass looks a lot cleaner
Look at their test results here
https://langevin-mcmc.s3.us-east-2.amazonaws.com/interactive-viewer/index.html
Looks like everything is a lot cleaner
These types of papers are always exciting! Of course, implementing them in a production renderer is always more difficult than it seems; Lukas' MLT patch is evidence of that. Without going too deep on it, the method does seem somewhat similar in nature to MLT, so it could possibly have the same downsides that MLT did: longer learning time for simple scenes, more inconsistent noise in animation, and sudden "jumps" in cleaning up image artifacts. Just guessing, but it's a bit early to assume this is the silver bullet, is all I'm saying.
That said, I think there are a few techniques that are "battle proven" in professional renderers that could benefit Cycles. Path guiding in particular complements standard path tracing quite well, as do fancy next-event-estimation techniques. I particularly like the implementation in RenderMan with the PxrUnified integrator from a user-experience standpoint: it basically acts as a standard path tracer, but you can turn on manifold-walk NEE and specify which lights and surfaces to focus caustics on. https://rmanwiki.pixar.com/display/REN23/PxrUnified
Actually, looking at those renders, I didn't like the noise quality. Of course they're better, cleaner images. But as you say, for being "battle proven", it looks like the even noise of simple path tracing is more reliable. And I'm also thinking about denoisers.
Does anyone know if OIDN (or OptiX) would work out of the box, or whether it would need to be retrained for such an MLT-ish noise pattern?
That's difficult to say, especially because the training data from both is not publicly available.
OIDN (and I believe OptiX) work fine on LuxCore with Path/MLT and Bidir/MLT, so the same should carry across to Cycles.
When it comes to neural networks, saying "should" often doesn't hold. Neural networks can have very unexpected knowledge gaps, and they may also fail miserably. The only way to know whether it works is to try it out.
Even aside from the actual data, we don't know a whole lot, and they refused to be coaxed earlier this year when I tried asking: https://github.com/OpenImageDenoise/oidn/issues/51
- We don't know what renderers were used to train OIDN so far
- We don't know how many scenes were used or the content of those scenes (did they contain subsurface scattering, volumetrics, motion blur, transparency, caustics, hair, etc.?)
- We don't know the sample counts or sampling patterns used when generating the test scenes (e.g. is denoising quality affected by using PMJ vs. Sobol?)
That it works so well in Blender so far is pretty amazing.
Is the image viewer broken in the Blender 2.91 Windows builds? I am not able to view any images in the Texture Paint panel, UV editor, or image viewer. I know it is under heavy development. Still.
I have created an open dataset for Cycles and provide it on request, which has happened 4-5 times (though due to many changes this is currently on hold). However, I don't know whether Intel or Nvidia requested it and are actually using it for the training.
I was reading this about Cycles + displacement memory optimization link
Just realized that Blender was computing all objects in my scene even while they were hidden; they weren't excluded from the render.
Any news if this will be fixed in future developments?
If not, which render engine do you recommend using that allows microdisplacement to be rendered without memory problems?
Thx in advance
If you can afford it, try the Corona Renderer. Since version 5, IIRC, it has a 2.5D displacement feature which optimizes speed and memory usage.
Pretty much every other render engine does microdisplacement better than Cycles, but I've been using Octane lately and it's insane how fast and memory friendly it is.
The Adaptive Subdivision already does some optimization based on the camera view, using a lower dicing rate for objects that are far away or outside of the view.
It's probably object-based, so if your object is huge, like the moon in your example, that sure isn't enough. But slicing the moon into several objects could potentially help already.
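For anyone who wants to try tuning this, the relevant adaptive-subdivision knobs are exposed in Blender's Python API. A rough sketch of the setup (property names as in recent Cycles versions; this only runs inside Blender, and exact names may differ between releases):

```python
import bpy  # Blender's Python API; only available inside Blender

scene = bpy.context.scene
scene.cycles.feature_set = 'EXPERIMENTAL'    # adaptive subdivision requires the experimental feature set
scene.cycles.dicing_rate = 1.0               # final-render dicing rate, in pixels per micropolygon edge
scene.cycles.offscreen_dicing_scale = 4.0    # dice geometry outside the camera view more coarsely

obj = bpy.context.object
obj.cycles.use_adaptive_subdivision = True   # per-object switch for adaptive dicing
mod = obj.modifiers.new("Subdivision", 'SUBSURF')
mod.subdivision_type = 'SIMPLE'              # simple subdivision, as typically used for displacement
```

Raising `offscreen_dicing_scale` (and the dicing rate itself) is the main lever for trading displacement detail against memory on huge objects like that moon.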
Cycles really needs better memory management, with texture tile caching and smarter displacement for GPU rendering. It's crazy to run out of VRAM so quickly even with a 12 GB video card.