Indeed, it looks like the fix focused on the use of lamps and neglects the case of the Sun Disc from the Sky texture.
If you replace the Sun from the Sky texture with a Sun lamp, the problem should disappear.
Mesh lights seem to be problematic, too.
Tizian Zeltner is happy to help implement his 2022 SIGGRAPH paper “Practical Multiple-Scattering Sheen Using Linearly Transformed Cosines” in the new Principled BSDF v2!
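The core trick behind that paper (building on earlier LTC work by Heitz et al.) is that a cosine lobe pushed through a 3×3 linear transform remains a normalized distribution, which makes the resulting sheen lobe cheap to evaluate and sample. A minimal numpy sketch of that property follows; the transform matrix here is an arbitrary illustrative value, not a fitted sheen matrix from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def clamped_cosine(w):
    # Original distribution D_o: clamped cosine lobe around +Z, integrates to 1.
    return np.maximum(w[..., 2], 0.0) / np.pi

def ltc_density(w, m_inv, det_m_inv):
    # Linearly Transformed Cosine density:
    #   D(w) = D_o(M^-1 w / |M^-1 w|) * |det M^-1| / |M^-1 w|^3
    wo = w @ m_inv.T
    norm = np.linalg.norm(wo, axis=-1, keepdims=True)
    return clamped_cosine(wo / norm) * abs(det_m_inv) / norm[..., 0] ** 3

# A transform that stretches and shears the cosine lobe (hypothetical values).
M = np.array([[0.5, 0.0, 0.3],
              [0.0, 0.8, 0.0],
              [0.0, 0.0, 1.0]])
M_inv = np.linalg.inv(M)
det_M_inv = np.linalg.det(M_inv)

# Monte Carlo check: the transformed lobe still integrates to 1 over the sphere.
n = 400_000
w = rng.normal(size=(n, 3))
w /= np.linalg.norm(w, axis=-1, keepdims=True)  # uniform directions on the sphere
integral = ltc_density(w, M_inv, det_M_inv).mean() * 4.0 * np.pi
print(f"integral over sphere: {integral:.3f}")  # close to 1.0
```

Sampling works the same way in reverse: draw a direction from the plain cosine lobe, apply M, and renormalize.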
Jeffrey is working on fixing a bug with sun light sampling. There was some discussion on the best solution and practical implementation issues.
Brecht would still like to do multi-threading and instancing optimizations.
AMD HIP:
Linux RDNA2 crash bug was confirmed by AMD team and is being investigated.
HIP-RT support is still being worked on. The code should be submitted for review even if the driver is not out yet.
Apple Metal:
Tuning for the latest kernel changes is still to be submitted.
Support for NanoVDB is also expected to land.
OptiX OSL:
Patrick still plans improvements to better match noise functions with CPU.
Brecht also suggested looking at supporting texture file path loading through SVM, even if that would not support all the same filtering options as OIIO.
Other:
Sebastian found discrepancies between lights that use ray visibility to keep them from affecting diffuse or glossy surfaces. A patch to address this was submitted; Brecht will review. There was some discussion on the best approach for this and its performance impact.
Weizhen has been working on integrating a new Hair BSDF. The biggest challenge right now is elliptically shaped hair fibers, which raise questions about how best to ray-trace curves with such cross-sections (if at all), how to compute the normal direction for the ellipse, and how to deal with Catmull-Rom splines not being C2 continuous.
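The C2 issue can be seen directly: uniform Catmull-Rom segments match first derivatives at the joins, but the second derivatives generally jump. A small self-contained sketch with arbitrary control points:

```python
def catmull_rom(p0, p1, p2, p3, t):
    # Uniform Catmull-Rom segment interpolating p1..p2, t in [0, 1].
    return 0.5 * ((2.0 * p1)
                  + (-p0 + p2) * t
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t * t
                  + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t ** 3)

def d1(p0, p1, p2, p3, t):
    # First derivative of the segment with respect to t.
    return 0.5 * ((-p0 + p2)
                  + 2.0 * (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t
                  + 3.0 * (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t * t)

def d2(p0, p1, p2, p3, t):
    # Second derivative of the segment with respect to t.
    return 0.5 * (2.0 * (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3)
                  + 6.0 * (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t)

pts = [0.0, 1.0, 3.0, 2.0, 0.0]  # arbitrary 1D control points

# Compare the two segments meeting at the knot between pts[1..4] and pts[0..3]:
first_a, first_b = d1(*pts[0:4], 1.0), d1(*pts[1:5], 0.0)
second_a, second_b = d2(*pts[0:4], 1.0), d2(*pts[1:5], 0.0)
print(first_a, first_b)    # equal: the spline is C1
print(second_a, second_b)  # differ: the spline is not C2
```

Any quantity derived from the second derivative (such as curvature along the fiber) is therefore discontinuous at the knots, which matters for shading elliptical cross-sections consistently.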
Patrick made progress on resumable rendering and denoising, with the ability to store and load EXR caches for an animation sequence. The next step is loading the cache for the previous frame, which requires some work as this denoising code path does not support it yet.
Lukas continues work on completing the Principled BSDF v2. The plan is to start merging individual parts, like new multiscatter GGX, new sheen closures in the Velvet BSDF, and then at the end put it all together in the Principled BSDF.
Christophe Hery mentioned the Subsurface IOR patch, which should also become part of this. It is a separate IOR from the specular one: skin has multiple layers, so this IOR can be considered to belong to another layer.
Practical Info
This is a weekly video chat meeting for planning and discussion of Blender rendering development. Any contributor (developer, UI/UX designer, writer, …) working on rendering in Blender is welcome to join and add proposed items to the agenda.
For users and other interested parties, we ask that you read the meeting notes instead, so that the meeting can remain focused.
Next Meeting: January 24, 2023, 5 to 6 PM Amsterdam time (your local time: [date=2023-01-24 time=17:00:00 timezone="Europe/Berlin"] 2023-01-24T17:00:00Z)
Just a quick note: a new Path Guiding branch build has dropped (cheers Sebastian), which now includes glossy guiding as well as diffuse.
Some tests are being posted here, and early results look promising.
edit: the algorithm has been updated to include a user-editable glossy threshold, which should improve guiding for glossy materials with lower roughness levels.
For some scenes, CPU with path guiding converges much faster than brute-forcing it on the GPU, so even without GPU support currently, path guiding is still a benefit.
But I agree, path guiding on the GPU will be a welcome addition, although it may take a while, since I think the intent is to get the various ray types guiding in a stable manner before trying to port it to GPU.
The main problem is, and I guess everyone can easily test this, that even without path guiding the GPU is a lot faster, thanks to modern denoising.
So if Octane and Redshift can do it, it's just a matter of time and effort.
On the other side, Cycles development has tons of work to do, but seems to lack manpower. I don't know why the BSDF v2 branch hasn't been progressing for half a year.
Yup… that's the one that I'm looking forward to the most. When I heard the presentation at the Blender Conference, I thought this was on a fast track to be implemented… but unfortunately it doesn't seem to be, nor is it really mentioned in the 2023 roadmap.
IMHO, getting Cycles to produce more realistic renders should take priority over everything else. The “look” of Cycles is a massive problem with Blender.
Please… stop. You're giving me nightmares. Mental Ray in 3ds Max was the first proper raytracer I ever learned/used. There was such a crazy amount of fiddling and hacking with renderers back then.
I do not know why people keep bringing up the “Cycles look” (that is, unless they are only using the stock Filmic settings with the Principled node).
What I use instead is my own node groups built from the basic building blocks, along with the new AgX. Not using Principled already makes a difference, but my work now has another layer of realism with the new tonemapping. Also, don't forget the BF's rich tradition of bad default values (e.g. min bounces being 0 when it should be 2 or higher, the distance value in the Bump node being 1 when it should be 0.1 or lower, default albedo values of 0.8 when that is actually considered bright, etc.).
I always use sRGB linear output; I've been doing that for 20 years and still see no reason to switch. ACES is an improvement focused on red tones, for better skin-color grading. It's not really important for 3D, and unless you render 8-bit color outputs, there is no need to apply it before compositing.
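For reference, the difference between scene-linear data and display sRGB is only the per-channel transfer function; a minimal sketch of the standard IEC 61966-2-1 curves:

```python
def linear_to_srgb(c):
    # IEC 61966-2-1 sRGB encoding (per channel, c in [0, 1]).
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1.0 / 2.4) - 0.055

def srgb_to_linear(s):
    # Inverse transfer function (display value back to scene-linear).
    if s <= 0.04045:
        return s / 12.92
    return ((s + 0.055) / 1.055) ** 2.4

# 18% scene-linear mid-grey encodes to roughly 0.46 in display sRGB.
print(linear_to_srgb(0.18))
```

This is why compositing on float linear data loses nothing: the encoding step can always be applied (or inverted) losslessly at the very end.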
It's partly used in advertising, especially if you need to follow very precise coloring guidelines. And also because the Redshift integration is so smooth and deep that a lot of people simply made that step.
Even the AI-based denoising benefits greatly if it has a more converged image to work with. I am sure that you could get something passable from an insanely undersampled image with the right data in the passes (mainly with OptiX, as it often guesses brightness levels higher than OIDN does), but you will never get the more subtle and/or smaller details that would bring it well above the imagery that made use of photons with final gathering.
Slowly but surely it has gained momentum and is now standard in programs like Maya, Houdini, Mari, and the Substance tools. I suppose ACEScg is just the next stage on from sRGB.
I would prefer to see what the render output looks like directly in the buffer. Same applies for authoring textures.
As for “switching to ACES”, the way I see it is that there isn't much to switch. The workflow is pretty much the same.