Negative
Thin films are different from thin surfaces; @lukasstockner97 already clarified this above.
Note that this is also on the list of things that could be added, but under the name Iridescence.
Oooo Iridescence would be cool as heck!
I'd hope for that to be a separate shader, though.
In Lux Renderer, stuff like this is handled with a mix shader of sorts that has a layer thickness instead of a blend factor, so you can coat any material with any other material at a given coating thickness. Something like that would be really cool to have in Cycles.
(also I updated my post above to show what I think the checkbox would do - is that about right?)
The idea would be to both have it included as part of the Principled BSDF as well as a separate node.
Same goes for the new Sheen and Metallic components, by the way.
Regarding the post above, it's a bit more complex - with the thin surface as described by Disney and SPI in their talks, you never have actual refraction, you just have a reflection-like component at the other side of the material, because the air->material and material->air transitions are both handled in the same bounce. There's some additional logic to match refraction and to account for multiple bounces inside the material (e.g. refract in, reflect off the virtual back side, refract back out of the front).
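A rough sketch of how those two transitions plus the internal bounces can collapse into one event (Python; Schlick's approximation stands in for the full dielectric Fresnel, and all names here are made up, not from the actual Disney/SPI implementation):

```python
def fresnel_dielectric(cos_theta, ior):
    # Schlick approximation of dielectric Fresnel reflectance
    # (a stand-in; a renderer would use the full Fresnel equations).
    f0 = ((ior - 1.0) / (ior + 1.0)) ** 2
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def thin_surface_split(cos_theta, ior):
    # Both interfaces plus the geometric series of internal bounces,
    # collapsed into a single event:
    #   T = (1-R)^2 * (1 + R^2 + R^4 + ...) = (1-R)/(1+R)
    #   Rtot = 1 - T = 2R/(1+R)
    r = fresnel_dielectric(cos_theta, ior)
    transmit = (1.0 - r) / (1.0 + r)
    reflect = 1.0 - transmit
    return reflect, transmit
```

The closed form is why this can be a single bounce: the infinite series of internal reflections sums analytically, so no actual inner path tracing is needed.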
Ah, very cool, so stuff you can't even really do right now, as far as I'm aware
(Maybe with an OSL shader)
Isn't iridescence best done with spectral? I remember they said they wanted to support thin film interference in the spectral branch after the main bits are in master.
Octane is a spectral renderer as well.
It can be done without, but it ought to be fairly straightforward to do it "properly" in the spectral branch as well.
The entire point of the Belcour paper is how to efficiently compute an accurate Fresnel term in RGB spaces without needing to do proper spectral rendering
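For a single wavelength, the classical thin-film (Airy) reflectance is cheap to evaluate; the Belcour paper is essentially about reproducing this accurately in RGB without spectral rendering. A minimal normal-incidence sketch in Python (function and parameter names are mine; polarization and oblique angles are ignored):

```python
import math

def thin_film_reflectance(wavelength_nm, thickness_nm, n_film, n_base, n_out=1.0):
    # Airy reflectance of a single dielectric film at normal incidence.
    # Sketch only; a renderer would also handle angles, polarization,
    # and conducting substrates.
    r1 = (n_out - n_film) / (n_out + n_film)    # outside -> film amplitude coeff.
    r2 = (n_film - n_base) / (n_film + n_base)  # film -> substrate
    phi = 4.0 * math.pi * n_film * thickness_nm / wavelength_nm  # round-trip phase
    num = r1 * r1 + r2 * r2 + 2.0 * r1 * r2 * math.cos(phi)
    den = 1.0 + (r1 * r2) ** 2 + 2.0 * r1 * r2 * math.cos(phi)
    return num / den
```

The wavelength-dependent cosine term is what produces the characteristic rainbow fringes as thickness or viewing angle changes.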
Yeah, I know, there are some impressive results out there using only three-channel rendering. The spectral branch is still in the works though, and I think those RGB-based techniques have an issue with keeping that spectral data around where it might be necessary. And if you render with spectra directly anyway, this stuff ought to come pretty much for free (or exactly as hard or easy as caustics generally are).
Like, for instance, if you do a double-prism setup, it ought to be possible to first split white light into its spectrum and then recombine that spectrum back into white light. I'm not sure an RGB-based method could quite pull that off.
Maybe it totally works out, I don't actually know.
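For what it's worth, the per-wavelength physics of the prism case is simple; here's a Python sketch using a Cauchy dispersion fit and Snell's law (coefficients are roughly BK7-like glass, and the names are mine):

```python
import math

def cauchy_n(wavelength_nm, A=1.5046, B=4.2e3):
    # Cauchy dispersion model n(lambda) = A + B / lambda^2.
    # Coefficients are approximate, BK7-like values.
    return A + B / wavelength_nm ** 2

def refract_angle_deg(theta_in_deg, wavelength_nm):
    # Snell's law, air -> glass, per wavelength. Blue light sees a
    # higher IOR and therefore bends more than red.
    n = cauchy_n(wavelength_nm)
    return math.degrees(math.asin(math.sin(math.radians(theta_in_deg)) / n))
```

A spectral renderer applies exactly this, one sampled wavelength at a time, which is why dispersion "just works" there.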
Yeah, absolutely, spectral rendering is still a very interesting topic. Just saying that we don't need it for iridescence in particular.
Same with complex IOR - both iridescence and complex IOR are already (kinda) possible in regular Cycles using elaborate math nodes to derive the correct colors. I've seen and used such nodes before and they are kind of a nightmare.
I've also built an iridescence shader and a complex IOR shader in an old version of the spectral branch (predates Cycles X), and it's actually easier because you don't have to juggle and carefully attenuate the separate channels. Instead, you just go "for each wavelength, do this".
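As an illustration of the "for each wavelength, do this" pattern, here's a hedged Python sketch of normal-incidence conductor Fresnel evaluated over a few tabulated wavelengths (the n/k values are rough, illustrative numbers in the ballpark of gold, not measured data):

```python
def fresnel_conductor_normal(n, k):
    # Normal-incidence reflectance for a conductor with complex IOR n + ik.
    return ((n - 1.0) ** 2 + k ** 2) / ((n + 1.0) ** 2 + k ** 2)

# Illustrative complex-IOR samples, wavelength_nm: (n, k).
gold_nk = {
    450: (1.40, 1.88),
    550: (0.43, 2.46),
    650: (0.14, 3.37),
}

# "for each wavelength, do this" - no channel juggling required:
reflectance = {wl: fresnel_conductor_normal(n, k) for wl, (n, k) in gold_nk.items()}
```

The per-wavelength loop replaces all the careful RGB attenuation bookkeeping; the gold-like data naturally comes out more reflective toward red.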
The RGB and spectral implementations of complex IOR were compared too, and honestly, if you pick exactly the right colors in RGB, the results were remarkably close.
Though it's really cool that you can just plug in real measured spectra and know that the end result is basically perfect. If you are going for real materials, that's extremely useful imo.
Iridescence:
And complex IOR:
Both of those have equivalent three-channel nightmare node setups.
Rebuilding Cycles into a spectral renderer is a much bigger task. Sure, once all of that work has been done, it makes things just work for "free", but there's a big cost up front.
Only if you go through the trouble of decomposing colors and IORs yourself
Not quite sure what you mean, but if you are saying you'd have to manually specify a different IOR for each wavelength, that can actually be really easy to do in a spectral renderer: you simply have a curve that represents the spectrum. And if you are aiming at a measured material, you can directly take the data from that measurement and import it.
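A tiny Python sketch of that idea, with made-up sample points standing in for a real measurement:

```python
import numpy as np

# Hypothetical measured IOR samples (wavelength_nm, n); real data would
# come from an actual measurement or a material database.
measured = np.array([
    [400.0, 1.470],
    [500.0, 1.462],
    [600.0, 1.458],
    [700.0, 1.455],
])

def ior_at(wavelength_nm):
    # The "curve" is just piecewise-linear interpolation of the samples,
    # so the shader can query n(lambda) at any sampled wavelength.
    return np.interp(wavelength_nm, measured[:, 0], measured[:, 1])
```

Importing measured data then amounts to loading the table; no manual per-wavelength authoring needed.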
I mean, we basically already had something working. It wasn't up to snuff, but to get a better thing in, the first groundwork is being laid already:
https://developer.blender.org/D15276
This is a tiny patch, but it's a baby step towards Hero Wavelength-based spectral sampling. Pembem said there would be two more commits soon, one of which is going to be much bigger.
All new features take time though, that's kind of a given.
I wonder, how is this task related to spectral rendering? Is it so the same functions can be called to process spectral data as well as RGB?
Yeah, basically. Because these functions are overloaded, you can simply reuse all these data types when building new functions that expect spectra instead of RGB values, rather than duplicating tons and tons of code with separate versions for RGB and spectra.
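As a loose analogy in Python rather than Cycles code: arithmetic written to be agnostic about the channel count serves both RGB and spectral data without duplication (the function here is my own example, not from the patch):

```python
import numpy as np

def attenuate(throughput, absorption, distance):
    # Beer-Lambert attenuation. Works unchanged for a 3-channel RGB
    # throughput or an N-channel spectral one, because nothing in the
    # arithmetic cares how many channels the arrays have.
    return throughput * np.exp(-absorption * distance)

rgb = attenuate(np.ones(3), np.array([0.1, 0.5, 0.9]), 2.0)
spectral = attenuate(np.ones(8), np.linspace(0.1, 0.9, 8), 2.0)
```

In C++ the same effect comes from overloading/templating on the color type, which is what the groundwork patches set up.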
In principle, if anything of the sort ever comes up, it could also help with things like multichannel outputs (where you have more than three color channels). Not going to be super useful to most people, but some scientific applications come to mind, or some fancy, currently mostly research-level display tech with five or more primaries.
But those are speculative and may never come to Blender. Who knows. Mainly this is for spectra right now.
I think it would be a good addition to the online manual (as an expandable section), or maybe to the PDF (as an extra version with links to those expandable parts), to have a "best related tutorials" section for each advanced topic in the manual. Then users could pick up the related knowledge around each feature. I often learn by running into a problem and searching, but I frequently lose a lot of time trying to find the right course, one that explains everything, not just the typical solutions.
Not everything on the internet is captured, and not everything is captured as often as you'd wish. For example, one branch of a small organization asked the other branch, which was in charge of updating the website, to promote an upcoming event. They didn't change the website until the day before the event (which honestly does not matter, because very few people actually visit the website) and lied about it. I tried to use the Wayback Machine to prove when the home page changed to mention the event, but it had been months since the Wayback Machine last took a snapshot.
I think the second one is already in as well:
https://developer.blender.org/D15318
Now just waiting for the big one
FYI, I've just pushed the initial version of the "Principled v2" branch and the associated developer.blender.org task where I'll be tracking the current state.
Just to be safe, I'll copy-paste the note from the top there:
This is an ongoing project, and nothing is 100% certain yet. Additional components/options might be added, existing components/options might be removed and the behavior of existing options might change.
Also, Principled v2 is just a placeholder name until someone comes up with something better.
I'll try to get the buildbot to build the branch in case someone wants to try it out, but note that many things are still broken and/or unfinished (see the task for details). There's also a feedback thread on devtalk.blender.org.