What algorithm does the Wavelength node use?

Hi. My name is Sam, and I’m new to Blender Artists. You can call me by the name of my mascot, Sunshade (a bipedal green lizard which I haven’t posted any images of yet).

I’m working on a project where I create virtual renderings (in Cycles) of various real materials using their complex refractive indices at visible wavelengths (see refractiveindex.info).

Rendering realistic metals (i.e. based on real optical data, not just colored yellow to look like gold) has been discussed on this site before. The main problem with this kind of thing is that Cycles is an RGB renderer, not a spectral renderer. So, in order to solve this problem and accurately render real metals, the optical spectral data for the metals has to be converted to RGB colors.

Thankfully, Blender already has an algorithm for this kind of thing: the Wavelength node in Cycles. I am using this node as the basis for a Python script that calculates the RGB color of a metal from its visible (380–780 nm) reflectivity function.

To do this, however, I will need to understand the algorithm behind the Wavelength node. Could you please go into detail about the math used to convert from wavelength to RGB? I know it’s open source, but the code isn’t very comprehensible to me because I’m not familiar with the language it’s written in.

Thank you.

-Sunshade (Sam)

You’ll need to multiply the spectrum with the CIE 1931 2-degree XYZ color matching functions, integrate the resulting curves over wavelength to get the CIE XYZ values corresponding to the spectrum, and then multiply these by the appropriate conversion matrix (most likely you’ll want sRGB D65). The only tricky part here is normalization.
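
In Python, that procedure would look something like the sketch below. It’s only a minimal sketch: it assumes you already have the CIE 1931 2-degree color matching functions tabulated on the same wavelength grid as your reflectivity data, and it treats the spectrum as if lit by an equal-energy illuminant (to match D65 lighting you’d additionally weight each integral by the D65 spectral power distribution). All names are placeholders.

```python
import numpy as np

def spectrum_to_linear_srgb(wavelengths_nm, spectrum, xbar, ybar, zbar):
    """Integrate spectrum * CMFs and convert the resulting XYZ to linear sRGB (D65)."""
    # Multiply the spectrum with the colour matching functions and integrate
    # over wavelength (trapezoidal rule is plenty here).
    X = np.trapz(spectrum * xbar, wavelengths_nm)
    Y = np.trapz(spectrum * ybar, wavelengths_nm)
    Z = np.trapz(spectrum * zbar, wavelengths_nm)

    # Normalization: scale so a perfect reflector (spectrum == 1 everywhere)
    # comes out with Y = 1 under this (equal-energy) illuminant.
    norm = np.trapz(ybar, wavelengths_nm)
    X, Y, Z = X / norm, Y / norm, Z / norm

    # XYZ -> linear sRGB, D65 white point (standard IEC 61966-2-1 matrix).
    M = np.array([[ 3.2406, -1.5372, -0.4986],
                  [-0.9689,  1.8758,  0.0415],
                  [ 0.0557, -0.2040,  1.0570]])
    rgb = M @ np.array([X, Y, Z])
    return np.clip(rgb, 0.0, None)  # negative components are out of gamut
```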

That being said, you most likely won’t need a spectral representation. Why? Well, Cycles is missing two things for correct reflection colors on conductors: the conductive Fresnel term and an accurate Fresnel computation based on the microfacet half vector. You can implement the first one using nodes, but the second one isn’t going to happen unless you change Cycles’ code directly, and I’m fairly sure that the inaccuracy caused by that outweighs the difference caused by rendering in an RGB colorspace instead of a spectral representation.
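
For reference, the conductive Fresnel term is just the following bit of arithmetic. This sketch uses the common approximation for a complex IOR n + ik (it ignores the complex refraction angle, which is usually fine for metals), and is roughly what a node implementation would have to reproduce; the function name and example values are mine.

```python
def fresnel_conductor(cos_theta, n, k):
    """Approximate unpolarised reflectance of a conductor with complex IOR n + i*k."""
    c = cos_theta
    n2k2 = n * n + k * k
    # s- and p-polarised reflectances (common approximation for metals).
    rs = (n2k2 - 2.0 * n * c + c * c) / (n2k2 + 2.0 * n * c + c * c)
    rp = (n2k2 * c * c - 2.0 * n * c + 1.0) / (n2k2 * c * c + 2.0 * n * c + 1.0)
    return 0.5 * (rs + rp)

# Illustrative only: a gold-like red channel at normal incidence.
# print(fresnel_conductor(1.0, 0.18, 3.0))  # ~0.93
```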

Sorry, but I don’t think that any amount of Python is going to give you exact metals in Cycles.

I did exactly that (in OSL) to see what the difference would be. In my opinion it’s negligible, but see here:

And in addition to what Lukas already mentioned, Cycles’ lamps don’t have a spectrum either, so if you step away from white light, there is no way Cycles can know what the spectrum of the colored light is (that is, it’s already projected onto RGB values).

Edit:
You can even optimize the values for the complex refractive index to match the spectral version as closely as possible, which makes the difference even less visible. See here:

Cheers

Thanks a lot for the help! I will do some experimentation with refractive, color-matching, and matrix functions to try to get something realistic.

But I want to say that I’m not trying to render exact metals, only very close approximations of real metals. Rendering metals exactly how they appear would be impossible anyway, for two reasons.

First, a computer screen cannot display all the colors of human vision, only some of them.

Second, everyone’s cone cells are a little bit different, so the same wavelength of light will be perceived slightly differently by two different people. (The CIE color-matching functions are averages of the spectral sensitivities of a bunch of people.)

But I’m OK with a few slight, inevitable deviations from reality. I’m just trying to get something very close to reality, close enough that the deviations are hardly noticeable. Further, I’m rendering perfectly smooth metals (roughness set to 0.0), which should hopefully avoid most of the deviations from the microfacet half-vector Fresnel computation that Lukas mentioned.

I thought it would be helpful to show you the setup I’m using to render metals and other materials: a few objects in a small room with moderate white lighting, the light source placed high up on a wall near a corner, and a disco ball hanging from the ceiling. This is a rendering with a generic blue metal:

(If you look closely in the reflection, you can see the layout of the room that I described.)

I really appreciate your thorough, helpful replies.

-Sunshade

OK, update guys.

I decided to ditch the idea of writing a spectrum-to-RGB script. Judging by the results of prutser’s experiments, the difference between an RGB render and a spectral one really is negligible; it is very hard to see even when the two renders are placed side by side. Furthermore, since the peak sensitivity of human cone cells varies by tens of nanometers between individuals, I imagine the differences in color perception from person to person would be greater than the differences between RGB and spectral renders. I decided that all the difficulties I had been having with the spectrum-to-RGB script really weren’t worth that level of precision.

So, instead, I used prutser’s “substrate_only” OSL script for my project (crediting him, of course). For the n and k values of the three color channels, I used the wavelengths 645 nm (for red), 510 nm (for green), and 440 nm (for blue). These values are taken from this online converter. I rendered several elements using this method, and almost all of them look very, very close!
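
For anyone who wants to reproduce this, picking the per-channel n and k values is just a linear interpolation of the tabulated optical data at those three wavelengths. Here’s a minimal sketch; the numbers below are placeholders, not real measurements (the real tables come from refractiveindex.info):

```python
import numpy as np

# Placeholder tabulated data: wavelength (nm), n, k. Replace with a real dataset.
wavelength_nm = np.array([400.0, 450.0, 500.0, 550.0, 600.0, 650.0, 700.0])
n_tab = np.array([1.48, 1.40, 0.95, 0.43, 0.25, 0.17, 0.14])
k_tab = np.array([1.95, 1.90, 1.93, 2.46, 2.96, 3.15, 3.70])

# Sampling wavelengths for the three RGB channels (red, green, blue).
rgb_wavelengths = {"R": 645.0, "G": 510.0, "B": 440.0}

for channel, wl in rgb_wavelengths.items():
    n = np.interp(wl, wavelength_nm, n_tab)
    k = np.interp(wl, wavelength_nm, k_tab)
    print(f"{channel}: n = {n:.3f}, k = {k:.3f}")
```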

I will post a link to these renders on my Google Drive as soon as I compile a spreadsheet of the data I used. Until then, stay tuned; I hope you enjoy my work!

Thanks to prutser for writing the script that I am using, without which this project would have been a lot more difficult.

-Sunshade

Hey, nice that you got it to work! Keep in mind that there’s a node version of the script available here, which means GPU rendering is possible. And for what it’s worth: I also optimized many of the metals you have listed to be as close to a spectral rendition as possible when illuminated with white light. The blend file in the linked post contains all of them, plus many more materials.

Cheers!

Hey guys, I’ve actually created a new node group that greatly speeds up the interference/metal calculations. I was going to post it, but I realized there was a problem when simulating specific materials (thin film over a slightly absorbing glass). I think I’ve found/solved the problem and will be trying to finish it this week, so look forward to that, I guess!

more speed more better :smiley:

I didn’t mention this earlier, but a while ago (while exploring a way to render spectral colors) I noticed the Wavelength node in Blender didn’t seem to give the correct colors. I was able to find some functions fit to the CIE 1931 color matching functions in this article: Click!

It’s implemented in this Blender file: Click!

It should be using the sRGB D65 color space with a D65 illuminant.
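
For anyone curious what such a fit looks like in code, here’s a sketch of a wavelength-to-linear-sRGB conversion. The coefficients are the multi-lobe piecewise-Gaussian fit from Wyman, Sloan & Shirley (JCGT 2013), which may or may not be the article linked above; if the linked article uses different values, substitute those. The matrix assumes an sRGB/D65 white point.

```python
import math

def _lobe(wl, mean, inv_sigma_lo, inv_sigma_hi):
    # Piecewise Gaussian: different widths on either side of the mean.
    t = (wl - mean) * (inv_sigma_lo if wl < mean else inv_sigma_hi)
    return math.exp(-0.5 * t * t)

def cie_xyz_fit(wl_nm):
    """Analytic approximation of the CIE 1931 2-degree colour matching functions."""
    x = (0.362 * _lobe(wl_nm, 442.0, 0.0624, 0.0374)
         + 1.056 * _lobe(wl_nm, 599.8, 0.0264, 0.0323)
         - 0.065 * _lobe(wl_nm, 501.1, 0.0490, 0.0382))
    y = (0.821 * _lobe(wl_nm, 568.8, 0.0213, 0.0247)
         + 0.286 * _lobe(wl_nm, 530.9, 0.0613, 0.0322))
    z = (1.217 * _lobe(wl_nm, 437.0, 0.0845, 0.0278)
         + 0.681 * _lobe(wl_nm, 459.0, 0.0385, 0.0725))
    return x, y, z

def wavelength_to_linear_srgb(wl_nm):
    # XYZ -> linear sRGB with a D65 white point (no gamma applied here).
    x, y, z = cie_xyz_fit(wl_nm)
    r =  3.2406 * x - 1.5372 * y - 0.4986 * z
    g = -0.9689 * x + 1.8758 * y + 0.0415 * z
    b =  0.0557 * x - 0.2040 * y + 1.0570 * z
    # Single wavelengths fall outside the sRGB gamut, so clamp negatives.
    return max(r, 0.0), max(g, 0.0), max(b, 0.0)
```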

Thank you for your work on spectral colors, Jett. I really appreciate the contributions that you and other Blender Artists have given to this project; it means so much to me.

I think it would be enormously helpful to share my blend file, so that other people can make improvements to it without me having to manually copy and paste their nodes into it (and risk making mistakes that could screw stuff up).

metal_color.blend (1.4 MB)

I already have prutser’s OSL script wired into the rendering setup. Unfortunately, I was out of town for the past few days, so I have been unable to add the updates other people have made to the rendering system over the past few weeks. Because I am starting school very soon and likely won’t have a lot of time on my hands to work on this project myself, I will leave it to other Blender Artists to implement those changes. I set the optical constants to random values for fun, but you can adjust them to simulate real materials.

~ Sunshade

Soooo… any news on that new node setup? :innocent:

I actually posted it just two days ago! I thought at least everyone who replied to the thread would have gotten a notification/email when I posted a new comment (that’s what happened with me and this thread)…

Anyway, I think I’ve just worked out how to generalize the equations to lossy incident media, so I might be posting v2.1 soon! v2.1 speeds up the calculations even more, and due to the change, some materials with lossy films will look slightly different (namely the space helmet). There will also be a new output for the effective IOR of the substrate to plug into the refractive shader; it’s really fun to play with!

The next update will probably be another fix to transmission, because the current calculations don’t seem to account for absorption in the substrate (with an absorbing substrate it should be R + T < 1, but right now R + T = 1). And eventually, I want to be able to have separate roughness settings for separate layers.

Ah, after the transition to the new board my topics were set to ‘tracking’ instead of ‘watching’, so I hadn’t gotten a notification. Thanks for letting me know! :slight_smile:

Edit: why exactly would v2.1 make the materials with lossy films look different?