A complex approach - Iridescence in Cycles

Actually, the Wavelength node already gives you the color. And for the thin-film problem, where the width of the film goes from 200 nm to 800 nm, the wavelength results can be considered accurate, since those thicknesses influence just one wavelength.
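For the single-wavelength thin-film case described above, the phase math can be sketched in a few lines. This is a deliberately minimal two-beam model (function name is mine, and it ignores Fresnel amplitudes and polarization), just to show how thickness and wavelength interact:

```python
import math

def thin_film_intensity(wavelength_nm, width_nm, ior, cos_theta_i):
    """Two-beam thin-film interference intensity in [0, 1].

    Minimal model: single film in air, equal-amplitude beams -- only the
    phase difference between the ray reflected at the top surface and
    the one that crossed the film and came back is considered.
    """
    # Snell's law gives the refraction angle inside the film
    sin_t = math.sqrt(max(0.0, 1.0 - cos_theta_i ** 2)) / ior
    cos_t = math.sqrt(1.0 - sin_t ** 2)
    # optical path difference between the two reflected beams
    opd = 2.0 * ior * width_nm * cos_t
    # pi shift from the reflection at the first (air-to-film) interface
    phase = 2.0 * math.pi * opd / wavelength_nm + math.pi
    return 0.5 * (1.0 + math.cos(phase))
```

At normal incidence, a 200 nm film with IOR 1.25 puts a 500 nm wavelength exactly at a destructive minimum, which is the kind of single-wavelength behavior the post refers to.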

I’ve been messing around with the math, and to find the distance travelled by the ray, I’ve been using this formula:
Distance = width * 2 * tan(asin(sin(acos(I·N)) / IOR))
where width is in nanometers and N is our imposed normal (not the surface normal).
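In plain code, with Snell's law applied explicitly, that distance calculation can be sketched as follows (the function name is mine, chosen for illustration):

```python
import math

def lateral_distance(width_nm, dot_i_n, ior):
    """Sideways distance covered while crossing the film twice.

    dot_i_n is I.N, the cosine of the incidence angle against the
    imposed normal; width_nm is the film thickness in nanometers.
    """
    theta_i = math.acos(dot_i_n)                  # incidence angle
    theta_t = math.asin(math.sin(theta_i) / ior)  # Snell's law
    return width_nm * 2.0 * math.tan(theta_t)
```

At normal incidence (I·N = 1) the ray goes straight through, so the lateral distance is zero, and it grows as the ray becomes more grazing.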

But things get a bit more complicated when the width is far bigger (even worse with multiple layers), and we start having multiples of some wavelengths combined with multiples of other wavelengths.
I still haven’t found a way to do this… maybe OSL could be useful for this… :confused:

I’ve been experimenting with superimposing multiple orders of wavelengths, from 1 through 4, but I still have no way of computing the necessary things, purely because I can’t access the “light ray”. If we could use the vector that goes to the light sources, it would be much easier to compute this whole thing.

Also, this here explains it pretty succinctly: http://www.gamedev.net/page/resources/_/technical/graphics-programming-and-theory/thin-film-interference-for-computer-graphics-r2962

I think OSL might be handy for this, unless we go into multiple layers, in which case, as said in that link, we need to use matrix multiplication methods, which we have no way of doing through nodes.


That’s the best I can get it to work (for now :D), and you can see the complexity of the node setup. It’s ridiculous. At least it works for all thicknesses and such. It doesn’t work for multiple-film interference though; that’s nearly impossible with nodes unless you create something three times as complex as this.

That was exactly my initial thought, and the only ways you can do it are to code a new BRDF (the final objective, just as we have velvet or anisotropy), hardcode the light position in the shader (not practical at all, but fast results), or use the trial and error that I explained in my first post (computationally expensive).

By the way, those papers from gamedev.net seem quite interesting. I must read them with a fresher mind.

For now, here’s my latest test. It’s still rendering, and after 12,000 samples it still has some noise in the iridescent layer.


I’m still thinking about how we could calculate the color for big thicknesses, as this is almost like a primality problem… finding the multiples of every wavelength within a specific distance…
How did you overcome this? Did you use the technique described in those papers?


When it comes to hardcoding stuff in the source itself, you may either need to code a new interference/iridescence BSDF or code a new vector type that gives the angle from a surface to the known emission sources.

Another possibility that could help is coding a ‘ray angle’ mask type that gives the maximum angle between the direction of the ray and the direction of the normal along with a ‘light ray’ lightpath flag that is given to the final bounce of a ray (before it hits the light).

In a way, the BSDF solution would be a shortcut to these types of effects, while the vector/ray data suggestions would allow for universal application across all of the shading types. Those who already have a copy of the source and know how to build it can use this commit as a guide should they decide on the BSDF route. Anyone who’s curious and doesn’t have a compiler on their machine can dig around the source code here (though I have no idea where they keep the bulk of the lightpath and vector information).

The sphere looks awesome - like moonstone.

Yes, the idea is to have an interference/iridescence BSDF in the end (volumetric support would be the cherry on top :D). Mainly because Cycles already tries to throw a ray to a light on every bounce, which minimizes the need for all the superfluous rays that, for example, I throw with this ‘blind’ technique.
But before jumping into that, I believe the algorithm for the interference calculations still has a long way to go.
Not only in the math involved, but also in designing an interface simple enough for people to use, as this effect can get quite complex in terms of variables.
Once we get it working OK, it won’t be difficult to implement in Cycles (I believe… :confused:)

I’m going to read the papers that Microno posted, to search for good solutions to the multiple-wavelengths problem.
(It will take me a while, with my rusty head :))

@moony, thanks… it still misses a lot of inner reflections to look exactly like a moonstone… But we’ll get there :wink:

Downloaded the blend from earlier in the thread (complex nodes :D) - but I have a question:

What is the noise texture doing? I unplugged it from the node setup, and the pattern of colours remained largely unchanged (they just got brighter and smoother).

Hi again. I really don’t know the math behind this type of effect, but I did manage to create something that could be useful for this thread.
Cycles_spectral.blend (556 KB)


My approach here was to do the old RGB add trick with the velvet nodes and add a glossy shader on top. I used velvet shaders because of their unique way of creating highlights.

@moony, from what I can see in the file from Microno, the noise serves to create some roughness in the iridescence layer. Changing the values in the Bump node can give interesting results.

@Ace Dragon, Velvet can be quite useful for some specific effects… I hadn’t thought about it, but it came to mind that anisotropic glossy can also be very useful.

In the meantime I’ve finished the thin-film shader, and I think it’s quite accurate. Here’s an image with soap bubbles:


and here’s the blend:SoapBubble.blend (1.32 MB)

I haven’t included the background image, as I don’t know the permissions for it, but the blend has the width map packed.

The width of the bubbles varies from around 30 nm (mostly at the top) to 700 nm (more at the bottom). There’s also a small trick to flip the IOR on the backside of the bubbles, as the IOR there would otherwise still be the same as the outside one.
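The back-face IOR flip mentioned here is the usual thin-shell trick (in Cycles one would typically drive it with the Geometry node’s Backfacing output). As plain code, the idea is roughly this (function name is mine):

```python
def effective_ior(ior, backfacing):
    """Front faces see air-to-film (IOR); back faces see film-to-air (1/IOR)."""
    return 1.0 / ior if backfacing else ior
```

So a bubble with IOR 1.33 uses 1.33 where the ray enters the film and about 0.75 where it leaves it again.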

As for the Iridescence group node, you can use it in a lot of different situations. I’m planning to keep testing different values to approximate other materials.


Another type of iridescence I wanted to recreate was diffraction grating, which we most commonly see on CDs or holographic tapes.
So, based on chapter 8 of GPU Gems, I’ve built a shader that can replicate this effect.
Here’s an example where I tried to mimic the material from a toothpaste package (:D). I’ll post an animation tomorrow (still rendering :)), where the effect is best seen.


As you can notice in the blue-to-red gradient around the specular center, the diffraction is dependent on the light direction, as it is supposed to be.
My shader is still a bit messy and not yet accurate, as I haven’t quite managed to get the correct reflection angle from the grating size, but most of the math involved is already in place, and I hope to be able to share it with you tomorrow.

Edit: just added the video ;). Here it is:


It took me a while to get the nodes working, but here’s a working result.
DiffractionGratingGroups.blend (753 KB)

I decided on this structure because when the grating distance is low, there is no point in having lots of orders. Also, for some specific needs, one should add more frequencies (the DiffractionGratingLambda node) to the DiffractionOrder.

The rest of the node groups are just utility nodes: one for quaternion multiplication, one for vector rotation, and one switch for vectors.
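For readers curious what those utility groups compute, here is a rough Python equivalent of the quaternion-multiplication and vector-rotation groups (the names are mine, and the actual node groups may order components or signs differently):

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def rotate_vector(v, axis, angle):
    """Rotate vector v around a unit-length axis by angle (radians)."""
    half = 0.5 * angle
    s = math.sin(half)
    q = (math.cos(half), axis[0] * s, axis[1] * s, axis[2] * s)
    q_conj = (q[0], -q[1], -q[2], -q[3])
    # rotated vector is the vector part of q * (0, v) * q_conjugate
    w = quat_mul(quat_mul(q, (0.0,) + tuple(v)), q_conj)
    return (w[1], w[2], w[3])
```

For example, rotating the X axis a quarter turn around Z yields the Y axis, which is the kind of normal rotation the shader performs around the tangent vector.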

How does one use this node group? I tried it out on the cube in the file, but all I got was a black cube with a very faint specular-type reflection.

I have seen it, and I still advise everyone who doesn’t require such a physically based result to use your method, as the result is pretty good for a normal project (especially if you have deadlines ;)).

Basically this works as an extension of the reflection, and it should be mixed with a glossy or an anisotropic shader. But for a better understanding of how to use the nodes, I think it’s better to explain what’s really happening :eyebrowlift:!

In diffraction grating, besides the normal reflection, the interference created by the slits on the surface results in bands that are visible in directions that don’t correspond to the reflected vector for that surface normal. And to make it more complex, each wavelength from a light ray gets reflected at one particular angle… (well, they get reflected in all directions, because it is still a rough glossy surface, but the interference makes them visible only from specific positions).

Because the wavelength is one of the major variables in the resulting out-vector, this produces a rainbow of colors, where low wavelengths end up closer to the true specular vector (which is actually the zero order) than higher wavelengths. Also, because the wavelengths cycle, there can be a second order of diffraction, and a third, and so on, each of them represented by a rainbow.

The number of orders depends basically on the distance between the slits. Low values (400-1000) give just 1 or 2 orders; higher values give more. So one should take that into consideration when using the node setup.

From my last experiment (before my computer broke down :() I found that using the DiffractionGratingLambda for each wavelength is quite memory expensive, and it has the problem that, if we are not using a big anisotropic effect, the rainbow appears broken, jumping from one wavelength to another.

A better approach was to input a random value into the wavelength slot and use just one DiffractionGratingLambda node for each order.
This can be a bit noisy with low samples, but it’s faster and more accurate.
For the randomness, one can use a noise texture with a very high scale, then multiply by 600 and add 300… This should give values from 300 to 900, which is around the visible spectrum.
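In node terms that is just Noise → Multiply 600 → Add 300; as a formula (function name is mine for illustration):

```python
def noise_to_wavelength(noise_value):
    """Map a uniform 0..1 noise sample onto 300-900 nm,
    roughly bracketing the visible spectrum."""
    return noise_value * 600.0 + 300.0
```

Note the range deliberately overshoots the visible band (roughly 380-750 nm) a little on both ends.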

Another important thing is the vector for the grating (the tangent vector). It’s around this vector that the shader transforms the new normal, and for good control of the final look, it’s important to have this set up correctly.
(I’m still installing a new OS on my PC, but as soon as possible I’ll post some useful materials.)

But if anyone still wants to experiment, here’s some useful info:
-CD material (just the metallic part):
Distance is around 1600, N can go up to 3 or 4 (one node for each), and the tangent can be the default. Mix everything with an anisotropic shader.

-Holographic material:
Distance is around 700, N can be up to 2 but 1 is OK, and the tangent is variable. One can use a Voronoi cell for the rotation of the tangent, or, if for example you are trying to replicate the holographic patterns on a 50€ bill, a texture map should be made. With a lamp, you can check the tangent vector of each shape in the hologram and accurately reproduce it in Blender :).

-Feathers and butterfly wings:
Distance can vary from 400 to 900. Just the first order is enough, but the tangent should be more elaborate. Low values for the distance result in a bluish velvet specularity.

I’ll make some graphics explaining a bit more how to control the results as soon as I have my pc up and running.

Finally got everything running here! :smiley: Five days without Blender and I was already going mad!!

So here’s a file with 2 simple scenes showing how to set up the nodes.
Diffraction_examples.blend (986 KB)

and a preview of one of the scenes.


I also added a color input to the node group, to give more control over the results.

I had to remove the HDR background, because BA wasn’t letting me upload the file with the image packed… But I’m sure you all have some nice envs to put in :slight_smile:

I’ll now start making some fancy graphics explaining the behind-the-scenes of everything…
Happy Blendings

@moony, that’s a very ingenious approach, and far more practical for some fainter results.
I’ve found out that my way has some flaws, most of them because I’m dealing with statistics: the noise input isn’t giving me uniform values from 0 to 1, and this makes every result greener than it should be…
I’m on a deadline now, but next week I’ll try to even this out.

After looking again at my previous node setup, and after experimenting with CDs (which have a known slit distance of 1600 nm), I’ve found some errors in the math I’d used. So for those of you who downloaded the file from post #33, you can now replace it with this new one: Diffraction_Nodes.blend (716 KB).
There are a lot of changes, but the main one is the VectorRotation node, which was previously based on quaternion multiplications; I’ve simplified it a lot to make it faster.

So, how to use these nodes…
First we must understand what diffraction really is.
When light hits a surface that has a grating, each slit will bounce the light in a circular pattern that interferes with the bounced light from all the other slits. The result is that each wavelength will only be visible from angles that satisfy: Distance * sin(LightAngle) - Distance * sin(CameraAngle) = multiple of the Wavelength.
This produces the following effect:


where the order numbers are the multiples of each wavelength. The zero order corresponds to a regular reflection, with all the wavelengths reflected in the same direction [sin(LightAngle) = -sin(CameraAngle)].

In the image above, the grating distance is 2000 nm. Smaller values make the order separation wider, and bigger values make the orders closer to each other (producing more visible orders!).

In my approach, I used the same formula, but instead of trying to find where the light goes, I’m more interested in finding where the light can come from. With a bit of algebra, we get that for a specific wavelength and camera angle, the light angle must be one of the following possibilities:
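Rearranging the grating equation for the light direction gives sin(θ_light) = sin(θ_camera) + m·λ/d, keeping only solutions whose sine stays within [-1, 1]. A hedged sketch of that enumeration (the sign convention and the function name are my own; the node setup may differ):

```python
import math

def candidate_light_angles(theta_cam_deg, wavelength_nm, distance_nm, max_order=4):
    """Angles (in degrees) a light could come from so that
    d*sin(theta_light) - d*sin(theta_cam) = m*wavelength holds."""
    s_cam = math.sin(math.radians(theta_cam_deg))
    angles = []
    for m in range(-max_order, max_order + 1):
        s_light = s_cam + m * wavelength_nm / distance_nm
        if -1.0 <= s_light <= 1.0:  # keep only physically reachable directions
            angles.append((m, math.degrees(math.asin(s_light))))
    return angles
```

Order 0 always returns the mirror of the camera angle itself, and each further order adds one more candidate direction per wavelength, which is exactly what the sampling setup below has to cover.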

(continues)


To sample each of these directions, I use a fake normal connected to a glossy shader. This way it’s possible to control the direction to sample, since a reflection angle is always the negative of the incoming angle. So to sample a direction, I just need a normal that is exactly in between the incoming and outgoing directions.


As you can see in the picture, Phi = (Beta + Theta) / 2, and this is the angle I use to rotate the normal vector.
The axis of rotation is the tangent vector, so this has a great influence on the end result.
I added a Rotation2Tangent node to make this more practical. Its input works like the anisotropic rotation, so 0 means 0°, 0.25 corresponds to 90°, and so on.
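The same half-angle trick in vector form: the fake normal is simply the normalized half-vector between the incoming and desired outgoing directions (an illustrative sketch with a name of my own, not the exact node math):

```python
import math

def half_vector(incoming, outgoing):
    """Normalized vector halfway between two unit directions.

    Used as the fake normal: a mirror reflection about this vector
    maps the incoming direction onto the wanted outgoing one.
    """
    h = [i + o for i, o in zip(incoming, outgoing)]
    norm = math.sqrt(sum(c * c for c in h))
    return [c / norm for c in h]
```

For perpendicular X and Y directions this gives the diagonal at 45°, matching Phi = (Beta + Theta) / 2.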

Unfortunately, we still have to sample each of the possible directions for all the wavelengths. In the future this could be solved with OSL, once they implement closures and integrators in the language! :slight_smile:

In the blend file there’s an ‘OrderSolver’ node that will try to sample a specific order. The ‘OrderSolver’ is composed of 5 ‘WavelengthSolver’ nodes, but this can be changed… For example, in the image I’ve posted, you can see that for order -1 one just needs the red, green and blue wavelengths, as they are close together… but for order -4 one may need to sample other wavelengths to get a better spectrum…

There’s also a ‘Diffraction_Example’ node to show how to assemble the nodes together.

And finally, here’s an example of what the shader can do:



jaw drops

Spectacular!