Will Cycles get dispersion effects?

I played with another external renderer (I think it was either LuxRender or Mitsuba) and it had this great dispersion effect: for materials like glass it could cause white light to disperse into the entire color spectrum, like a prism does. It could also be activated to simulate the colors on the surface of a bubble or an oil film. Is this possible in a simple path tracer like Cycles?

There’s a way to “fake it” with a node setup; Cycles cannot do it normally because it only shoots rays from the camera.
Here’s a tutorial from CG Cookie: https://cgcookie.com/lesson/dispersion and here’s a shader you can buy:
https://blendermarket.com/products/prism-fast-advanced-glass-shader-for-cycles

The reason Cycles cannot do it normally is that it’s an RGB engine and not a spectral one.

Redoing the way Cycles does lighting and shading to be spectral-based would make an implementation of dispersion straightforward, but it would likely come at the cost of longer renders and chromatic noise (though there have been a few recent papers that help with that).
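One approach in that vein is hero wavelength spectral sampling, which traces several wavelengths along each path instead of just one to cut down the chromatic noise. A minimal sketch of the idea in Python, not anything from Cycles; the range bounds and count here are illustrative:

```python
import random

# Assumed visible range for sampling, in nanometres.
LAMBDA_MIN, LAMBDA_MAX = 380.0, 730.0
SPAN = LAMBDA_MAX - LAMBDA_MIN

def hero_wavelengths(n=4):
    """Pick one random 'hero' wavelength, then derive n-1 more by
    rotating it around the visible range. Evaluating all n along the
    same path amortises the tracing cost and averages out much of the
    colour noise that a single wavelength per path would produce."""
    hero = LAMBDA_MIN + random.random() * SPAN
    return [LAMBDA_MIN + (hero - LAMBDA_MIN + i * SPAN / n) % SPAN
            for i in range(n)]

print(hero_wavelengths())  # four wavelengths spread evenly over the spectrum
```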

This.

Slight addendum: Ace Dragon’s comment essentially amounts to a “rewrite” of Cycles. This will probably never happen unless Brecht or one of the other Cycles devs gets handed a truckload of funding. Even then, the usefulness of dispersion effects is unfortunately pretty limited, and 90% of that kind of stuff can be faked either via clever material setups or in compositing software. It would probably take less time to fake it in Nuke than to get a clean render outright.

While I agree that it’s really a feature that isn’t super useful (I hardly ever use it at work with Arnold, which has pretty fast dispersion), is it true that a renderer “must” be spectral? I didn’t think either RenderMan or Arnold was. Or does the transmission shader there interpret RGB rays as spectra? Or are these approaches not PBR?

Mental Ray was able to do dispersion 10 years ago (same with the original YafRay). There was even a patch once that would’ve allowed BI to do dispersion, but that by no means indicates those engines could do it as accurately or with the same quality as a spectral renderer (for instance, RGB renderers might simulate a finite number of colors and combine them for dispersion effects, while engines like LuxRender can sample anywhere in the spectrum).

RGB renderers can simulate dispersion by having slightly different IORs for red, green, and blue. I believe that this is what Arnold is doing (along with their usual special sauce in the form of importance sampling wizardry). But to get true dispersion you need spectral sampling, and to get true dispersion in any reasonable amount of time you need MLT, bidirectional tracing, or some combination of the two, which instantly takes you out of the field of production-ready renderers.
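For illustration, here is roughly what that per-channel trick amounts to: one Snell’s-law refraction per colour channel with a slightly offset IOR. This is a toy sketch, not any renderer’s actual code, and the IOR values are made up:

```python
import math

def refract(incident, normal, eta):
    """Snell's law in vector form. `incident` and `normal` are unit
    3-vectors (tuples), eta = ior_outside / ior_inside.
    Returns None on total internal reflection."""
    cos_i = -sum(i * n for i, n in zip(incident, normal))
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None  # total internal reflection
    t = eta * cos_i - math.sqrt(k)
    return tuple(eta * i + t * n for i, n in zip(incident, normal))

# Slightly different IOR per channel: blue bends the most, red the
# least. This small offset is all an RGB renderer needs to produce a
# visible dispersion fringe.
for channel, ior in (("R", 1.50), ("G", 1.52), ("B", 1.54)):
    print(channel, refract((0.707107, -0.707107, 0.0), (0.0, 1.0, 0.0), 1.0 / ior))
```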

RenderMan can do a spectral render when using physical materials. For example, when defining a physically accurate metal, you define the absorption color rather than the reflected color. So it is spectral, and Pixar’s documentation mentions it several times. Spectral rendering can derive the dispersion of colors from the IOR and the wavelength of light, whereas RGB is kind of a fake.

Might be relevant to the discussion:
https://twitter.com/Dade916/status/949006511182884866

You can achieve dispersion a number of ways in Cycles. The first and easiest method is to add three glass shaders, each set to R, G, or B, with the IOR set slightly differently for each one. This method is quite slow because you are using multiple glass shaders.
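If you’d rather build that first setup from a script, a sketch via Blender’s Python API could look like this (node and socket names are the standard Cycles ones; the IOR offsets are illustrative):

```python
import bpy

# Material: three tinted glass BSDFs summed back into roughly white glass.
mat = bpy.data.materials.new("RGB_Dispersion_Glass")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

out = nodes.new('ShaderNodeOutputMaterial')

# One glass shader per channel; blue gets the highest IOR so it bends most.
glasses = []
for color, ior in (((1, 0, 0, 1), 1.44), ((0, 1, 0, 1), 1.45), ((0, 0, 1, 1), 1.46)):
    g = nodes.new('ShaderNodeBsdfGlass')
    g.inputs['Color'].default_value = color
    g.inputs['IOR'].default_value = ior
    glasses.append(g)

# Two Add Shader nodes combine the three BSDFs into one surface closure.
add1 = nodes.new('ShaderNodeAddShader')
add2 = nodes.new('ShaderNodeAddShader')
links.new(glasses[0].outputs['BSDF'], add1.inputs[0])
links.new(glasses[1].outputs['BSDF'], add1.inputs[1])
links.new(add1.outputs['Shader'], add2.inputs[0])
links.new(glasses[2].outputs['BSDF'], add2.inputs[1])
links.new(add2.outputs['Shader'], out.inputs['Surface'])
```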

The second method is to add a procedural texture, then drive the RGB and IOR values through it. If you make the texture size small enough, you’ll get the dispersion effect without actually seeing the texture. I also attempted to make it ‘kinda spectral’ by converting the texture greyscale values into wavelengths in order to get better colour continuity (sketched below). It also means the IOR is directly correlated with wavelength.

Because you only use a single glass shader instead of three, it also works out about 30-50% faster to render.
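To make the ‘kinda spectral’ part concrete, here is one way that mapping could look in plain Python; the node setup does the same job with math nodes. The Cauchy coefficients are illustrative (roughly BK7 glass), not values taken from the file:

```python
def grey_to_wavelength(t, lo=380.0, hi=730.0):
    """Map a procedural texture's greyscale value t in [0, 1] to a
    wavelength in nanometres across the visible range."""
    return lo + t * (hi - lo)

def cauchy_ior(wavelength_nm, a=1.5046, b=0.00420):
    """Cauchy's equation n(lambda) = A + B / lambda^2, with lambda in
    micrometres; A and B here roughly match BK7 glass. Shorter
    wavelengths get a higher IOR, which is exactly the dispersion."""
    um = wavelength_nm / 1000.0
    return a + b / (um * um)

for t in (0.0, 0.5, 1.0):
    lam = grey_to_wavelength(t)
    print(f"grey {t:.1f} -> {lam:.0f} nm, IOR {cauchy_ior(lam):.4f}")
```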



That is bloody clever haha, thanks Moony!

Dispersion does not need a renderer to be a spectral one in order to work. Both V-Ray and Corona have dispersion, yet neither of them is a spectral renderer, so that’s a false assumption. The dispersion effect is actually quite simple: when you hit a surface, instead of one refraction ray you shoot three (a red, a green, and a blue one) with a very slight difference in IOR, based on the Abbe number. It’s similar to doing glossy refraction, where you also shoot more than one refraction ray.
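A back-of-the-envelope sketch of that Abbe-number split, using the standard definition V_d = (n_d - 1) / (n_F - n_C); treating green as n_d and centring the spread is a common approximation, not any particular renderer’s method:

```python
def rgb_iors(n_d, abbe):
    """Split a single IOR into per-channel IORs from the Abbe number
    V_d = (n_d - 1) / (n_F - n_C), where n_d, n_F and n_C are the
    IORs at 587.6 nm (yellow), 486.1 nm (blue) and 656.3 nm (red)."""
    spread = (n_d - 1.0) / abbe   # n_F - n_C
    return (n_d - spread / 2.0,   # red: lowest IOR, bends least
            n_d,                  # green: taken as the base IOR
            n_d + spread / 2.0)   # blue: highest IOR, bends most

# Dense flint glass: high IOR, low Abbe number, i.e. strong dispersion.
print(rgb_iors(1.72, 29.0))  # -> approx. (1.7076, 1.72, 1.7324)
```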

So no, there’s no need to fundamentally change how Cycles works. All it really takes is about one week of one skilled developer’s time. However, dispersion is such a small and niche feature that it probably doesn’t have much priority, and that’s why we don’t have it yet :wink:

It’s also very easy to replicate using nodes.

That’s not true dispersion, and what you described can already be done very easily via the node system. Real physically based dispersion does indeed require spectral wavelength sampling across far more wavelengths than simple RGB.

It’s still kinda doable. I converted a range of wavelengths into their respective RGB values using an online wavelength-to-RGB conversion tool, and added these values to a colour ramp (a sketch of that mapping is below). I then use the brick texture to generate random greyscale values that are modified using math nodes to convert them into wavelength numbers. These wavelength numbers are then ‘looked up’ on the colour ramp to drive the colour.

A similar thing happens with the IOR, but this time the math nodes control the base IOR and the dispersion amount.

By using a setup like this, you are essentially generating a continuum of colours corresponding to wavelength, along with a specific IOR per wavelength.

This IMO gives a much more pleasing and gradual dispersion effect than simply mixing raw RGB values with three IORs.

Note: I have chosen to go down the colour ramp route because the inbuilt Wavelength conversion node in Blender doesn’t seem to deal with violets very well.
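For anyone who wants to rebuild the ramp, here is a sketch of the sort of wavelength-to-RGB mapping an online converter gives you; this piecewise-linear curve loosely follows Dan Bruton’s classic approximation, with the end-of-spectrum intensity falloff omitted for brevity:

```python
def wavelength_to_rgb(nm):
    """Approximate RGB in [0, 1] for a visible wavelength given in nm."""
    if 380 <= nm < 440:
        return ((440 - nm) / 60, 0.0, 1.0)   # violet
    if 440 <= nm < 490:
        return (0.0, (nm - 440) / 50, 1.0)   # blue to cyan
    if 490 <= nm < 510:
        return (0.0, 1.0, (510 - nm) / 20)   # cyan to green
    if 510 <= nm < 580:
        return ((nm - 510) / 70, 1.0, 0.0)   # green to yellow
    if 580 <= nm < 645:
        return (1.0, (645 - nm) / 65, 0.0)   # yellow to red
    if 645 <= nm <= 780:
        return (1.0, 0.0, 0.0)               # red
    return (0.0, 0.0, 0.0)                   # outside the visible range

# These samples become the stops of the colour ramp.
for nm in range(400, 701, 50):
    r, g, b = wavelength_to_rgb(nm)
    print(f"{nm} nm -> ({r:.2f}, {g:.2f}, {b:.2f})")
```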



Moony, could you share the .blend file?

How do you define true dispersion? What I said is what the production renderers internally do too; it’s the principle of dispersed light transport. While spectral renderers may do it slightly more accurately, I can bet you would not be able to visually tell which scene was rendered with a spectral renderer and which one was rendered in a non-spectral renderer that supports dispersion (like Corona or V-Ray). The images side by side would likely look slightly different, but by no means would one look significantly better than the other. What’s the point of creating a more complicated and more limiting renderer when the result is indistinguishable?

Have you ever noticed that there’s not a single spectral renderer widely used in production? :slight_smile:

My reason for contributing to this thread was simply to call out the false assumption that a renderer needs to be spectral in order to give users the ability to render refractive materials with dispersion at a quality level sufficient for production. It most definitely does not; it sounded more like an excuse. Believe it or not, even the renderers used by people who do high-end visualization of jewelry for a living are, in the vast majority of cases, not spectral ones.

But we need to think of the future and be forward-thinking, as there will be vast changes in the world of computers in 5 years or beyond, hopefully in the form of AI, the end of silicon, and the transition to many new paradigms of computing: neuromorphic chips, graphene or carbon-nanotube-based microchips, photonic chips, 5G cloud computing, quantum chips, and tensor processing cores. These changes will hopefully (let’s all hope) land us with VASTLY more computing resources to throw at things like rendering engines, and when the devs know they have all this excess power, I think they will jump at the chance to make Cycles a full spectral, bidirectional, or photon-mapped ray tracer, probably in real time while full editing is happening. :slight_smile: Because why not?

…if we had future technology that could render that much faster, we could just throw 10x or 100x more samples into a path tracer and brute-force our way through it. Simpler code on more powerful hardware can be more efficient than extremely complex code.