Cycles Spectral Rendering


(smilebags) #1

UPDATE:
Since the start of this thread, significant progress has been made and the technique has changed considerably, including moving the entire workflow inside Blender and using correct models for the conversion to RGB. A lot of discussion happened regarding colour spaces and accurate conversion from spectral images to RGB, which is the foundation of this workflow.

I am currently working on an addon to allow this technique to be used seamlessly within Blender, including animation renders and regular use of the compositor. There is still significant overhead, since Blender decides to re-build the BVH between each render layer even though the geometry is identical, but I’m hoping to get to the point where anyone can use this process in their workflow without too much extra work.

Here is an example image created with the latest iteration of the workflow:


This is a continuation of my previous thread which has since disappeared.

Here are some tests I’m doing with Cycles and a Spectral Rendering process I devised. While it is only marginally slower than standard rendering, the biggest drawback/struggle is actually getting spectral data to use for materials.



This shows a bubble using thin film interference, and a steel ball which has been heated, causing an oxide layer of varying thickness. Both of these are quite hard to accurately create in RGB, but come as second nature in spectral rendering.


This was the first test I did as a proof of concept, showing two different light sources (one blackbody and one fluorescent), in-camera chromatic aberration, and dispersion in glass. All of these are very easy to create in this workflow.

I’ve yet to do a full scene because I haven’t found a way to extrapolate standard RGB colour data into a usable spectrum yet, but using black and white textures works just fine. A proper understanding of light helps here too, but the process should be simple enough that anyone could learn it in half an hour or so.


(smilebags) #2


Iridescence test - car paint
Less than 5 minutes render time on laptop CPU.


(smilebags) #3


This is glass with physically accurate values. It is very subtle, but could become visible in some cases.


(Atair) #4

Looks very interesting! Can you give some background information on your workflow? Especially how you feed data into Cycles?


(smilebags) #5

Sorry the response is so long. It might still be a bit vague because I’m skipping over quite a bit still. If you’re interested in learning it, I might make a video on it.

Thanks! So the main breakthrough with this was working out how to simulate light of a specific wavelength. I did that by creating a node group with a keyframed ‘value’ node which goes from ~390 to ~700 throughout the duration of the scene animation. The shift of thinking here is to use each frame as a specific frequency of light.
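To make the idea concrete, here is a small sketch of what that keyframed value node computes. The frame count and exact wavelength bounds are assumptions for illustration; inside Blender the same job is done by the keyframed ‘value’ node itself.

```python
# Hypothetical sketch of the frame-to-wavelength mapping:
# frame 1 corresponds to ~390 nm, the final frame to ~700 nm.

def frame_to_wavelength(frame, total_frames, wl_min=390.0, wl_max=700.0):
    """Linearly map an animation frame number to a wavelength in nanometres."""
    t = (frame - 1) / (total_frames - 1)  # normalise frame to 0..1
    return wl_min + t * (wl_max - wl_min)
```

So in a 32-frame animation, frame 1 renders the scene at 390 nm and frame 32 at 700 nm, with the intermediate frames evenly spaced between them.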

Then, each of the materials needs to contain that “Wavelength” node group. Using that value, you can then create materials which respond differently to different wavelengths. This is the node setup for the glass, probably one of the easiest to understand.



The leftmost node is the ‘wavelength’ controller, which gives out a value between 390 and 700 depending on the frame. The second output is the same value, but mapped to between 0 and 1 for ease of use. I take that value into the colour ramp and drop it slightly on one side. This makes the glass slightly opaque to shorter wavelengths (not sure if this is realistic, I was just guessing - finding real data is hard).
The “Mapping” node is another node group I made which simply takes a value in, which you know will be between “InFrom” and “InTo”, and gives out that value mapped to the range between “OutFrom” and “OutTo” - in other words it takes 0 to 1 and makes it 1.524 to 1.5. These are the values for the IOR of glass at the corresponding wavelengths. At 390nm, glass has an IOR of 1.524. At 700nm, it has an IOR of 1.5.
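The arithmetic of that “Mapping” node group, with the glass IOR endpoints from above, can be sketched like this. Note the linear interpolation between the two endpoints is a simplification; real dispersion curves are not straight lines.

```python
def remap(value, in_from, in_to, out_from, out_to):
    """Linear remap, the same job as the 'Mapping' node group described above."""
    t = (value - in_from) / (in_to - in_from)
    return out_from + t * (out_to - out_from)

def glass_ior(wavelength_nm):
    """Glass IOR: 1.524 at 390 nm falling linearly to 1.5 at 700 nm."""
    return remap(wavelength_nm, 390.0, 700.0, 1.524, 1.5)
```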

What the resulting render will look like now is glass which is slightly more clear at the end of the animation compared to the start, and with a slightly lower IOR at the end.

Then you take all of those frames, bring them into Photoshop, set the layer mode to “Add”, and multiply each layer by the corresponding colour. That might be easier to understand from a picture.

As for the other materials, they are all just different ways of using the wavelength parameter to achieve a brightness value. Some are more complex than others, but they all follow the same principle: create an entirely white image which represents the scene from the viewpoint of a particular frequency.
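That “Add the layers, each multiplied by its colour” step is just a weighted sum over wavelengths. Here is a rough sketch, using single Gaussians as crude stand-ins for the colour weighting; the peak wavelengths and widths are assumptions, and a real pipeline would use the tabulated CIE 1931 colour matching functions instead.

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def wavelength_to_rgb_weight(wl):
    """Crude RGB weight per wavelength (assumed peaks; not real CIE data)."""
    r = gaussian(wl, 600.0, 40.0)
    g = gaussian(wl, 550.0, 35.0)
    b = gaussian(wl, 450.0, 30.0)
    return (r, g, b)

def accumulate_frames(frame_values, wavelengths):
    """Sum greyscale frame values, each weighted by its wavelength's colour
    (the 'Add' layer mode multiplied by the corresponding colour)."""
    rgb = [0.0, 0.0, 0.0]
    for value, wl in zip(frame_values, wavelengths):
        w = wavelength_to_rgb_weight(wl)
        for i in range(3):
            rgb[i] += value * w[i]
    return tuple(rgb)
```

A single bright frame rendered at 550 nm, for example, contributes mostly to the green channel, with smaller amounts of red and very little blue.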


(Martynas Žiemys) #6

This is amazing. But… this seems a bit complicated to integrate into a usable workflow, at least at first glance. The Fresnel curve might also differ between wavelengths, and I don’t think much measured data exists for that. Have you seen this https://refractiveindex.info/ ?


(smilebags) #7

Yes, it is entirely ridiculous for any production project without sufficient resources.
You’re right, Fresnel would change depending on the material and wavelength, but for now I’m using the standard one since so much work has gone into making it already, and the added benefit I would get is negligible.

Cycles isn’t a spectral renderer - this is half the issue. The other issue is to do with all renderers: there’s simply no way you can make these materials easily without getting real world spectral data. I’m looking into getting a cheaper spectrometer to use for this, but that might be a while away.

It is very complex at the moment, because I’m using Cycles for something it really wasn’t meant for. If Cycles were aware of wavelengths and could combine them itself, it would save almost all of the painful work I currently need to do. Honestly, with sufficient development I could see this being usable for production in Cycles. Obviously it isn’t for everyone - spectral rendering is already overkill for 99% of projects.


(smilebags) #8

That website looks like a great tool, thank you!


(Martynas Žiemys) #9

I don’t think it’s overkill. I love the easy node workflow in Cycles, but its lack of physical accuracy is starting to get in the way of the needs I’m beginning to have at work. Spectral renderers are all still approximations, but they are getting a bit closer to reality. Maxwell and Octane seem to be a few steps ahead because spectral rendering lets them simulate physical effects better.


(smilebags) #10

I agree, Cycles is lagging behind in that regard because the features simply aren’t there.

Something I’m yet to get my head around (I haven’t used spectral renderers beyond playing with them) is how they extrapolate an RGB image into a spectrum to sample. Do you have any experience with this?

For example, a light source coloured by an image could simply emit three bands of light - at red, green and blue - but if it did, other spectral materials in the scene wouldn’t respond very naturally. I don’t understand how they do it and keep it looking natural.


(smilebags) #11

Sorry, I meant to say custom spectral material creation, rather than spectral rendering. Spectral rendering is great!


(Martynas Žiemys) #12

RGB tristimulus values need to be converted to a single matching wavelength or some sort of range… I have no idea, to be honest. As far as I understand, the process can never be perfect, because not all spectral data can be preserved in RGB, but this is the best that can be done, I guess. I am not sure how they do it in Maxwell or Octane, but some smart people must have thought a lot about this while coding those renderers. I imagine it must at least be a bit closer to reality.


(smilebags) #13



Here is a comparison between a ‘flat’ light - like a blackbody light source, though currently with too much blue - and the same scene under a fluorescent light. The difference is quite obvious. The fluorescent scene (right) has been colour corrected on its left side; once corrected, there is little difference.


(smilebags) #14

Yeah, that’s about as far as I got with my understanding - using some sort of lookup to create a fake spectrum which renders out the same in RGB while still containing a variety of frequencies. The colour theory and maths involved is a bit beyond me, but if that were implemented, all the Blender team would have to do is give us a button to turn it on, a node to reference the wavelength (when desired), and a spectrum-to-RGB transformation at the end of the render. It really would be lovely.
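One naive way to build such a fake spectrum is a smooth basis function per RGB channel, so the result stays non-zero across the visible range instead of being three narrow spikes. The sketch below uses assumed Gaussian peaks and widths purely for illustration; production renderers use properly fitted methods (such as Smits-style reflectance upsampling), not this.

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def rgb_to_spectrum(r, g, b):
    """Return a smooth spectrum function for an RGB colour.
    Naive Gaussian-basis sketch with assumed peak wavelengths;
    not taken from any real renderer."""
    def spectrum(wl):
        return (r * gaussian(wl, 600.0, 50.0)
                + g * gaussian(wl, 550.0, 45.0)
                + b * gaussian(wl, 450.0, 40.0))
    return spectrum
```

A pure green input, for instance, yields a spectrum peaking near 550 nm but still carrying some energy at neighbouring wavelengths, so spectral materials in the scene have something broader than a single spike to respond to.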


(Martynas Žiemys) #15

Maxwell and Lux deal with single colours; they must have some way of converting them to wavelengths that isn’t exposed, but Thea has some tools exposed to the user: https://s11.postimg.org/694kswxhv/Capture.png


(smilebags) #16

Nice! I’m surprised how similar my setup is when looking at that - almost always I go from the wavelength node to a curves node for normal materials. I do like the flexibility of nodes, which allows for crazy math when desired. I wish Cycles could just add support for it…


(smilebags) #17

I love how they show how it’ll look with different light sources.


(Martynas Žiemys) #18

Thea seems to be a good renderer in general; it just lacks good documentation and better marketing, I think. Every time I think of it, I wish I’d tried it out a bit more.

Hey, I think you might be interested to look at this as well:
https://blenderartists.org/forum/showthread.php?422576-Secrop-s-Cycles-quot-Loop-quot-node

You could mix all the wavelengths in one shader. That could lead to a somewhat more practical workflow, but it would probably have a negative impact on performance since you would have a lot of shaders in one; that loop node could maybe avoid that. If you’re up for it, you could explore this.


(smilebags) #19

That looks like it could be worth investigating. I’m not quite sure how it would work, but if the pieces fit together right it could save a lot of effort. Thanks.


(moony) #20

This looks very interesting.