Cycles Spectral Rendering

Good point, but the last time I reported a bad design choice in Cycles as a bug, it just got taken down straight away.

This is actually going through Filmic with high contrast, no gamma adjustment.

It just needs to be framed as a crucial node. There are plenty of design choices that can be excused for historical reasons; however, very few nodes would have as tremendous an impact on workflows as an OCIO node.

Very hard to demonstrate just what sort of impact without first having the node.


Agreed. Who would have the know-how to actually implement it? How long is the road to a pull request with the feature already implemented? I guess the person would need to understand how compositor nodes work, where the code behind them is, how to compile Blender (on Windows it seems near impossible), and how Blender interfaces with OCIO.

Is there maybe any other software you could refer to in order to at least roughly hint at such a demonstration?
I'd also love much better color management to be implemented.


I suppose OCIO node(s) would give us the chance to operate on scene data (not display data).
I think quite a bit of software has nodes under an OCIO category: Natron, Nuke, Fusion, etc. There is even an addon for GIMP, I believe. Not sure if that is related to smilebags's plain XYZ data though…

You're right. Natron is a perfect example of why OCIO is important for any serious compositing work, especially when working with camera footage alongside rendered elements. Currently, it seems like the best someone can do in Blender is just eyeball the result. Because compositing has an artistic aspect to it, I guess they assumed that is enough.

XYZ is just another way of representing colour, so yes, it is all part of the same thing. Having proper colour management just means tagging all colour information in the program with a profile, so that you can accurately convert colours. An OCIO node would give the user an easy interface for deciding what the colour channels mean. Different colour transformations are also simpler in different colour spaces. An easy example is how shifting hue without changing saturation is very easy in HSL-style spaces, but pretty challenging if you were to attempt it yourself in RGB.
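
As a quick illustration of that last point, here is a minimal Python sketch of a hue shift using the standard library's HLS conversion; note that it operates on plain display-referred values in [0, 1] and ignores colour management entirely, which is exactly the kind of thing an OCIO node would let you be explicit about:

```python
import colorsys

def shift_hue(r, g, b, degrees):
    """Rotate hue by `degrees` while leaving lightness and saturation untouched."""
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    return colorsys.hls_to_rgb((h + degrees / 360.0) % 1.0, l, s)

print(shift_hue(0.8, 0.2, 0.2, 120))  # a red pushed a third of the way around the hue wheel
```

Doing the same rotation directly on RGB triplets means building and applying a rotation about the achromatic axis yourself, which is far less obvious.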

Every UI element needs to be managed, largely because of the variability of reference spaces. That is, imagine that the three lights in the reference space are different from those of the display output.

There are also other issues regarding the expectation / need versus how the scene-referred linear reference is formatted. I wrote up a simple example that anyone familiar with Filmic should be able to understand. Because the pixel pusher's needs vary from context to context, the interface needs to be flexible enough to choose amongst various transforms.

Great write-up on how the curves could do with some work. I think it is all part of just properly managing colours in Blender throughout. Anywhere colours are displayed needs to be colour managed for this to really be of much benefit. The ability to select the response curve on the Curves node is one thing which would make people's lives a lot easier. I guess not many people really care about using the correct OCIO transforms because they don't understand why it is important. Not many people properly understand colour, so there's that too.

Along with this, it would be nice if render results were also tagged with a colour profile, including on export. The compositor output could even have a dropdown similar to the one in your Curves write-up, allowing us to assign a profile on the output.

Tagging a file is a relatively trivial thing, and remaining untagged isn't dire given the reference space is REC.709 based.

The short-term one-two punch might be:

  • Manage the passive UI (icons, borders, theme colours, etc.), and provide full management / transform selection on UI.
  • Provide an "Advanced" stack for custom transforms on file encodings. This would allow for full transforms on EXRs, various look outputs per shot, etc.

Shorter term still, there is the glaring absence of an OCIO transform node.

I'm not sure what you mean by that. By "three lights", do you just mean the RGB (or whatever) primaries of the source and target color spaces?

In reading through that entire Color Management section, I'm pretty sure that, while there are definitely still a ton of problems, at least some of those issues are now fixed in the latest builds. For instance, Lukas Stockner's patch made it into Blender already, right?

I'd love to see what exactly is still necessary vs. what's been accomplished. (Although I'm guessing most of it is still necessary.)

Yeah, to be honest, most explanations of it that I've seen thus far are on the obtuse side of things, which is unfortunate. I'm pretty sure it's not actually as complicated as it's made to appear.

Which patch of his will help with colour management? Currently, Blender isn't really colour managed at all. What is still necessary is the whole thing. Separate to that is having an OCIO transform node.

Yes! The input has three lights, the reference has three lights, and the output has three lights.

RGB is a relative encoding model. This means that, taken as values alone, any given RGB triplet doesn't mean a thing; we need additional metadata to give the triplet values meaning.

In terms of what an additive RGB colourspace is, according to the ISO, it consists of three components, two of which are tightly bound together.

  • Transfer function(s). This answers "What do the encoded values mean in terms of intensity ratios?" The ground truth here is some idea of linear light, either display linear (minimum to maximum) or scene linear (zero to infinity). A concrete example is sketched just after this list.
  • Colour produced when R=G=B. This is known as the "white point" in most stable RGB colour spaces. Tightly bound to the point below.
  • The colours of the RGB lights themselves. These are defined according to an absolute colour science map.
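
To make the first point in that list concrete, here is the decode direction of the standard sRGB transfer function as a small Python sketch (this is the published sRGB curve, nothing Blender-specific); it maps nonlinearly encoded values to display linear light ratios:

```python
import numpy as np

def srgb_decode(encoded):
    """Standard sRGB transfer function: encoded value -> display linear ratio."""
    v = np.asarray(encoded, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

print(srgb_decode([0.0, 0.5, 1.0]))  # an encoded 0.5 is only ~0.214 in linear light
```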

That is, when you read about someone wanting to "go from the RGB colourspace to the CMYK colourspace", they are talking pure rubbish; it is impossible and is a meaningless statement because RGB and CMYK aren't colour spaces.

The easiest way to understand what RGB is as an encoding model is to understand how a raycasting engine works. Many people understand that a raycasting engine casts a number of single rays of light into the scene from a given pixel position, multiple times, iterating along the full image dimensions. This is a falsehood. It is actually three lights per pixel that are being cast into the scene! Likewise, when we are projecting an image at our eyes in Photoshop, we are projecting three lights at our eyes, through the display, from the internal reference model. When we determine the magic of what resultant "colour" is produced, it is a byproduct of how each single, never-changing coloured light interacts with the various surfaces[1]!

So the simplest way to understand the basics of pixel management in an RGB system is that the RGB values are light ratios of three lights. Without metadata of some sort, the light ratio mapping is unknown and the colours of the three lights are unknown. When we say things like "sRGB" or "ACEScc" etc., we are giving meaning in the absolute sense to what any given RGB ratios mean, including the transfer functions involved, the colours of the three basis lights, and the white point!

So when anyone gets confused about what pixel management is, always return to the idea of three lights and a transfer function and it should become clear. Can you ask the following questions and determine (a worked answer for plain sRGB is sketched right after the list):

  • What transfer functions are at work?
  • What colour is produced when R=G=B?
  • What colour is each of the RGB lights?
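
As that worked answer, here is a minimal sketch; the chromaticity values are the published sRGB/Rec.709 ones, and the transfer function is the piecewise curve sketched earlier:

```python
import numpy as np

# CIE xy chromaticities of the three sRGB/Rec.709 lights, and of the colour
# produced when R = G = B (the D65 white point).
primaries_xy = np.array([[0.64, 0.33],   # red light
                         [0.30, 0.60],   # green light
                         [0.15, 0.06]])  # blue light
white_xy = np.array([0.3127, 0.3290])    # D65

def xy_to_XYZ(xy, Y=1.0):
    """Lift a chromaticity coordinate to a CIE XYZ tristimulus value."""
    x, y = xy
    return np.array([x / y, 1.0, (1.0 - x - y) / y]) * Y

# Columns are the XYZ of each light; scale them so that R = G = B = 1
# lands exactly on the white point.
P = np.stack([xy_to_XYZ(xy) for xy in primaries_xy], axis=1)
scales = np.linalg.solve(P, xy_to_XYZ(white_xy))
rgb_to_xyz = P * scales  # recovers the familiar sRGB -> XYZ matrix

linear_rgb = np.array([0.5, 0.25, 0.75])  # already decoded to linear light ratios
print(rgb_to_xyz @ linear_rgb)            # absolute meaning via the three answers
```

The three questions map directly onto the three pieces of data the code needs: the transfer function to get light ratios, the white point to balance the three lights, and the light colours to place everything on an absolute map.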

This isn't entirely accurate. The UV Image editor is colour managed, and you can load up an arbitrary buffer and display it correctly assuming the correct OCIO configuration. Some of the colour pickers are somewhat colour managed.

Cycles ignores everything and is not colour managed; it has, for example, hard-coded sRGB transfer function assumptions for any imagery that is nonlinearly encoded. EXRs pass through untouched. Everything set by pixel pushers in the UI is ignored by Cycles in terms of colour transforms.

For Cycles, the colorimetry of the input imagery is ignored, the reference light meaning is ignored, and the output is ignored; Cycles assumes and is hard-coded to sRGB for all nonlinear imagery.

[1] And this is also the difference between a simple three-light RGB raytracer and a spectral rendering engine; the math is largely (as proven by this thread) the same, the sole difference being the number of lights and their respective "colours" being projected into the scene per pixel.
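
A structural sketch of that footnote in Python, with the actual light transport hidden behind a hypothetical trace_radiance callback and the colour matching functions behind a hypothetical cmf lookup (both are placeholders for illustration, not any real Cycles API):

```python
import numpy as np

def render_pixel_rgb(trace_radiance):
    """Cast three fixed lights per pixel: the working space's R, G and B primaries."""
    return np.array([trace_radiance(channel) for channel in ("R", "G", "B")])

def render_pixel_spectral(trace_radiance, cmf, wavelengths):
    """Cast N monochromatic lights per pixel, one per sampled wavelength."""
    radiance = np.array([trace_radiance(w) for w in wavelengths])
    # Weight each wavelength sample by the CIE colour matching functions and
    # integrate to CIE XYZ, which can then go to any RGB working space.
    dw = wavelengths[1] - wavelengths[0]
    return cmf(wavelengths).T @ radiance * dw
```

The per-sample transport math is identical in both functions; only the number of lights per pixel and their meaning changes.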


@smilebags you might want to update the curves you are using. I asked Scott Burns if he could create three curves that do not violate energy conservation, and he delivered:
http://scottburns.us/fast-rgb-to-spectrum-conversion-for-reflectances/

You can use the MATLAB program or just download the curves from his site. Here they are for convenience:

0.021592459 0.020293111 0.021807906 0.023803297 0.025208132 0.025414957 0.024621282 0.020973705 0.015752802 0.01116804 0.008578277 0.006581877 0.005171723 0.004545205 0.00414512 0.004343112 0.005238155 0.007251939 0.012543656 0.028067132 0.091342277 0.484081092 0.870378324 0.939513128 0.960926994 0.968623763 0.971263883 0.972285819 0.971898742 0.972691859 0.971734812 0.97234454 0.97150339 0.970857997 0.970553866 0.969671404
0.010542406 0.010878976 0.011063512 0.010736566 0.011681813 0.012434719 0.014986907 0.020100392 0.030356263 0.063388962 0.173423837 0.568321142 0.827791998 0.916560468 0.952002841 0.964096452 0.970590861 0.972502542 0.969148203 0.955344651 0.892637233 0.5003641 0.116236717 0.047951391 0.027873526 0.020057963 0.017382174 0.015429109 0.01543808 0.014546826 0.015197773 0.014285896 0.015069123 0.015506263 0.015545797 0.016302839
0.967865135 0.968827912 0.967128582 0.965460137 0.963110055 0.962150324 0.960391811 0.958925903 0.953890935 0.925442998 0.817997886 0.42509696 0.167036273 0.078894327 0.043852038 0.031560435 0.024170984 0.020245519 0.01830814 0.016588218 0.01602049 0.015554808 0.013384959 0.012535491 0.011199484 0.011318274 0.011353953 0.012285073 0.012663188 0.012761325 0.013067426 0.013369566 0.013427487 0.01363574 0.013893597 0.014025757
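
For completeness, here is a minimal sketch of how those three rows could be used, assuming they are saved to a local text file (burns_rgb_basis.txt is just a name picked for this example) and assuming the weighted-sum form of Burns' fast method, where the reflectance spectrum is the three basis curves weighted by a linear sRGB triplet:

```python
import numpy as np

# Three basis reflectance curves, one per row, 36 samples each
# (the three rows of numbers above saved as plain text).
rho_r, rho_g, rho_b = np.loadtxt("burns_rgb_basis.txt")

def rgb_reflectance_to_spectrum(r, g, b):
    """Weighted sum of the basis curves; r, g, b are linear sRGB reflectances in [0, 1]."""
    return r * rho_r + g * rho_g + b * rho_b

spectrum = rgb_reflectance_to_spectrum(0.18, 0.18, 0.18)  # mid-grey reflectance
print(spectrum.max() <= 1.0)  # energy conservation: never reflects more than 100%
```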

https://developer.blender.org/D2613 might be of help here.

In fact, I've just been talking with him! Thank you for asking him; I'm about to update the curves now.

Looks like it has been abandoned for now, though I'm hoping something with a similar effect comes to Blender/Cycles.

@smilebags I've taken this here to avoid polluting the main development thread more than it already is.

"I feel the most natural spectrum to use for a white light source is D65."

White point and how something is lit is a complex creative choice. We are constantly choosing among various achromatic points for what appears "white". This is why I was suggesting it is a moot point; whatever is chosen as reference white becomes R=G=B when adapted correctly.

Entire films and rolls upon rolls of stills photography have been lit with tungsten light or otherwise. D65 is no magic value, and in fact typically averages above clear sky daylight.

Regarding the equal-energy Illuminant E, I believe all textures must be converted to E prior to conversion to spectral to remove bias. I need to hunt down the citation on this, and sadly don't have the time.
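
A minimal sketch of what that conversion could look like, using the crudest possible von Kries-style adaptation (diagonal scaling directly in XYZ); a real implementation would more likely adapt in a Bradford/LMS space, so treat this purely as an illustration of the idea:

```python
import numpy as np

XYZ_D65 = np.array([0.95047, 1.00000, 1.08883])  # D65 white point in CIE XYZ
XYZ_E   = np.array([1.00000, 1.00000, 1.00000])  # equal-energy (E) white point

def adapt_d65_to_e(xyz):
    """Rescale each channel so that D65 white maps exactly onto E white."""
    return np.asarray(xyz, dtype=float) * (XYZ_E / XYZ_D65)

print(adapt_d65_to_e(XYZ_D65))  # -> [1. 1. 1.]
```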

Good idea.

I can't disagree with this. I've worked as a photographer for a number of years, and certainly the white point (chosen in editing) and light source play a big role in how the image looks, and it is completely a creative choice. That being said, colour reproduction is heavily influenced by the quality (spectrum) of the light source. When I'm referring to D65 here, I'm talking about the spectrum itself, not the white point. As you know, there are infinitely many spectra which give you the D65 white point, but I'm talking about the standardised spectrum from the CIE.

I'm willing to accept this as true; it seems to make sense.

Then I believe you should reference CRI rather than the colour temperature.