Buffers flagged as non-colour data are specifically kept out of OpenColorIO colour transforms. This is a nuanced point that not many people think about, but it makes perfect sense once you realize there are two general classes of information. Blender has a seriously deep design flaw here, one that leads to other bad flaws.
This very likely stems from a confusion over what “linear” means. “Linear” can be a facet of both data, such as depth, normals, displacement, and so on, and of colour. Nonlinear can be a facet of both data and colour as well; it’s perfectly feasible that there may be a log-based data encoding, just as there may be a log-based colour encoding.
Interestingly, it is also very possible that work requires transforms on the data as well! Imagine an output referred encoding with a small 0.0:1.0 range, encoded in a TIFF, where the actual data requires -10:+10. A depth encoding from zero to infinity, which is very difficult to represent in an output referred 0.0:1.0 encoding, could be handled more gracefully via a somewhat trivial data transform. OpenColorIO can indeed handle such a transform, and keep it “outside” the colour transformation chains used for Displays / Views, for example.
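As a rough sketch of what such data transforms look like, here are two trivial mappings in plain Python: a linear remap of a -10:+10 range into a storable 0.0:1.0 range, and a reciprocal mapping that squeezes zero-to-infinity depth into 0.0:1.0. These are illustrative only, not Blender or OpenColorIO code.

```python
# Illustrative data transforms (not Blender/OCIO code): keeping
# out-of-range data representable in a 0.0:1.0 file encoding.

def encode_range(value: float, lo: float = -10.0, hi: float = 10.0) -> float:
    """Map a [-10, +10] data value into the storable [0, 1] range."""
    return (value - lo) / (hi - lo)

def decode_range(stored: float, lo: float = -10.0, hi: float = 10.0) -> float:
    """Recover the original data value from its [0, 1] encoding."""
    return stored * (hi - lo) + lo

def encode_depth(depth: float) -> float:
    """Map depth in [0, inf) into [0, 1) via a reciprocal mapping."""
    return depth / (depth + 1.0)

def decode_depth(stored: float) -> float:
    """Invert the reciprocal depth mapping."""
    return stored / (1.0 - stored)

print(encode_range(-7.5))   # a value impossible to store directly in 0:1
print(encode_depth(1000.0)) # very distant depth, still inside 0:1
```

The point is that these are pure data transforms: they must round-trip exactly and must never be entangled with the colour transforms applied for viewing.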
The seriously deep design flaw is mostly hidden currently in Blender, but having understood the above, it is quite clear. When OpenColorIO was first integrated into Blender based off of Xat’s implementation, someone made the poor decision to encode the file based off of the currently selected view. This is an absolutely awful idea, and in fact contributes to some of the insanity you see forwarded. Xat’s original intention was to have a separate set of file encoding transforms, not tied to the currently selected Display / View combination, keeping the viewing encoding and the output / file encoding quite distinct. TL;DR: Blender screwed up and wound a degree of the file encoding up into the viewing encoding.
With all of that said, file encoding in Blender should honour the “isdata” OpenColorIO flag, and encode the data according to the transform listed, which in the case of the “Non-Color Data” transform is a linear to linear ratio no-operation. The “isdata” flag in OpenColorIO is precisely for this sort of transformation handling. If Blender mishandles this, it’s a bug, and needs to be reported.
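In an OpenColorIO configuration, a colorspace carrying this flag looks roughly like the following sketch. The name, family, and description here are illustrative, not copied from Blender’s shipped config:

```yaml
- !<ColorSpace>
    name: Non-Color Data
    family: raw
    description: Raw data values; never touched by colour transforms.
    bitdepth: 32f
    isdata: true
    allocation: uniform
```

With `isdata: true` set, OpenColorIO treats the buffer as data and keeps it out of the Display / View colour transform chains.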
There is another side of this worth noting: the visualization of the data. A good example is alpha, where we may want to see the units of occlusion as a percentage. How do we display this? Most folks likely believe they are “looking” at the alpha as a visualization of the linear percentage ratio. What should 50% occlusion look like? Should it output 50% “brightness” in the visualization? If we view it as the “linear ratio”, an alpha code value of 0.5, for example, gets rolled through the display hardware’s particular transfer function. In the case of an sRGB display, this means that your “linear ratio visualization” is actually the 0.5 data pushed through the hardware’s output_transfer_function(0.5), which sort of happens to look correct! From this simple example, however, we can see there are several facets to consider:
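To make the sRGB case concrete, here is a small sketch (again, not Blender or OCIO code) of what the display hardware does with that 0.5 alpha code value. The sRGB EOTF below is the standard piecewise function from IEC 61966-2-1:

```python
# Sketch: a "linear" alpha of 0.5 pushed straight to an sRGB display,
# whose hardware decodes code values via the sRGB EOTF.

def srgb_eotf(code_value: float) -> float:
    """sRGB electro-optical transfer function: code value -> linear light."""
    if code_value <= 0.04045:
        return code_value / 12.92
    return ((code_value + 0.055) / 1.055) ** 2.4

alpha = 0.5  # 50% occlusion, stored as a linear data ratio
emitted = srgb_eotf(alpha)
print(f"Code value {alpha} emits roughly {emitted:.3f} of peak linear light")
```

So the “50%” alpha actually emits roughly 21% of peak linear light, which happens to land near a perceptually sensible mid-grey — hence it “sort of looks correct” without anyone having made a deliberate visualization choice.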
- Data and colour encodings. Both can be nonlinear, both can be linear, both can cover arbitrary ranges. Software needs to permit the pixel pusher enough granularity to tackle this complex and nuanced situation.
- Visualization of the internal colour and data models needs to be considered. How do we want to visualize normals? Depth? A false colour ramp for depth is a useful visualization, as are alternative visualizations for other data sets.
- Consider the output device. Always. As Doug cites, having your maximum occlusion alpha value of 1.0 blaring out 4000 nits of peak output from an HDR capable monitor is far from ideal, nor the intention. And this doesn’t even begin to scratch the surface of actual colour representations.
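The HDR case is easy to demonstrate with the SMPTE ST 2084 (PQ) EOTF, which maps code values to absolute luminance. This is a hedged sketch, not code from any particular application: it simply shows why shipping an alpha of 1.0 to a PQ display naively is a problem.

```python
# Sketch: the SMPTE ST 2084 (PQ) EOTF, mapping a code value in [0, 1]
# to absolute luminance in nits. An alpha of 1.0 treated as a PQ code
# value decodes to the full 10000 nit peak the signal can carry.

M1 = 2610 / 16384        # PQ constants from SMPTE ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code_value: float) -> float:
    """PQ code value in [0, 1] -> absolute luminance in nits (cd/m^2)."""
    e = code_value ** (1.0 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)
    return 10000.0 * y

print(f"alpha 1.0 as a PQ code value -> {pq_eotf(1.0):.0f} nits")
print(f"alpha 0.5 as a PQ code value -> {pq_eotf(0.5):.1f} nits")
```

A maximum occlusion value decoding to the display’s full peak luminance is exactly the “blaring” failure described above; the data needs its own visualization transform, not the display’s colour decoding.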
More context can be read via this link, with a bit of feedback from the current team working on the V2 branch of OpenColorIO.