Reversing color correction.

Thanks for chiming in troy_s. Seems I have some reading to do. Meanwhile, a data-agnostic OCIO node seems to be exactly what I am asking for… I think. For example, I often take RED files and render them from REDCINE-X as floating point .exr files in a log color space (because the white values would be clipped if written out to an integer-based file format, even 16-bit TIFF) and bring those .exr files (with floating point white values above 1.0) into my compositing software. The compositing software can make some assumptions about how the files are color encoded based on their file formats, but it doesn’t really know. I want control over how the images are graded to make them linear.

For example, I can write out a linear 10-bit DPX file. How can any software know I did that? It can’t really. If the software just assumes the data is log because it is in a DPX wrapper… well, I think you see what I mean.

In Nuke, I often load everything in sRGB and just grade it to taste, do my comp, reverse the grade, and save it back out as sRGB, and no one is the wiser. The client gets back exactly what they gave me, the comp looks great, and everyone is happy.
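To make that concrete, here is a minimal sketch of the idea in Python/NumPy (the function names and parameter choices are mine, not any compositor’s actual API): a grade that remaps black point, white point, and gamma, paired with its exact inverse.

```python
import numpy as np

def grade(rgb, black=0.0, white=1.0, gamma=1.0):
    # Remap [black, white] to [0, 1], then apply a gamma adjustment.
    out = (rgb - black) / (white - black)
    return np.sign(out) * np.abs(out) ** (1.0 / gamma)

def ungrade(rgb, black=0.0, white=1.0, gamma=1.0):
    # Exact inverse: undo the gamma, then restore the original range.
    out = np.sign(rgb) * np.abs(rgb) ** gamma
    return out * (white - black) + black

# Round trip: what comes in goes back out, within float precision.
plate = np.array([0.0, 0.18, 1.0, 4.5])   # note the value above 1.0
graded = grade(plate, black=0.01, white=4.5, gamma=2.2)
assert np.allclose(ungrade(graded, black=0.01, white=4.5, gamma=2.2), plate)
```

As long as the math stays in float end to end, “reverse the grade” really does mean getting the original values back, not an approximation.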

As soon as programmers try to get fancy with their color pipelines, they become like Apple, trying to make everything “easy” for the common user, leaving the power users, who want base-level control, on the sidelines.

But like I said, I have quite a bit of reading to do now. Maybe this functionality just needs someone to script a front end or something. I really, really wish I knew Blender better right now.

This is actually incorrect, and I believe REDCINE-X overrides any transfer curve settings to deliver correctly to the format specified.

The resolution of 32 bpc EXR exceeds the dynamic range of most current camera sensors, and as such you shouldn’t risk any clipping with scene referred data. I am certain that there are some edge cases where HDR data may exceed the values that a 32 bpc EXR can provide, but it is a reasonably safe statement by and large. For an in-depth look at the format storage issue, this is a great reference: http://www.anyhere.com/gward/hdrenc/hdr_encodings.html (the server may struggle from time to time)

Further, EXR is a strictly linear format, which is why it is the de facto intra-pipeline format: it provides fewer opportunities for file mangling. So technically speaking, storing log in an EXR violates the format’s standard.

Exactly. Some of this can be stored in the DPX metadata, but even then it would be up to the artist to interpret it correctly. Hence EXR, where despite an artist not knowing the color primaries or transfer curve, the data format must be linear. This at least reduces one possible point of mangling.

I don’t frequent here too often, so if you hit a speed bump, feel free to email me. I’ll do my best to get you the help you need. Also, there are a growing number of folks around these parts that are becoming more and more familiar with proper color pipelines and OCIO, which is great.

Anyways, wonderful to have another person around these parts interested in these sorts of things.

With respect,
TJS

Sorry Troy. I misspoke. I do not render the .exr files in a log color space; I think I was just in a rush when I wrote that. I do render them out of REDCINE-X, and the metadata embedded in the R3D travels with it into the EXR, so what was shot on set looks like what I get, with plenty of RGB values above 1.0. However, if I output DPX from REDCINE-X, the whites do, of course, get clipped. I hope that clears up my misstatement. (DPX should be considered a dead file format as far as I’m concerned, which is, I know, silly wishful thinking.) Not only that, I think too much attention has been put on the R3D file format already in this discussion. I’m suggesting a file-format-agnostic tool here, as you earlier pointed out.

Ultimately, I’m just asking for a node in the Blender compositor that works like Nuke’s Grade node, or the log/lin functionality of Shake, or even After Effects, or the now-defunct Combustion: something you can reverse at the end of a comp right before your write node. I really wish I knew Blender well enough; I’d just write the thing myself.
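For reference, the log/lin conversion those tools exposed is, at its heart, the classic Kodak Cineon transfer. A hedged sketch of that pair (ref_white 685 and ref_black 95 are the conventional 10-bit defaults; the helper names are mine):

```python
import numpy as np

def cineon_to_lin(log01, ref_white=685.0, ref_black=95.0, neg_gamma=0.6):
    # 0.002 is density per 10-bit code value; neg_gamma is the negative's gamma.
    gain = 0.002 / neg_gamma
    offset = 10.0 ** ((ref_black - ref_white) * gain)
    return (10.0 ** ((log01 * 1023.0 - ref_white) * gain) - offset) / (1.0 - offset)

def lin_to_cineon(lin, ref_white=685.0, ref_black=95.0, neg_gamma=0.6):
    # Exact inverse of the above, back to normalized 10-bit log code values.
    gain = 0.002 / neg_gamma
    offset = 10.0 ** ((ref_black - ref_white) * gain)
    return (ref_white + np.log10(lin * (1.0 - offset) + offset) / gain) / 1023.0

codes = np.linspace(0.0, 1.0, 5)
assert np.allclose(lin_to_cineon(cineon_to_lin(codes)), codes)
```

The point is less the exact constants than the shape of the node: a forward transform and a guaranteed inverse, sitting at the head and tail of the comp.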

And thanks for your input Troy. I also am new to navigating the waters of who’s who in this Blender world and I expect to make many missteps. I thank you all for even indulging me, the new guy here.

All the values will likely greatly exceed 1.0 as 1.0 is utterly meaningless in scene referred data. It truly bears no special meaning.

The DPX shouldn’t be clipping if it was encoded correctly.

If there is any clipping, it would be via the display transform, if a transfer / tone curve wasn’t applied.

You can confirm this by swatching the data itself.

For anyone confused by this, I would heavily encourage a reading of Mr. Selan’s Cinematic Color paper. http://cinematiccolor.com

I’m afraid I have to disagree with the idea that “All the values will likely greatly exceed 1.0 as 1.0 is utterly meaningless in scene referred data,” based on empirical, real-world experience. E.g., take some of the green screens I have had to key. The old VFX pipeline on this particular show (which I won’t name) went like this:

  1. Edit the show. This was done with ProRes proxies rendered with the look created on set for the footage.

  2. Break out the VFX shots and render out DPX files from the R3D for those shots in REDCINE-X using a RED ROCKET card.

Here is the important part. The look applied during the shoot was embedded in the R3D. The raw data contains the entire exposure range without clipping, but when you render the R3D to DPX using the look that editorial has been working with all along (the look the show’s producers have seen and gotten used to, and to which you must match your color when screening various cuts), the white point of that one-light transfer lies well below what the camera recorded. That new white point becomes 1023, 1023, 1023, pure white, in your 10-bit DPX file. Furthermore, all the mid levels in the raw data are now pushed way higher, so your green screens, while easy to key from the raw data in the R3D, are all fantastically overexposed in your new clipped DPX files.

The values above 1.0 are hardly “meaningless.” They are critical. This is not theoretical; I have the DPX files to prove it. Was it wrong to do it this way? Yes. But seven years ago, when we were one of the first Red One owners shooting a big TV show and the software to handle R3D files was in its infancy, well, I don’t beat myself up too much about it. (And I still see post companies try to pass off DPX files rendered from R3D files with baked-in looks established on set.) Not to mention the politics of getting production to change its workflow: sometimes that is just not an option, and so post solutions need to be made to accommodate non-optimum production choices.

My solution now is to render the R3D files to .exr and keep all those values that would have been clipped at 1023 in the 10-bit DPX file. Those are the values that now go above 1.0, but they can be color corrected back down, and all that data above 1.0 is recovered in the floating point .exr. In fact, it is easy.

You color correct down, pull your key, color correct down your FG plate, comp the two elements, and then reverse your color correction, all in float, and, BOOM! Comp looks good. I do this all the time.
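To put numbers on it, a quick sketch (the exposure helper and the sample values are mine) of why the float round trip works where the 10-bit path loses data:

```python
import numpy as np

def expose(rgb, stops):
    # Simple scene-linear exposure adjustment: +/- stops of gain.
    return rgb * (2.0 ** stops)

plate = np.array([0.18, 2.0, 9.5])   # scene-referred EXR, whites above 1.0

down = expose(plate, -3.5)           # grade down: everything now below 1.0
# ... pull the key, comp the elements here, all in 32-bit float ...
back = expose(down, +3.5)            # reverse the correction
assert np.allclose(back, plate)      # every value above 1.0 recovered

# The same plate quantized to 10-bit integer first loses the whites for good:
dpx10 = np.round(np.clip(plate, 0.0, 1.0) * 1023.0) / 1023.0
# -> 2.0 and 9.5 both collapse to code value 1023 and cannot be recovered
```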

I can’t do it without a file-format-agnostic color correction node that can be easily reversed… you know, like in all those other compositors whose makers saw the need for this before: Nuke, Shake, Combustion…

Bottom line: I need to give back to every production I work on (and there are many) exactly the same color I got from them. It is the most dummy-proof way to ensure that no color changes actually even happen in the VFX execution.

This statement must test TRUE.

RGB value coming in == RGB value going out.

Anything that happens between the two nodes at the head of the comp (1: load the media, 2: color correct it) and the two nodes at the end of the comp (3: reverse the color correction, 4: save the comp media) is irrelevant.
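As a sketch of that contract (the names here are mine, not any compositor’s API), the whole thing reduces to a round-trip assertion wrapped around whatever the comp does in between:

```python
import numpy as np

def contract_holds(plate, correct, uncorrect, comp, tol=1e-6):
    # correct/uncorrect are the forward and inverse color corrections;
    # comp is everything in between. For pixels the comp leaves untouched,
    # the round trip must hand back the original values.
    out = uncorrect(comp(correct(plate)))
    return float(np.max(np.abs(out - plate))) < tol

# For a region absent any VFX, comp is the identity, so the contract
# reduces to: uncorrect(correct(plate)) == plate.
plate = np.random.rand(4, 4, 3) * 10.0   # scene-referred values, some > 1.0
assert contract_holds(plate, lambda x: x * 0.125, lambda x: x * 8.0, lambda x: x)
```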

You did read Mr. Selan’s document, correct? Referencing unity in a log image and attempting to connect it to the statement I made regarding scene referred linearization suggests that perhaps something was misunderstood?

Anyways, this discussion has suitably drifted from the problem at hand. Please try the outline referenced previously in the thread.

I think I probably did misunderstand your point. I do have a lot of reading to get through from you, but in the meantime, I really can’t emphasize enough the value of simplifying the issue and just “stealing” the way other compositing software has dealt with it: a reversible color correction node that can set a new white point, black point, and gamma.

I’ll step away for a while and read all you have presented.

Here’s a quick follow up. After much reading from color scientists, it turns out I am guilty of the very, very common practice of loosely using the term “linearization.”

Fine. I stand corrected. I’ll never refer to Rec.709 as linear ever again. :wink:

If anyone reading this thread keeps that in mind and embraces the spirit of what I was saying, you will find that a reversible color correction node is overdue in Blender’s compositor for a real-world VFX compositing pipeline.

From the VES paper “Cinematic Color: From Your Monitor to the Big Screen”:

“It is difficult to lump all of visual effects and animation into a single bucket, as each discipline has potentially differing color pipeline goals and constraints. For example, in visual effects production one of the golden rules is that image regions absent visual effects should not be modified in any way. This places a constraint on color pipelines - that color conversions applied to the photography must be perfectly invertible.”