File a bug on Blender proper; the configuration fails the ociocheck tool and sanity checks via the API. This is due to the duplication of the XYZ role and the xyz transform name. It is easy to validate and prove with the ociocheck utility, and this is why it is breaking in AE.
Transfer functions aren’t all that is involved here. There is a gamut map as well, and that is part of the complete OCIO chain as provided. A TIFF saved at 16 bit integer under the Filmic Base Log Encoding will have the exact same dynamic range, with minor intensity compression. It has the gamut map baked in.
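To make the "same dynamic range, minor intensity compression" point concrete, here is a minimal sketch of what a normalized log encoding of this sort does. The constants are my assumption of the commonly cited Filmic parameters (middle grey at 0.18, roughly 12.47 stops below grey, 16.5 stops of total latitude), not values pulled from Blender's actual config:

```python
import math

MIDDLE_GREY = 0.18          # scene-referred middle grey (assumed)
LOW_STOPS = -12.473931188   # stops below middle grey (assumed low end)
TOTAL_STOPS = 16.5          # total latitude covered by the encoding (assumed)

def filmic_log_encode(x):
    """Map a scene-linear value into a [0, 1] normalized log encoding.

    The full scene dynamic range survives; it is merely compressed into
    the integer-representable [0, 1] domain, which is why a 16-bit TIFF
    under such an encoding loses nothing but fine intensity precision.
    """
    x = max(x, MIDDLE_GREY * 2.0 ** LOW_STOPS)  # clamp to the low end
    stops = math.log2(x / MIDDLE_GREY)          # exposure relative to grey
    return (stops - LOW_STOPS) / TOTAL_STOPS
```

Note that middle grey lands around 0.756 in the encoding, well above the midpoint, which is the "intensity compression" at work.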
@AlexRWeinberg First off, awesome video! Thanks for making this understandable.
Second off, I do have a question about specific setting in Blender under Color Management tab.
My setting looks like this:
1)Display Device - RGB
2)View Transform - Filmic
3)Look - None
4)Sequencer - sRGB
Right or wrong?
I also have Exposure set to -6, probably because of one of the addons, not sure which, but I blame Starlight.
Everything works just fine, but I do have to bump White Point in Extractor to about 50-60 (30-ish if the render comes from Eevee, not Cycles).
There is sadly quite a bit of confusion on this front, so I’ll try to address each of your settings in order. Doesn’t hurt to have it around in more places.
This setting relates to the colourimetry of your physical hardware display. If you have an sRGB-like display, “sRGB” is the proper setting. If you are on an Apple display device, however, it is almost certainly the wrong setting; default Blender does not include the proper transforms for Apple’s hardware, the exception being the MacBook Air, which is an sRGB display.
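For reference, "sRGB-like" means the hardware response approximates the standard sRGB transfer function from IEC 61966-2-1. A minimal sketch of the encoding side of that function:

```python
def srgb_oetf(linear):
    """Encode a display-linear value in [0, 1] with the standard sRGB
    piecewise transfer function (linear toe, 2.4-power segment above)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1.0 / 2.4) - 0.055
```

An actual display that deviates from this curve (as much Apple hardware does) needs its own transform, which is the whole point of the Display setting.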
This is the transformation that is applied to the radiometric working data. Filmic will perform a gamut squashing on radiometric renders. There’s also a “Standard” which represents the display’s hardware response, gamut clipping etc. Blender’s default configuration is not the cleanest and easiest configuration to understand.
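As a rough illustration of the gamut clipping that "Standard" amounts to, here is a minimal sketch, assuming the view boils down to a per-channel clamp to the display range (`standard_view` is a hypothetical name for illustration):

```python
def standard_view(rgb):
    """Per-channel clamp to the display range: everything beyond the
    display maximum is clipped, which skews hue and discards detail.
    Filmic instead compresses those out-of-range values gracefully."""
    return tuple(min(max(c, 0.0), 1.0) for c in rgb)

# A bright, slightly out-of-gamut value gets flattened:
standard_view((1.5, 0.5, -0.1))  # -> (1.0, 0.5, 0.0)
```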
The Look will apply creative warpings of your data. These creative warpings will render differently depending on which View you have selected for your appropriate hardware Display device. In Filmic’s case, the Looks listed are designed as a sort of complement to each other, so the various contrasts are designed to work in tandem with the Filmic View transform.
Note that Blender has a peculiarity in that “None” look corresponds to the Base Contrast look under Filmic. You shouldn’t see any difference between selecting “None” and “Base Contrast”, but there is indeed a creative look applied when selecting “None” here.
The VSE in Blender lives in its own little bubble, and as such, the way it handles pixels is a completely sloppy and broken mess. The above setting chooses how the VSE works with pixels, but it is not directly analogous to the proper transforms etc. outlined by the traditional OpenColorIO pipeline. This is a sidebar; just let it be noted that it doesn’t handle pixels at all properly, and everyone would be helluva wise to stay way the hell clear of the VSE until / unless it is ever fixed.
This looks “Right” for rendering things, context depending.
This looks like a very strange value, but indeed addons can contribute.
Not sure what White Point represents here, nor the value scale, but this too feels strange. Might be worth further digging. Under an sRGB-like piece of hardware, when R=G=B the output colour is a very specific idealized mixture that is close to a daylight D65 colour. This will appear achromatic once adapted.
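To illustrate the R=G=B point: pushing equal code values through the standard sRGB-to-XYZ matrix lands exactly on the D65 white point. A small sketch, with matrix values from the sRGB specification:

```python
# sRGB (IEC 61966-2-1) RGB -> XYZ matrix, D65 adapted white
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def rgb_to_xyz(rgb):
    """Convert a display-linear sRGB triplet to CIE XYZ."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in SRGB_TO_XYZ)

# R=G=B=1 yields the D65 white point, XYZ roughly (0.9505, 1.0, 1.0890),
# which appears achromatic once the viewer is adapted to it.
```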
Plugins that muck with white point frequently are problematic, as they typically can mishandle the adaptation etc. It’s a good policy to be somewhat skeptical about things like this.
Depends on context. It cannot be stressed enough that mixing between the two radically different internal designs within Adobe products versus Blender can lead to all sorts of headaches for image makers. Solving those issues can be extremely challenging.
Thanks a lot!
Now I am properly confused, haha! I guess I will go with the flow, as images look OK after tweaking in AE. Thankfully I am working with only one editor and I think we can manage this, but I agree it’s not ideal.
I am mostly concerned about the linear workflow as shown in the video above. Everything works with the settings I’ve mentioned above, but I need to knock exposure down (after Extractor but before OCIO).
Does VSE setting impact rendered information in openexr in any way? Or is it only internal to Blender?
I’m still confused as what it does, really.
Nonetheless, thank you very much for the amazing answer!
This is part of that “completely different internal models” idea above.
Adobe products began life and continue to be display referred manipulation tools. They manipulate code values in relation to an output device.
Path tracing software and in turn things like Blender, Nuke, Fusion, etc. all provide mechanisms to manipulate actual radiometric values.
The idea of “linear workflows” leaves poor folks utterly confused as a result. There are in fact two types of linear: one is radiometrically linear with respect to an open-ended range of scene values, and one is linear with respect to a display.
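A toy illustration of the difference: a display-referred pipeline can only represent code values within the output device's range, so open-domain radiometric values are destroyed the moment they enter it. (`display_referred_clamp` is a hypothetical name for illustration.)

```python
def display_referred_clamp(value):
    """A display-referred pipeline only knows code values relative to an
    output device; anything beyond the display maximum cannot exist."""
    return min(max(value, 0.0), 1.0)

scene_value = 4.2  # a perfectly valid radiometric emission, ~4.5 stops over grey
display_value = display_referred_clamp(scene_value)  # detail destroyed: 1.0
```

Both pipelines are "linear", but only the scene-referred one preserves the energy relationships a path tracer actually computes.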
Adobe products can sort of be coaxed to behave in the latter form, but you will find flakey results. It is more or less impossible to have an Adobe product handle things in the former case, as it simply has too much tech debt and internal legacy assumptions.
It has a specific impact on the state of the data within the VSE context. If you stay away from the VSE, you are fine.
In terms of data encodings, there’s a hellish series of broken and ropey ideas in Blender. General rules are:
All 8 bit encodings will bake in the Display, View, and Look transforms when selected as such. This is fundamentally problematic in contexts where your display differs from the desired encoding you are attempting to achieve.
All 8 bit encodings will be unassociated alpha. This is fundamentally problematic.
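To see why unassociated alpha is fundamentally problematic, consider a naive 2x downscale (averaging neighbouring pixels): with straight alpha, the colour of a fully transparent pixel bleeds into the result, while associated (premultiplied) alpha filters correctly. A minimal sketch:

```python
def average_straight(p, q):
    """Naive filtering on unassociated (straight) RGBA: the colour channels
    of a fully transparent pixel still contribute to the average."""
    return tuple((a + b) / 2.0 for a, b in zip(p, q))

def average_associated(p, q):
    """Filtering on associated (premultiplied) RGBA behaves correctly,
    because transparent pixels carry zero colour energy."""
    def pre(px):
        return (px[0] * px[3], px[1] * px[3], px[2] * px[3], px[3])
    return tuple((a + b) / 2.0 for a, b in zip(pre(p), pre(q)))

opaque_red = (1.0, 0.0, 0.0, 1.0)
transparent_green = (0.0, 1.0, 0.0, 0.0)  # colour should be invisible

average_straight(opaque_red, transparent_green)    # green leaks into the fringe
average_associated(opaque_red, transparent_green)  # no leak: (0.5, 0.0, 0.0, 0.5)
```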
EXRs at half and full float will be dumped from the render buffer out, with no transforms applied, and with associated alpha. This is problematic to Adobe software, as the code base is completely bungled with respect to alpha. Also, given the EXR is dumped without any transformed data, the entire Filmic stack must be applied separately as per videos and tutorials such as the one above.
Dumping out all of the places where one software is going to bungle you in another would make for a nasty long bungled thread. This is why knowing a little about specific contexts is better than guesswork. There are just too many variables to catch you out!
Thank you so much for putting it all together, it helped me a ton!
Despite the fact that it did not work completely as expected (the .exr was still the wrong colour and saturation), your tutorial gave me a solid direction. After Effects 2020, Blender 2.90.1, macOS 10.15.6.
After digging around for a while I found Natron (a free open-source compositor), and it ships with a bunch of configs, much like Blender. Natron > Contents > Resources > OpenColorIO-Configs > nuke-default > config.ocio
Input Space: Linear
Output Space: CLog
After Effects > Project Settings > Linearize Working Space = Off
These settings produced results closest to Blender Cycles Filmic Render.
Well, I understand that this is the idea behind the plugin, and there are just literally three clicks - not much to get wrong.
But somehow the images don’t match for me in After Effects.
P.S. I was able to copy the contents of the folder from Natron > Contents > Resources > OpenColorIO-Configs and paste it in as /Library/Application Support/OpenColorIO; now the OCIO Configurations drop-down will have all those options.