I’ve been trying to explain the importance of an OCIO node since OCIO was integrated; here, however, it isn’t the proper solution. Using proper assets is.
Quite the opposite.
If you do this in an NLE or compositor, you’ve blown your energy ratios and all subsequent manipulations will fall apart; blurs will look wrong, overs will look wrong, and so on. See the sketch below.
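To make the energy ratio point concrete, here is a minimal sketch in plain Python. The pixel values are arbitrary, and the encode function is the standard IEC 61966-2-1 sRGB transfer function; the same issue holds for any nonlinear display encoding:

```python
# A minimal sketch of broken energy ratios: a blur is, at its core, a
# weighted average of light. Averaging nonlinearly encoded values, as an
# unmanaged NLE or compositor does, yields a different (wrong) result.

def srgb_encode(x):
    """Linear light -> sRGB encoded (IEC 61966-2-1)."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

bright, dark = 1.0, 0.05  # arbitrary scene-linear emission values

# Correct: average the light itself, then encode for the display.
linear_first = srgb_encode((bright + dark) / 2.0)

# Wrong: average the already-encoded values.
encoded_first = (srgb_encode(bright) + srgb_encode(dark)) / 2.0

print(f"averaged as light  : {linear_first:.4f}")   # ~0.7517
print(f"averaged as encoded: {encoded_first:.4f}")  # ~0.6239, too dark
```

The same arithmetic mismatch is why overs, glows, and resizes all drift once the data is no longer scene-referred.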
This is a rather large subject, and worth understanding, but likely for another thread.
Hacking around a proper workflow will yield pretty awful results. Ideally your materials are physically plausible values that would hold up in any given reference space, subject to whatever camera rendering transforms and aesthetic looks are applied afterwards. Bending the values to fit the output is extremely problematic.
The prudent path is simply a fully pixel managed pipeline in Blender, but sadly that isn’t even on the map of current changes. I can explain this further, but reading the examples over at Cinematic Color would likely be a better solution; it is endorsed by the Visual Effects Society and offers good workflow practice. In particular, look at page 33 under Compositing.
Remember too that not long ago none of the developers were aware of pixel management. Sadly, a good number still ignore it, or collectively don’t feel it is a core issue, despite it being the very basis of the entire media pipeline.
Part of the onus is on the pixel pushers to understand what is at stake and take the totality of the context into consideration; there are no workarounds or easy solutions to complex problems. Simply pixel manage the entire pipe, and provide the UI elements as outlined in that other thread.
That comes at an added “cost” of “complexity”, much as sitting in the cockpit of a commercial jet does versus a smaller plane, but it seems the community is ready to take that responsibility on and help educate the newer folks who might be confused by the dramatic differences.
It’s trying to untie an impossible knot. Start from proper assets, properly transformed for use in the work: HDRIs, camera encoded “raw” linear files, log encoded footage, etc.
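As a rough sketch of what “properly transformed” means in practice, the following assumes the OCIO v2 Python bindings; the config path and colourspace names are hypothetical stand-ins, and the correct ones depend entirely on the config in use:

```python
# A sketch only: the config path and colourspace names are placeholders.
import numpy as np
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("config.ocio")

# Transform the camera's log encoding into the scene-referred reference
# space *before* any pixel pushing happens.
processor = config.getProcessor("ARRI LogC", "ACES2065-1")
cpu = processor.getDefaultCPUProcessor()

# A dummy 2x2 RGB float plate standing in for log encoded footage.
pixels = np.array([[[0.10, 0.20, 0.30], [0.40, 0.50, 0.60]],
                   [[0.20, 0.30, 0.40], [0.50, 0.60, 0.70]]],
                  dtype=np.float32)

cpu.applyRGB(pixels)  # converted in place to the linear reference

# Every blur, over, and grade now operates on scene-referred data; the
# display rendering transform is applied only at the tail of the chain.
```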
While reading through these issues might seem depressing, be thankful that all of you are now discussing them. That wasn’t the case about eighteen months ago.
It’s also up to everyone to understand that we are talking about rendering data of some form to a display. That means no one-size-fits-all simple solution works, as we can never know the particular context. It can only be properly solved with a proper pixel management pipe. The other thread has more details for those interested.
While everyone is oohing and aahing at a procedural online viewport, it might be worth taking a step back: solving the alpha issues and the resultant file encoding problems, and insisting developers properly pixel manage the entire pipeline, are things to get firm before moving any further forward.