Dealing with ACES, AgX, and sRGB

First, let's talk about monitoring ACEScg in Blender and how to save it. This is without changing the OCIO config, which might be preferable depending on your workflow or how often you use ACES.

Blender 4.0

  1. First, I recommend enabling the GPU compositor in the compositor options (set Execution Mode to GPU) and, in the viewport shading options, setting Compositor to Camera.

  2. Set Blender's view transform to Raw (you will now be managing the output yourself).

  3. In the compositor, create two Convert Colorspace nodes. Set the first to convert from Linear Rec.709 to ACEScg, and the second from ACEScg to sRGB (this is your output transform, so you can see what you are doing).

  4. That's it. You can now see how your ACES image will look in the viewport when looking through the camera, or while grading in the compositor.

Important: disable (mute) the ACEScg-to-sRGB node before saving your image, and save to EXR. A scripted version of this setup is sketched below.
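For reference, here is a minimal bpy sketch of steps 2 and 3 (the GPU and viewport options from step 1 are set in the UI). It assumes Blender 4.0's default config and its color space names ("Linear Rec.709", "ACEScg", "sRGB"); check the names in your build.

```python
# Minimal sketch of steps 2-3, assuming Blender 4.0's default OCIO config;
# the color space names may differ in other builds or configs.
import bpy

scene = bpy.context.scene
scene.use_nodes = True
scene.view_settings.view_transform = 'Raw'  # step 2: manage output yourself

nodes = scene.node_tree.nodes
links = scene.node_tree.links

render = nodes.new('CompositorNodeRLayers')

to_aces = nodes.new('CompositorNodeConvertColorSpace')
to_aces.from_color_space = 'Linear Rec.709'
to_aces.to_color_space = 'ACEScg'

to_srgb = nodes.new('CompositorNodeConvertColorSpace')  # monitoring only
to_srgb.from_color_space = 'ACEScg'
to_srgb.to_color_space = 'sRGB'

out = nodes.new('CompositorNodeComposite')
links.new(render.outputs['Image'], to_aces.inputs['Image'])
links.new(to_aces.outputs['Image'], to_srgb.inputs['Image'])
links.new(to_srgb.outputs['Image'], out.inputs['Image'])

# Before saving the EXR, mute the monitoring node:
# to_srgb.mute = True
```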

Issues:

  • You have no tone mapping or gamut mapping, so for monitoring you're getting a straight conversion to sRGB, clipping and all (a numeric sketch of what that means follows this list).
  • If using the ACES transform in DaVinci Resolve, choose the sRGB texture output to see your image correctly (in that plugin, the plain sRGB output has tone mapping built in, so the image will look different).
  • In DaVinci you can also use the Color Space Transform to go from ACEScg linear to your timeline color space (choose a good working space), then from your timeline to sRGB (choosing gamma 2.2 instead will look slightly different).
  • Many of Blender's color grading nodes have issues with out-of-gamut colors and modify your image even before you change a setting. Some are worse than others. Keep this in mind when bringing in wide-gamut images that you need to send back out.
  • If bringing in ACES images from other apps, make sure they are saved in ACEScg, i.e. ACES AP1 primaries with a linear transfer function.
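To make the first issue concrete, here is a hedged numpy sketch of what "straight to sRGB" does: a 3x3 matrix from ACEScg to linear Rec.709 (the commonly published Bradford-adapted values; verify them against your config), a hard clip, and the sRGB encode. Anything outside the Rec.709 gamut or above 1.0 is simply thrown away.

```python
# Sketch of the "no tone mapping, straight to sRGB" monitoring path.
# The matrix is the commonly published ACEScg (AP1, D60) -> linear Rec.709
# (D65, Bradford CAT) transform; verify against your OCIO config.
import numpy as np

ACESCG_TO_REC709 = np.array([
    [ 1.70505, -0.62179, -0.08326],
    [-0.13026,  1.14080, -0.01055],
    [-0.02400, -0.12897,  1.15297],
])

def srgb_encode(x):
    # Piecewise sRGB transfer function (IEC 61966-2-1); expects x in [0, 1].
    return np.where(x <= 0.0031308, 12.92 * x, 1.055 * x ** (1 / 2.4) - 0.055)

def acescg_to_srgb_clipped(rgb):
    lin709 = ACESCG_TO_REC709 @ np.asarray(rgb, dtype=float)
    lin709 = np.clip(lin709, 0.0, 1.0)  # hard clip: no tone or gamut mapping
    return srgb_encode(lin709)

# A bright, saturated ACEScg red clips straight to the display's maximum red:
print(acescg_to_srgb_clipped([4.0, 0.05, 0.05]))
```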



Not perfect but might help someone in a pinch.

In the next post I will tackle mixing AgX and sRGB in the same composite using similar conversions.

5 Likes

To composite with sRGB elements outside of the AgX transform:

  1. Enable the GPU compositor in the viewport and compositor.
  2. Set Blender's color management to Standard (or Filmic if you prefer).

  3. For items that you want under the AgX transform, use Convert Colorspace nodes to go from Linear Rec.709 to AgX Base sRGB, and then from sRGB back to linear.

  • We go back to linear so we can do our compositing with the other elements.
  • In the picture below I convert into AgX Log and back out, because that's where I like to add my contrast looks and certain grading choices.

  4. You can use a contrast node with a pivot, or whatever you like, for your contrast before converting to AgX Base. As I said, I like to do mine inside AgX Log. (Now you have a contrast look you can adjust, and you can also grade after the AgX transform.)

Keep the nodes you want influenced by AgX to the left of (upstream of) the converts; any other elements you add into the composite after them will not be affected. A scripted sketch of the chain follows.
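Here is a minimal bpy sketch of the conversion chain from step 3, again assuming the Blender 4.0 default config names ("Linear Rec.709", "AgX Base sRGB", "sRGB"):

```python
# Sketch of the AgX "bake-in" chain from step 3: elements you want under the
# AgX transform go Linear Rec.709 -> AgX Base sRGB -> (sRGB -> linear), so
# compositing with plain sRGB elements can continue in linear afterwards.
# Color space names assume Blender 4.0's default config; check your build.
import bpy

scene = bpy.context.scene
scene.use_nodes = True
nodes = scene.node_tree.nodes
links = scene.node_tree.links

render = nodes.new('CompositorNodeRLayers')

to_agx = nodes.new('CompositorNodeConvertColorSpace')
to_agx.from_color_space = 'Linear Rec.709'
to_agx.to_color_space = 'AgX Base sRGB'

back_to_linear = nodes.new('CompositorNodeConvertColorSpace')
back_to_linear.from_color_space = 'sRGB'      # treat the AgX result as sRGB
back_to_linear.to_color_space = 'Linear Rec.709'

links.new(render.outputs['Image'], to_agx.inputs['Image'])
links.new(to_agx.outputs['Image'], back_to_linear.inputs['Image'])
# ...mix other (non-AgX) sRGB elements after back_to_linear, then Composite.
```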

I will do some tests with AgX Base sRGB to see how well it works under an AgX view transform for sRGB elements.

2 Likes

Hello, I have some questions. Should the textures and HDRI be in sRGB for output in ACES?
And is what I see in the viewport with the ACES-to-sRGB node how it will look in the compositor (Nuke)?

Great question. Will answer it later when I get home. I knew I left some things out.

1 Like

The point at which we convert to ACES is after the textures have been brought into Blender's working space, Linear Rec.709. So we can tweak the whole scene both before and after converting to ACES. Because we can see the conversion while we work, we can tweak anything that drifts too far from our choices.

  • Blender lacks a color-managed color picker that would help you make precise choices.
  • ACES has a white point of D60 compared to Rec.709's D65, so there will be a slight shift, but you can see this shift as you build your scene and add a grade node to correct it to your liking (a numeric check follows this list).
  • Some things might require a small shift with the Hue Correct node, maybe to pull the reds or other colors where you want them as a whole, while for others you can do that grading/shifting with a node right after the texture input. Most things won't require much, if anything at all, but others might need more attention. (Keep an eye on your hue, exposure, and saturation.)
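If you want to see that white point shift numerically, here is a hedged sketch using the colour-science package (my own addition, not part of the workflow above; assumes colour >= 0.4). Converting Rec.709 white into ACEScg with no chromatic adaptation shows the D65-to-D60 drift; with a Bradford CAT the white stays neutral.

```python
# Sketch of the D65 vs. D60 white point shift, using the colour-science
# package (https://www.colour-science.org); colour >= 0.4 assumed.
import colour

white = [1.0, 1.0, 1.0]  # linear Rec.709 / sRGB white (D65)

no_cat = colour.RGB_to_RGB(white, 'sRGB', 'ACEScg',
                           chromatic_adaptation_transform=None)
bradford = colour.RGB_to_RGB(white, 'sRGB', 'ACEScg',
                             chromatic_adaptation_transform='Bradford')

print(no_cat)    # slightly non-neutral: the shift you grade out by eye
print(bradford)  # ~[1, 1, 1]: adaptation keeps white neutral
```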

So, let's say I create my scene and the red is overall too magenta. I will add a Hue Correct node after the conversion to ACES in the compositor and pull my reds where I want them. Then maybe this shift fixes everything except a stop sign in my scene, in which case I go back to that stop sign's texture and shift its reds so it looks good.

This whole process works better for a whole scene than for creating textures to save out one by one.

If you're trying to use this for baking out textures for a game or assets, then switching to an ACES OCIO config might be the better way. But you can also test whether the program the textures are going to will convert a texture you modified in Blender to look correct after its conversion to ACES.

  • e.g., I save the sRGB texture that I modified to look good out of Blender, then open it in the other program to see if it keeps my intent after converting to ACES (a round-trip check is sketched after this list).
  • I would turn off any white point conversion in the other apps I'm going to, because my corrections were made without one.
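A hedged sketch of that check with PyOpenColorIO, assuming an ACES 1.x config on disk; the color space names ("Utility - sRGB - Texture", "ACES - ACEScg") and the config path vary between installs, so treat them as placeholders:

```python
# Round-trip check: take an sRGB texture value tweaked in Blender and see
# what it becomes after another app's conversion to ACES, and back.
# The config path and color space names below are assumptions; adjust them.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile('/path/to/aces/config.ocio')

to_acescg = config.getProcessor(
    'Utility - sRGB - Texture', 'ACES - ACEScg').getDefaultCPUProcessor()
back_to_srgb = config.getProcessor(
    'ACES - ACEScg', 'Utility - sRGB - Texture').getDefaultCPUProcessor()

pixel = [0.80, 0.20, 0.15]                   # an sRGB value tuned in Blender
in_aces = to_acescg.applyRGB(list(pixel))
round_trip = back_to_srgb.applyRGB(list(in_aces))
print(in_aces, round_trip)  # round_trip should match the original pixel
```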

This all sounds complicated, and Blender needs more color management nodes.

So if you're working on a scene in ACES to add some balloons and trees to a video, great. But if you need to send out a bunch of assets, run some tests to see that things work in the program they are going to.

Do a small test in Nuke to see if things match up. They should, unless the sRGB output has tone mapping or other processing going on. I will try to test this in Nuke, but everything works in DaVinci.

I mostly do video work, so I need to run some tests with assets in other programs. If I missed anything, let me know.

2 Likes

Tried opening the render in Nuke. Using Nuke's default color management will give you a hard time unless you render out to ACES2065-1 (AP0), because there is no ACEScg (AP1) option (if there is, I didn't see it). Switch to OCIO management in the project settings. Make sure you're using sRGB texture for your output and that your viewer output is set to Raw.

I also tried baking textures out of Blender and saving them as sRGB. There seems to be no issue as long as your color management is set up right (using the default ACES OCIO config). Use Utility - sRGB - Texture as the input transform on the Read node, then on the OCIOColorSpace node go from the scene-linear ACES working space to matte paint (sRGB texture). Again, make sure the viewer output is set to Raw.

If I were working on a scene in Nuke, I would keep the OCIO node close to the end of the graph, since it is managing my output. You can also have other OCIO nodes if you need to juggle spaces within your graph; just make sure you always come back to your working space. A scripted sketch of this setup follows.
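For reference, a hedged Nuke Python sketch of the Read + OCIOColorSpace setup described above; the color space names follow a typical ACES 1.x config and the file path is a placeholder, so adjust both to your setup:

```python
# Sketch of the Nuke setup described above, via Nuke's Python API.
# Color space names assume a typical ACES 1.x config; the path is a placeholder.
import nuke

read = nuke.createNode('Read')
read['file'].setValue('/path/to/baked_texture.png')
read['colorspace'].setValue('Utility - sRGB - Texture')  # input transform

# Output transform near the end of the graph: working space -> sRGB texture.
ocio = nuke.createNode('OCIOColorSpace')
ocio['in_colorspace'].setValue('ACES - ACEScg')  # scene-linear working space
ocio['out_colorspace'].setValue('Utility - sRGB - Texture')
ocio.setInput(0, read)

# Set the Viewer's output to Raw so this node is the only output transform.
```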


In the scene with the people, the order from left to right is: baked image (sRGB), original, baked (viewport with the ACES-to-sRGB transform), baked (sRGB to ACES in Nuke).

Edit: if I were using the ACES OCIO config in Blender, I could probably just use the default sRGB output in Nuke's viewer, but I have not tested that.

1 Like

I have to say, the more I try AgX, the more skeptical I am. I have a smart light bulb at home that can produce some extremely saturated colors. When I look at the light from the bulb directly with my own eyes, I see a nice deep red color even in the highlight. Only when I take a picture on my phone does the phone tonemap the image so that the red fades into white around the highlight where the bulb is near the wall. But once again, when I raise my head from the phone screen to look at reality, there is no fade to white; it's just a really deep, intense red color.

If the point of these types of color transforms is to bring rendered data as close as possible, on a limited-dynamic-range monitor, to what the eye would see in the real world, then AgX is even a step back from Filmic. AgX seems to go on way too aggressive a crusade to defeat hue shifting around highlights, and it gives up pretty much everything else for it. It's certainly not better in terms of making computer imagery match real-world colors more closely.

This is the photo I took and how it looked with the Pixel 6's image processing; the hue shift is there even in the photo. It's kind of similar to how Filmic would handle it:


This is a quick image edit to reproduce what it looked like to my own eyes; there was really no hotspot:

And this is how AgX would make it look:

To me it seems that both Filmic and AgX fail to represent the real-world perception of saturated color. It's just that they fail at it from opposite sides of the problem.

5 Likes

It's called "Filmic" for a reason, I guess… And AgX is also filmic, just the next version of it. It's all about photographic film, not the eyes… It's also about preserving detail in the image and fitting real light into what a monitor can display. It also lets you use a wider range of colors… But the eyes stop seeing color at some brightness too; your light bulb is just not bright enough. I also don't think you should use it only as-is: you can adjust the colors if you don't like them. It's just a tool that makes it possible to see intense colors, and the gradients between them, in a more natural way.

The image you show does not look like it used the AgX transform. How did you make it?

Oh and by the way, cameras don’t necessarily deal with color correctly.

1 Like

It was just a quick paintover in an image editing app to show roughly how it looked. My general point is that AgX goes way too far with desaturation. While Filmic had the problem of color shifting in saturated highlights, AgX has the problem of making saturated highlights impossible to achieve (I am saying saturated, not oversaturated). When using AgX, I am just choosing a different problem to fight instead of solving the problem.

1 Like

It would be interesting to actually try applying AgX to a photograph like that. I think the gradient would be way nicer than what you show.

My personal opinion is that if you concentrate on the saturation of highlights, it might seem like a problem, especially if you don't do anything to adjust it to your liking, but the whole picture looks more natural. To me at least. And you can always edit it and make the colors more saturated; in that case you simply have more detail in the gradients with AgX. So, more control. Personally I like it. Obviously, nobody has to use it if they prefer something else. Problems with ACES seem way more random and inconsistent to me personally, but I haven't experienced problems with Filmic myself; I just never have images saturated enough for that to happen. The people around me (interior designers) don't seem to like vivid colors at all. AgX works great for that anyway.

Unfortunately, I haven't taken a raw photo with it. I will check if the Pixel 6 can do that. Either way, to me, not being able to achieve any degree of saturation in the highlights is more of a dealbreaker than hue shifts would ever be. Luckily, Filmic is still available in 4.0, and ACES is there as well, so it's not that big of a deal.

What about the actual detail? Have you considered that you can have visible detail with AgX where otherwise it just disappears and blends into a single color and tone?

For me, what's relevant is this: I expect the color transform I use to be off, since the average monitor doesn't have nearly the dynamic range the human eye can perceive, but I don't expect it to be that far off from what my own eyes see. I don't need the superhuman recognition of detail in the highlights a tone mapper would provide. I just want my computer monitor to be as close to a hypothetical portal into an alternate reality as it can be, so the better a color transform/tonemapper makes that happen, the happier I am.

With AgX, even a 100,0,0 super red will fade to white:


Now, the Tapo smart light bulb I have can get very saturated, but probably not as red as light can possibly get in the real world, so it's probably more like 100,1,1 if we compare against something like a pure red laser. And that would come out like this:

I guess the point I am trying to make is that if even an unrealistically red color (an extremely narrow band at a specific wavelength) fades to white this abruptly with AgX, then AgX just seems to be throwing away too much saturation information.

I think part of the problem is that AgX was developed on examples like the lightsaber one, where completely 100% saturated colors were used, but then compared with movie shots where real light sources were used. Those real light sources hardly had a perfectly narrow wavelength that would reproduce a completely pure, super-saturated color.
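If anyone wants to probe this numerically rather than by paintover, here is a hedged PyOpenColorIO sketch that pushes the two test values through the AgX transform using Blender's bundled config; the config path and the color space names assume a Blender 4.0 install and are placeholders to adjust:

```python
# Sketch: push the 100,0,0 and 100,1,1 test values through AgX numerically,
# via Blender's bundled OCIO config. The path and color space names
# ("Linear Rec.709", "AgX Base sRGB") assume Blender 4.0; adjust for yours.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile(
    '/path/to/blender-4.0/datafiles/colormanagement/config.ocio')
agx = config.getProcessor(
    'Linear Rec.709', 'AgX Base sRGB').getDefaultCPUProcessor()

for rgb in ([100.0, 0.0, 0.0], [100.0, 1.0, 1.0]):
    print(rgb, '->', agx.applyRGB(list(rgb)))
# If the output channels converge (R ~ G ~ B), the highlight has been
# desaturated toward white, which is exactly the behavior being debated.
```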

4 Likes

AgX just looks nicer on a real image. It's not often that I need to render a red light shining on a plane alone. More detail is visible, more color, more even gradients; the images just look fuller, more natural… and, yes, less saturated. Not a problem for many people.

1 Like

Sure, I am not denying that at all.

Unfortunately, it is way more difficult to work with, in my experience. It is even more unfortunate that everyone I asked how they get specific looks with it told me they were using other software for compositing… So far, literally everyone (I am sure there are exceptions, of course).
And the more people start to use AgX, the more questions about it appear on this forum from confused users who don't understand why this and that happens.

3 Likes

I think AgX is a very reasonable choice for a color transform. Still, it is a choice, and it's a matter of opinion what kind of tone mapping one prefers. It also depends a lot on the circumstances: it has no advantages for most non-photorealistic renders, for example. So I guess AgX will not work for absolutely everyone. The same goes for Filmic or any other way to manage colors, by the way. But what else should the default be? ACES has problems, Filmic has problems, the "Standard" sRGB transform is no good at all… What else is better? AgX improves the situation a lot on some very serious problems and has many advantages. That doesn't mean it is going to be the perfect solution for absolutely every case. With OpenColorIO in Blender you have plenty of flexibility for other options if you need something else. And 4.0 has some improvements in color management in general; AgX is not the only change.

Confused new users… Well… What are you going to do? Nothing can solve that problem, especially in the area of color management. This is no different from any other aspect of 3D. New users are confused about pretty much everything.

Edit: actually no, not only new users. There are professionals who have worked in the industry for decades who are confused about color management as well. Color management is confusing, and mostly broken everywhere. It's a difficult thing. I think it's natural that people are going to be confused about some aspects of it in Blender as well.

Anyway… how exactly is it difficult to work with in your workflow?

1 Like

I pick sRGB colors for objects and I get weird results with unpredictable hue shifts, depending on how bright the scene is.
I would like to make things pop a little bit more, which is very difficult to control, because as soon as I start changing the lighting, the hue shifts change again.

My solution is to go back to Filmic, which gives predictable results, even though it clearly has several weaknesses.

It doesn't matter who uses it. The default should give somewhat predictable results, a good starting point.
If that's not possible and additional complexity is needed, I have no problem with that, as long as there are ways to deal with it, like helpers in the compositor or whatever. But with AgX we get a high-end tool without the helpers that would make it easier to control.

The cases where those weaknesses of Filmic show clearly are not that common. I think that’s a viable option.

Well, that's a flawed workflow. You cannot really do that in a scene-referred workflow. You cannot pick an sRGB color and expect it to match what you render; there are multiple reasons why not. Firstly, Blender uses linear colors to render, and sRGB values will not match linear values. Then you have scene lighting that changes colors, then multiple components in most shaders, then color management, whatever it is, Filmic or AgX; you would have to use "Standard" if the other reasons did not apply, and anyway, they all do apply. It's a flawed workflow, and it wouldn't be reasonable to expect the devs to adjust defaults around a wrong way to work.

You could adjust colors by judging visually. You should match your scene's lighting to your actual viewing conditions if it's a real thing you are trying to match, or to the conditions the reference photographs were taken in. The color you judge should be exposed correctly in the render tests and in the reference. There will be no hue shifts then, and since you judge the complete shader, you will automatically take all its components into account, not just the albedo, as well as everything else that happens during rendering, since you will be looking at your reference and render tests. AgX is going to do what it does after that. This is what is meant to happen; the hues are supposed to shift. If you don't accept that, then it's certainly not going to work for you…

No, I don't expect it to match. My expectation, when I pick for instance some green color somewhere on the sRGB triangle between white, green, and black, is that the resulting color I see on the screen is on that triangle, or at least very close to it.
When I use Filmic and blast that area with a lot of (white) light, it essentially multiplies the sRGB value by a certain amount, giving a color in Linear Rec.Something colorspace that is still on the same plane as the original triangle from which I picked the sRGB color.
What Filmic does is bring the color back to sRGB, and guess what: it ends up within that triangle again.
So I pick an sRGB color, fancy stuff happens, and the outcome is predictable.
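That scaling claim is easy to check numerically: multiplying a linear RGB color by an exposure factor leaves its chromaticity (its position on the triangle) unchanged, so whatever hue change then shows up on screen comes from the view transform, not from the lighting. A small numpy sketch:

```python
# Sketch: exposure scaling of a linear RGB color does not move its
# chromaticity; it stays on the line from black through the picked color,
# inside the original triangle.
import numpy as np

def chromaticity(rgb):
    # rgb normalized by its sum; constant under any positive scaling
    rgb = np.asarray(rgb, dtype=float)
    return rgb / rgb.sum()

picked_green = np.array([0.1, 0.6, 0.2])  # linear value of a picked color
for exposure in (1.0, 4.0, 32.0):
    print(exposure, chromaticity(picked_green * exposure))

# All three lines print the same chromaticity; any hue shift you then see
# on screen comes from the view transform, not from the brighter lighting.
```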

Why is that a flawed workflow? Why is that wrong?

Great. Do we have the tools in Blender to do that? I couldn't get it to work anywhere close to preserving a flexible workflow.
Whenever I make changes to the scene lighting, there are hue shifts, and every single color reacts differently to them.

If I had to replicate something from a reference, this workflow would make sense. But I am not doing that.

That's sort of my main critique of AgX. I am sure there are quite a few people who replicate realistic scenes in Blender, and I certainly see the benefits of AgX for that kind of work compared to Filmic.
But last time I checked, there are many other use cases for Blender, and there are many people who use workflows similar to mine: people who work with very low-poly objects, or people who create infographics. Besides that, there are people who work on games and model/texture in Blender. Even though game engines are also starting to support more extensive color management, the default is still the predictable workflow I explained earlier.

I understand how realistic rendering works, I understand how color transforms like AgX work, and I understand that hue shifts are expected and wanted depending on the transform.
And yet, there are a ton of workflows in 3D where this is not necessary, or rather where it is distracting or irritating.

1 Like

There's a difference between having a workflow for a particular color space, and then expecting that workflow to give the same pleasing (or better) result in every color space.

That's the "flaw".

Raw photos look terrible in Photoshop until you color correct them. They aren't meant to look nice straight out of the camera. If that's the goal, then one uses the JPG instead… Yes, the JPG has many inherent flaws and cannot be manipulated the way the raw can; thus, one doesn't try to apply the same process to one as to the other.