Figuring out material color management ONCE AND FOR ALL

This doesn’t work.

You should loop back and cover some of the foundational concepts. It doesn’t work under a scene referred model, it confuses the issue, it doesn’t work for alternative colour spaces, etc. It is simply a mess. Avoid it.

As a general rule, when folks use the term “gamma”, it is pretty safe to assume they are quite confused and muddling up concepts.

You should be able to use a Filmic contrast instead of the power function for contrast. That should give a more film-emulsion-like falloff. All of the Filmic contrasts are designed for viewing on a 2.2 power function display and should, in most instances, deliver a smoother ramp up out of the shadows and into the highlights.

Doesn’t Blender bring all other color spaces into linear RGB, where all these operations take place?

For example, if I had a TIFF image in L*a*b* and brought it into Blender with an OCIO profile, Blender then makes sure the image is in RGB with a linear transfer for its operations, then uses the output profile to convert the image for saving and display. Either way, the operations happen in RGB? (Trying to understand, not trolling.)

Best for an alternate thread. A mod can move relevant posts.

With a proper OCIO config file, wouldn’t a L*a*b* or ProPhoto image be converted properly (absolute or relative) to Blender’s RGB model? (I think it’s linear sRGB, not sure.)

Best for an alternate thread to avoid cluttering the main issue here.

I have to say, this is a sorta depressing answer to me, haha.

Technically, I think my question has been answered (if I check it as answered, will the thread lock?). I have a deeper understanding of what’s going on, but philosophically, from a creator’s perspective, it’s a very frustrating answer.

The problem is that I’m working in sRGB and I’m getting results that are based on “hidden” data that doesn’t exist visually. Any artist will tell you that wrestling with the technical aspects of their tools is a death knell, and I now have to expect operations won’t work the way I think they should. I’m now having to add switching color modes to my entire material workflow in order to see if there is “hidden” information.

In the grunge example earlier, I can see how the process is working accurately, but an accurate process doesn’t give me a “correct” result. The bump map is still influenced by information invisible to the render, and it still needs to be corrected via gamma/contrast/whatever in order for the material to look correct.

The photoshop solution seems to me more correct, if less accurate.

I’m with you buddy, haha.

In the grunge example earlier, I can see how the process is working accurately, but an accurate process doesn’t give me a “correct” result. The bump map is still influenced by information invisible to the render

This entire thread you have never provided an example or even a detailed explanation of what this incorrect bump mapping is. Could you do so? I’m really curious what the root issue was. You started the thread with some confusion about what you were seeing with the viewport, but I get the impression you were already troubleshooting something else when you found that. What was it?

Ya, I apologize, because I can’t show the specific materials that I’ve been running into issues on; they’re property of my work, which is why I had to invent the grunge example earlier. Let me try to clarify with a file. Hi-res JPEG incoming:

I understand why this is working now… switching off color management shows me that the color data really does have grunge that continues that far down. But that information isn’t in the sRGB render, and so the linear bump information needs to be fixed to match the visual sRGB image for the render to look correct.

Sorry if that’s not what you were asking for! I don’t mean to beat a dead horse, haha.


Ok, yeah, in this case adding a ^2.2 operation right before the bump node does help. Although without it, it actually looks just as wrong (IMO) even without the view transform on the color channels. But yeah, this is one of those “if it looks right, it is right” cases.
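For anyone following along, the ^2.2 fix can be sketched outside Blender too. This is a hypothetical Python sketch, not Blender internals; the function name and the clamp are my own assumptions. It shows what a Power node set to 2.2 does to the texture values feeding the Bump node:

```python
# Hypothetical sketch of a Power (2.2) node applied before a Bump node.
# The clamp and the function name are illustration-only assumptions.
def display_weighted_height(linear_value, exponent=2.2):
    """Raise a linear texture value to `exponent`, clamped to [0, 1]."""
    v = min(max(linear_value, 0.0), 1.0)
    return v ** exponent

# Dark grunge that is barely visible on an sRGB-ish display now
# contributes far less bump height than its raw linear value would:
for v in (0.05, 0.2, 0.8):
    print(v, "->", display_weighted_height(v))
```

The point is only that values the display crushes toward black get crushed in the height data too, so the bump follows what you actually see.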

You’ll probably find this to be a more straightforward way of adjusting it, though:


Your issue is a misunderstanding as to what you are looking at. In essence, it is entirely about pixel management.

I have a BFA. I can assure you, folks tossing “artist” around with little care or training are more of a problem than learning the technical details of a craft. Do you think a sculptor cares about the technical details of their clay or marble, or the technical details of their tools? Yes, yes they do.

I avoided chiming in, as the technical answers by J are on point. With that said, what is hurting you is your knowledge. I would encourage you to step back and learn about the craft of pixels; in turn, the rest will become very clear.

You are obsessing about sRGB, which is part of the problem. A bump map is data. It is not a colour. How you visualize data is another issue. Finally, Photoshop is another huge issue.

I am willing to try and walk you through your knowledge gaps, but you will need to invest in some simple questions first.

You seem confused about how data and colours differ?

Hi, I think you’re overcomplicating this stuff. It’s fairly easy. Look at most PBR shader setups and how the color data is set on the image texture inputs for the different kinds of data (albedo, roughness, normal map, bump map).

Albedo should be set as color data, because Blender then takes the color and makes it linear for the render engine automatically (linear workflow). You don’t need a gamma correction for this; Blender does it for you.

All the other image textures you load, as said, roughness, normal maps, bump maps, are data that is already stored as linear values, and that’s the reason you should set the image texture to Non-Color Data. Now Blender “knows” it is linear data and doesn’t need a gamma correction. Simple as that.
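As a rough sketch of that difference (this is the standard piecewise sRGB decode from the sRGB specification, not Blender’s actual internal code, and the function names are my own):

```python
def srgb_to_linear(v):
    """Standard piecewise sRGB decode: roughly what 'Color' image
    data gets on its way into the linear render pipeline."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def non_color(v):
    """'Non-Color Data' is passed through untouched."""
    return v

# A pixel value of 0.5 stored in an sRGB image:
print(srgb_to_linear(0.5))  # decoded for albedo-style color data
print(non_color(0.5))       # left alone for roughness/bump/normal data
```

Notice that the same stored 0.5 becomes roughly 0.21 when treated as color but stays 0.5 when treated as data, which is exactly why mislabeling a bump map as color shifts the result.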

Here you can read the Blender docs; it’s described even better there:
https://docs.blender.org/manual/en/dev/render/post_process/color_management.html

Edit: sometimes the roughness data or bump data is used differently in different render engines.
In particular, the non-color roughness output sometimes needs to be squared, for example (a Math node with Power 2), before plugging it into the Roughness input of the shader.
This depends, of course, on how the material data was stored and intended for the given render engine.
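The squaring mentioned above (a Math node set to Power with an exponent of 2) is trivial to express. Here it is as a small Python sketch; the function name is my own, for illustration only:

```python
def remap_roughness(r):
    """Square a [0, 1] roughness value, as some render engines expect.
    Equivalent to a Math node: operation Power, exponent 2."""
    return r * r

print(remap_roughness(0.5))  # 0.25: mid roughness becomes noticeably glossier
```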

After running a bunch of middle gray tests I started to see the problems. Thanks for the feedback!

I’m going to mark this thread solved, as I think my questions have been answered. I appreciate the work you guys have put in to help me figure this stuff out!

@troy_s I certainly think I have a basic understanding of what’s going on now, so I’m not sure I have many massive knowledge gaps here; I just need to dig in and relearn how to practically use some of these things. My remaining frustration is only at having to learn different workflows, or expect different operation results, in different software. I’ve been working professionally with Adobe products for many years; their solution has never failed me before. Please don’t ridicule that frustration!

@pixelgrip Righto, I think that’s a pretty clear practical explanation of how color and data interact. I definitely didn’t understand that the distinction is between color and data; since they’re both visual, I treated them the same. Oh, what a fool I was!

Just saying that the more you learn, the more you will learn how busted up Photoshop is when working in the scene referred domain. It is a large mental leap. Don’t underestimate it.

It really is part of the problem of understanding.

I know that stating this has no benefit for this thread, but I highly second troy_s on the remark that it is a large mental leap. I read professional colorist forums quite frequently, and while at first it amazed me how little people with decades of experience actually understand about color management (rather than just doing what they always do, which works), I’m not at all amazed any more. It is more the norm than the exception that people mess up even the most basic concepts, due either to software that presents things in a twisted manner (à la Photoshop) or to historical understandings that have lost their ground in new workflows. Just take a look at any ACES thread and you will see massive confusion arising.
