Bitmap Textures Color Management

I have a long list of questions I tried to figure out but I’m stuck at the moment and would appreciate some help.

I’ve been trying to bridge some gaps in my knowledge regarding bitmap textures, how they are saved, and how best to use them in Blender’s Image Texture node.

  1. My main question is regarding color management of textures.

Let’s take the PBR workflow, where we usually have Diffuse, Roughness, Metalness, Normal, and Height maps. What color profiles would you use for each type of map?

I assume it would be like this:

Diffuse - sRGB
Roughness, Metalness, Height - Gray Gamma 2.2
Normal - ?

In general the thought would be: if the image has color, go with sRGB; if it’s grayscale, go with Gray Gamma 2.2. Is that right? I’m not sure how to color manage the Normal map since it’s more of a data texture. Should it be sRGB as well? I mean externally?

  2. Next question regarding Roughness, Metalness, Height (grayscale) maps. We can save them as 8bit files, correct? As far as I understand there is no reason to save them as a 24bit image, since the info is the same in all 3 channels. Please let me know what you think.

  3. As far as I gathered, currently it’s best to save textures as PNG. It’s lossless and has most features a bitmap texture could use. It’s also universally supported. I think the other option would be TIFF? Or is there something else I should consider?

I know there is JPG, but I consider it an outdated file format and wouldn’t use it unless there were no other options.

  4. When working with the Image Texture node there is an option for controlling color space. I understand that you should use the correct setting in each case.

sRGB for sRGB color space.

Non-Color for Normal maps and other maps like Roughness which are only grayscale images. At this point I have a sub-question: what does Non-Color actually do? Does it disregard the image’s color profile?

Linear. Do we use it exclusively with 32bit images? And what is the difference from Linear ACES?

What is the use for the other options (Raw, XYZ, Filmic Log)? Can you suggest use cases? I tried to find this info in the Blender docs, but it’s outdated there.

I’d love a good conversation around this topic and thank you in advance for your help.

The file format means almost nothing. BMP, PNG, JPG, whatever-- the file format says only that it’s clamped to 0,1 and has 3 or 4 channels, not what the data actually represents.

I’m not sure what you mean by “gray gamma 2.2”-- like raising to the 2.2 power? In nodes? No, you probably shouldn’t do that. Do you mean setting Blender’s gamma in render settings to 2.2? No, that only affects the render’s final transform; the standard-ish 2.2 gamma transform is already applied by the display device and view transform.
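For reference, a “gamma 2.2” decode is just a power curve. A minimal Python sketch, with illustrative values only, showing how it relates (approximately, not exactly) to sRGB:

```python
def gamma_decode(c: float, gamma: float = 2.2) -> float:
    """Decode a gamma-encoded channel value (0..1) back to linear: a plain power curve.
    Note: this only approximates sRGB's piecewise curve; they are not identical."""
    return c ** gamma

print(gamma_decode(0.5))  # ~0.218 (an sRGB decode of the same value gives ~0.214)
```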

In general, you should control the color space used by your images by setting the color space in the Image Texture nodes that use them. Set it to sRGB for diffuse color; Non-Color for roughness, metalness, height, and normal.

But there’s no reason you can’t bake some kind of transform into these images. Roughness, for example, very frequently has some kind of transform. Not necessarily an sRGB transform… You use your images how they were designed to be used. If you don’t know, you ask the person that made them. If you can’t find the person that made them, you start with some reasonable guesses and see if it looks okay.

8 bit is generally going to be sufficient for roughness and metalness, but for height, used as a bump map, 16 bits is often preferable.
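The reason 16 bits helps for height is quantization: a channel can only represent so many distinct levels, and a bump map turns those steps into visible terracing on smooth surfaces. A quick sketch of the step sizes, assuming the map encodes a normalized 0..1 range:

```python
def step_size(bits_per_channel: int) -> float:
    """Smallest height difference a channel of this depth can represent,
    assuming values are spread evenly across a normalized 0..1 range."""
    levels = 2 ** bits_per_channel
    return 1.0 / (levels - 1)

print(step_size(8))   # 1/255   -> coarse enough to show banding in smooth gradients
print(step_size(16))  # 1/65535 -> 256x finer, usually smooth enough for bump maps
```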

.PNG and .TIFF are both fine image formats. .JPG will probably still be around after you’re dead.

Color spaces in Blender are crazily redundant, I personally think it’s a mess. Use sRGB or non-color. Or XYZ/filmic log if you have images made for that… How do you know? Ask the person that made the texture. Otherwise, just ignore everything else.

When Blender samples an image texture, it first loads the raw color value. If it is told that it is a sRGB color image, it runs that value through a function. If it is told that it is in XYZ color, it runs it through a different function. If it is non-color (or linear, or a few other options that all mean “no transform”) then it doesn’t run it through any function, it just uses the raw values it sees.
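For what it’s worth, the “function” for sRGB is the standard piecewise transfer curve. A simplified Python sketch of the per-sample logic described above (ignoring alpha, clamping, and the other color spaces):

```python
def srgb_to_linear(c: float) -> float:
    """Decode one sRGB-encoded channel value (0..1) to linear light,
    using the standard IEC 61966-2-1 piecewise curve."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def sample(raw: float, colorspace: str) -> float:
    """Toy model of a texture lookup: transform the raw stored value
    according to the color space the image is tagged with."""
    if colorspace == "Non-Color":
        return raw                    # "no transform": use the stored value as-is
    if colorspace == "sRGB":
        return srgb_to_linear(raw)    # decode to linear before shading
    raise ValueError(f"unhandled color space: {colorspace}")

print(sample(0.5, "Non-Color"))  # 0.5, untouched
print(sample(0.5, "sRGB"))       # ~0.214, decoded to linear
```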

Many good answers. Thank you.

.PNG and .TIFF are both fine image formats.

It confirms what I thought and answers my 3rd question.

Color spaces in Blender are crazily redundant, I personally think it’s a mess. Use sRGB or non-color. Or XYZ/filmic log if you have images made for that… How do you know? Ask the person that made the texture. Otherwise, just ignore everything else.

When Blender samples an image texture, it first loads the raw color value. If it is told that it is a sRGB color image, it runs that value through a function. If it is told that it is in XYZ color, it runs it through a different function. If it is non-color (or linear, or a few other options that all mean “no transform”) then it doesn’t run it through any function, it just uses the raw values it sees.

Got it. Thanks for confirming what I suspected.

I’m not sure what you mean by “gray gamma 2.2”

Yeah, sorry, I wasn’t really clear here with my question. I’ll try to elaborate. I have several texture collections from different companies. Some are 4K, some are 8K, and there are a couple of issues I see with these collections. One collection has 24bit PNGs for Roughness, Metalness and Height maps. It seems redundant to have 2 identical extra channels, so I could convert them from 24bit RGB to 8bit grayscale without any loss, save on drive space, and probably make it easier on the render. Does it make sense?

The other issue is that some textures have color profiles and some don’t. Some have sRGB, some Adobe RGB, and for grayscale Dot Gain 20%, Gray Gamma 2.2, etc. Some collections don’t have any color profiles at all, so it’s all over the place. I don’t know the best practices of bitmap texture creation, so I’m trying to figure out if there is a “golden standard”, or at least a best practice, for how these textures should be saved so they occupy only as much space as reasonably necessary and can be delivered and used seamlessly across applications/renderers.

When I was talking about 8 bit, 16 bit, I was talking about bits per channel. A 24 bit texture is likely to be 8 bits per channel, with 3 channels, right? Yes, if you have a 3 channel roughness map, you could convert it to some kind of grayscale map and make it smaller, without losing any precision. Probably not worth the trouble unless you’re making a game; and if you’re making a game, it would be smarter to instead combine roughness/metalness/height into a single 3-channel image, which will let you sample all three values with a single texture lookup, faster than doing lookups on 3 different grayscale images.
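The channel-packing idea can be sketched in a few lines. This is a toy model, with plain Python lists standing in for real image buffers, just to show why one packed RGB texture replaces three grayscale lookups:

```python
def pack_rgb(roughness, metalness, height):
    """Interleave three grayscale maps (same pixel count) into one RGB buffer,
    so a shader can fetch all three values with a single texture lookup."""
    assert len(roughness) == len(metalness) == len(height)
    return [(r, g, b) for r, g, b in zip(roughness, metalness, height)]

def unpack(rgb, channel):
    """Read one map back out: channel 0=roughness, 1=metalness, 2=height."""
    return [px[channel] for px in rgb]

packed = pack_rgb([10, 20], [30, 40], [50, 60])
print(packed)             # [(10, 30, 50), (20, 40, 60)]
print(unpack(packed, 2))  # [50, 60] -> the height map
```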

Where are you seeing these color profiles? Photoshop? I’m not familiar with images with embedded color profiles. If you can, convert them all to a single color profile like sRGB. Blender doesn’t support Adobe RGB input, for example (as far as I’m aware). While I’m not familiar with all this stuff, be warned that it could just be an accident: textures intended to be used for color data in rendering are likely to be designed for sRGB transforms, regardless of what Photoshop says.

As far as “golden standard” the standard would probably be to use sRGB for color, linear (raw, non-color) data for, well, non-color data like normals, roughness, etc.

As far as ideal, ideal would be if we never had this non-stop headache that is gamma correction, and everything was stored as linear (non-color) data, including color itself. But that’s definitely not standard.

A 24 bit texture is likely to be 8 bits per channel, with 3 channels, right?

Yes, I was talking about 8 bit per channel. You’re right.

Probably not worth the trouble unless you’re making a game;

I’m not sure it’s not worth the trouble. I was able to go down from ~25GB of uncompressed, unoptimized texture maps to ~8GB per collection without losing anything important. I have enough drive space, but I prefer not to waste it anyway, and I intend to use render farms eventually. It should help ease the process of loading the textures to the render farm, as far as I understand.

Not sure how it’ll affect the render time locally. I read up on it, and it doesn’t look like it matters much, since render engines usually load images raw, and resolution matters more for how much RAM a texture occupies than for gains in render speed. But for cloud rendering it should still help.
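The decompressed-in-RAM point can be checked with back-of-the-envelope arithmetic: once loaded, a texture costs roughly width × height × channels × bytes per channel, regardless of how small the PNG was on disk (ignoring mipmaps and any renderer-specific caching):

```python
def texture_ram_mib(width: int, height: int, channels: int, bits: int) -> float:
    """Approximate RAM footprint of a decompressed texture, in MiB.
    Illustrative only: ignores mipmaps, padding, and engine-specific formats."""
    return width * height * channels * (bits // 8) / (1024 ** 2)

print(texture_ram_mib(8192, 8192, 3, 8))  # 192.0 MiB as an 8K 8-bit RGB map
print(texture_ram_mib(8192, 8192, 1, 8))  # 64.0 MiB as an 8K 8-bit grayscale map
```

So dropping the two redundant channels cuts in-memory size by 3x here, even though on-disk PNG compression would already have squeezed the duplicate channels well.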

As far as “golden standard” the standard would probably be to use sRGB for color, linear (raw, non-color) data for, well, non-color data like normals, roughness, etc.

As far as ideal, ideal would be if we never had this non-stop headache that is gamma correction, and everything was stored as linear (non-color) data, including color itself. But that’s definitely not standard.

Yes, I get it. It would be easier, but it is what it is right now. I’d love for Blender to fully support an ACES color workflow. I’m not sure if it does, but I’m also looking into how I can build my workflow completely around ACES principles.

Anyway, thanks for the help. I have my answers now.