Adobe RGB display device

Hello, I must be completely missing the point… ^^

So, I’ve read so much about color theory, color spaces, etc., up and down, multiple times. I’m not a pro (well, only a professional software developer), but I think I got it - I hope. What I do not understand is why there is no “Adobe RGB” display device available within the scene’s color management settings. I also didn’t find anyone asking this before - so there must be something I don’t understand. :slight_smile:
What I could do is set the display device to “none” and do the color conversion in the compositor. But all I find is information about gamma correction and the like - nothing about a conversion from linear space to Adobe RGB.

However, there are two additional things that come into play: Windows’ color management ICC profiles, and the “use EDID information” setting in the graphic card’s (CCC) options. No, my screen (Dell 3008WFP) is not calibrated, but this doesn’t matter now as I’d like to understand everything first.

My sources of information, apart from Wikipedia, were

http://wiki.blender.org/index.php/Doc:2.6/Manual/Render/Post_Process/CM_And_Exposure
http://www.photozone.de/the-very-basics?PageSpeed=noscript
http://www.itp.uni-hannover.de/~zawischa/ITP/wirkungen.html
http://www.filmscanner.info/Farbtemperatur.html
http://www.brucelindbloom.com/

and many more (that I do not remember). Many make contradictory or inconsistent statements (even after reading them dozens of times) or don’t use precise wording, so it has been hard to figure out what’s right or wrong over the past month.

I’m pretty sure I’m not the first one experiencing this. But my actual question is why there is no Adobe RGB display device available in Blender.

Thanks!

I’m not exactly clear on what you need.

I’m guessing you’re not looking to convert from linear to Adobe RGB by eye?

You can use custom LUTs in Blender. Generally, Blender (and the internet) assumes sRGB. Obviously you can correctly display sRGB on a (correctly set up) wide gamut display.

There is more information on Blender’s colour management here:-
http://wiki.blender.org/index.php/User:Sobotka/Color_Management/Blender_CM_Integration
http://wiki.blender.org/index.php/User:Sobotka/Color_Management/Calibration_and_Profiling

If you use a wide gamut display you might be interested in this part: -
http://wiki.blender.org/index.php/User:Sobotka/Color_Management/Calibration_and_Profiling#Q:_But_I_have_a_wide_gamut_display.21_I.27ll_be_wasting_my_purchase.21

If you need more, it might be best to talk to Troy.

With any luck Troy is on his way…

Greets Willi. A little bird sent me this way (hello 3pointEdit), suggesting that I might be able to assist with your query.

I’m rather flattered that folks suggested that I might be able to help you, but to be honest, the first poster here has come a long way and knows a pretty good deal about color and its management. There are a few other very knowledgeable peeps around these parts as well. Hopefully the ratio of artists that are well informed about color continues to grow.

There is a huge amount of misinformation out there, so be wary of all “knowledge.” Hopefully with a little effort you can learn enough about the subject to find all of the errors and omissions in my response, and I encourage you to fact check everything.

Color is a vast topic. Much like any subject, if you aren’t quite having the cognitive snap with color, you likely haven’t had it explained using a model that you can understand. Either that, or the tremendous body of misinformation is contradicting what you are beginning to understand. Don’t fret. It’s common.

Hopefully I can explain why there is no AdobeRGB display device by the end of this post, and in a manner that you will understand.

Unlikely. Most of the time artists are too ashamed to admit they don’t understand color. At all. I sure as heck didn’t, and I certainly was worried as hell that I was the only one.

So welcome to the club of confusion known as Peeps with an Interest In Color.

This would be impossible as there currently is no color transformation node in the compositor. It desperately needs one.

And to dismiss an infuriating chunk of misinformation; “Linear” is not a color space.

Say that a few times.

Here it is again:
Linear is not a color space.

If you find anyone insisting it is, feel free to send them this way and hopefully we can get them sorted out.

Linear is an attribute of a color space, not a color space unto itself.

sRGB could be linear. AdobeRGB could be linear. ProPhoto could be linear. Rec. 709 could be linear.

Further, there are two types of linears: Display Linear and Scene Linear.

More on this later.

If you hope to understand everything, give up now. Color is very much a journey and you tend to grow with it as an artist. It cascades into all sorts of nuances involving alpha, file formats, and a plethora of other details.

Just try to learn a little bit at a time, and keep learning.

It should be made very clear that ICCs (i.e. the files standardized by the International Color Consortium) are a very particular breed of color management. They are not color management in the canonized sense. There are many paths to color management, and ICCs take a very particular, display referred, graphic design approach.

Another approach developed at Sony Pictures Imageworks by Jeremy Selan is OpenColorIO (OCIO). It is the library that Blender uses, and is specifically geared for motion pictures, visual effects, and animation. Mr. Selan won an Academy Award in 2014 for it.

Let’s start with a very basic concept. Display or output referred imaging means that your values have a minimum and maximum setting.

If we pretend that we have three flashlights in our tricolored mixing system, we can easily see how they have a minimum and maximum setting. If they were of a particular white tint, we could figure out their color temperature and deliver a range of light from minimum / off, to maximum[1].

If our slider output light perfectly in line with a light meter test of the ratios of light, we would expect 50% of the maximum light when the slider was at the 0.5 position, 25% at 0.25, and 75% at 0.75. If the slider abided by the radiometric ratios of light in the real world, we could call it a linear response. If it did not, and perhaps had a curved response as we slid the slider, we could say it had a nonlinear response.
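The slider behaviour above can be sketched in a few lines (hypothetical toy functions of my own, not a real photometric model):

```python
# Fraction of maximum light emitted for a slider position s in 0..1.

def linear_response(s):
    # Radiometrically linear: half the slider gives half the light.
    return s

def nonlinear_response(s, gamma=2.2):
    # A curved response: half the slider gives far less than half the light.
    return s ** gamma

for s in (0.25, 0.5, 0.75):
    print(s, linear_response(s), round(nonlinear_response(s), 3))
```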

If you and I were to try and create a tricolor system, we could agree that we need to go out and cut three different gel colors: red, green, and blue. You and I could go and snip out whatever we can find from transparent plastic colored bags or whatever, and tape it on the end of our flashlights.

Now our flashlights would be dimmable from black to full maximum red, green, and blue.

Critical point here is to note that no matter how dim nor how bright we make the flashlights, the color is exactly the same. The same color red. The same color green. The same color blue. The color of the primaries is constant no matter how intense nor how dim the setting.

Our next issue is that the colors we have selected for our gels are likely different. In color terms, the color primaries are different. If you and I both dial our flashlights to 0.2 red, 0.5 green, and 0.7 blue on the sliders, no matter how hard we try, the resulting colors will be radically different. The intensities of the flashlights might well be radically different as well, and the net result is that there would be no way to have both of our homemade tricolor systems show us identical color results.

The range of colors that our two three-color systems could produce would be different. They would have different gamuts.

This is precisely how the subpixels in your computer display work.

To draw back to our display model, the exact same principle holds; our displays are vastly different from one another. The color primaries of the subpixels that make up a pixel differ, and the intensity response to the values we send them differs as well. In fact, even between identical display models, the primaries and responses will differ. This is why we need to pay keen attention to color management, and likely profile our displays if we are doing color critical work[2].

Your display, the Dell 3008WFP, does in fact have both an sRGB and an AdobeRGB mode. It likely also has an even wider gamut mode that runs when you set it to native. In each instance, the primaries of the red, green, and blue subpixels are different, and the resulting output color will be different, even if we send identical values!

So the short answer to your question “Why is there no AdobeRGB display device selection” is largely because the developers haven’t included it. In reality, most displays on the market do not offer an AdobeRGB display mode setting either, which would adjust / emulate your display’s primaries to match the primaries of the AdobeRGB specification. This is why sRGB has been chosen as a default; it is a baseline reference standard.

The deeper answer is that Blender assumes sRGB primaries as the internal working reference model. That is, the color of the three lights are assumed to be according to the sRGB / 709 specification. Even more deeply, Blender has no real internal knowledge of primaries, and you would need to flex your color knowledge to be able to import an AdobeRGB image into the sRGB model.

If you would like to set an AdobeRGB output mode, this too is possible with OCIO, but likely a wee bit out of reach just yet for you. Further, even with the AdobeRGB output display setting configured, the images you send to it, if correctly transformed from sRGB, would look identical as those you viewed on an sRGB display! The working reference space, being sRGB, is smaller than AdobeRGB, and therefore cannot ever display the more saturated colors that AdobeRGB can.
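For the curious, here is roughly what that “flex your color knowledge” would look like in code: a sketch of an AdobeRGB to sRGB conversion through XYZ. The matrix values are taken from Bruce Lindbloom’s site, the function names are my own, and a real pipeline handles far more detail (both spaces share the D65 white point, so no chromatic adaptation is needed here):

```python
import numpy as np

# AdobeRGB (1998) -> XYZ, D65 (values from Bruce Lindbloom's site).
ADOBE_TO_XYZ = np.array([
    [0.5767309, 0.1855540, 0.1881852],
    [0.2973769, 0.6273491, 0.0752741],
    [0.0270343, 0.0706872, 0.9911085],
])

# XYZ -> linear sRGB, D65.
XYZ_TO_SRGB = np.array([
    [ 3.2404542, -1.5371385, -0.4985314],
    [-0.9692660,  1.8760108,  0.0415560],
    [ 0.0556434, -0.2040259,  1.0572252],
])

def srgb_encode(c):
    # Piecewise sRGB transfer curve (the "gamma" step).
    return np.where(c <= 0.0031308, 12.92 * c,
                    1.055 * np.power(np.clip(c, 0, None), 1 / 2.4) - 0.055)

def adobe_to_srgb(rgb):
    # Decode AdobeRGB's pure power curve (exactly 563/256, ~2.2),
    # convert primaries via XYZ, clip out-of-gamut values, re-encode.
    lin = np.power(np.asarray(rgb, float), 563 / 256)
    srgb_lin = XYZ_TO_SRGB @ (ADOBE_TO_XYZ @ lin)
    return srgb_encode(np.clip(srgb_lin, 0.0, 1.0))

print(adobe_to_srgb([1.0, 1.0, 1.0]))   # white stays white
```

Note the clip: saturated AdobeRGB colors simply have nowhere to go in the smaller sRGB gamut, which is exactly the point made above.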

Hopefully this extremely brief explanation has been informative and not confusing. There’s much more to this puzzle than I could hope to cover in a single post. Sometimes when you travel down the color rabbit hole, you lose a reference as to what is a simple concept and what is more complex.

If you are interested or have any questions, feel free to either ask them here in public or reach me via email.

With respect,
TJS

If you found any of this interesting, I’d encourage you to read the following document by Mr. Selan which covers some of the details regarding color and may be a great starting point foundation:
http://cinematiccolor.com/

[1] There are illumination sources such as tungsten that shift color as the voltage is dimmed. We aren’t concerned with this in the interest of brevity and clarity of concept.
[2] Calibration and profiling are two discrete processes. Referring to a calibrated display is different from referring to a profiled one.


That is such a cool explanation, thanks Troy!

I have a question. Playing with color management, I see that I have a display device setting - is this what you have discussed? A transform for the linear color processing inside Blender? Does that only occur when linear is active? Does Blender assume gamma-altered media first?

It seems that the ‘Render:’ settings (default/RRT/film etc.) affect the display, not the rendered image?

Further, why doesn’t the sequencer provide a rec709 gamut? In the display area they are different.

Wow, a lot of information. I will read it thoroughly. Beforehand, I just wanna say thank you. I’ll be back… :wink:

EDIT: Actually… even before replying and asking questions, I have to read up on some things - and get lost in an infinite circle again. ^^ So it will take some time (next to my daily work).

But I think what is not mentioned is that a color space is only valid if the final viewing program supports it. So I could spend days tweaking colors, only to have my color space entirely ignored by the player when I render my output.

The best analogy I can think of is speakers. Let’s record a $100,000 cello in a perfect studio at 1 billion bits resolution. OK, now it’s time to mix down. Uh oh. I have to make a deliverable choice. So I choose MP3 and it finally gets played on the beach, coming out of some ten-cent radio speaker. The result of the cello playing back on the beach is nowhere near the initial recording in the studio. Do we care? No. Well, maybe a little. All we can do is try to preserve the quality as the data passes through our pipeline from source to deliverable.

So the same is true with color.

Further, why doesn’t the sequencer provide a rec709 gamut?

Isn’t REC709 the default color space for players that cannot detect a color space? That is why we can get a color shift when we render to H264 containers. The H264 file format, and others, does not even support a color space, so it blindly interprets color based upon the REC709 specification.

Hmm, I ask as I often send media to a TV pipeline and get clipped gamut in the shadow regions. Could be an issue with QuickTime though.

I’m definitely bookmarking this thread, Troy, thanks for taking the time to outline such good information on the infamous color management area.

Could be an issue with QuickTime though.

Quicktime H264 is certainly messed up. But I find that I can successfully deliver a “burned-in” color space to a JPEG2000 encoded MOV file.

Largely yes, although I attempted to approach it on a lower level that made the explanation relevant for all color management knowledge.

The “linear” you reference is a bit of a misnomer. If you render an image, the data is scene linear, and can range from zero to infinity.

If you import an sRGB image, it would be linearized to display linear’s zero to (typically) one.

The two can coexist, but at their core they are radically different models and an artist awareness of the differences can be extremely useful and rewarding.

The default transform applies a proper sRGB transfer curve on the data, in the display referred domain of zero to one. This is only the default however, and other transforms can be applied with a little knowledge and understanding.

Does that only occur when linear is active? Does Blender assume gamma altered media first?

Generally, a media’s color space attributes are dependent on file format. Some software such as Blender may make assumptions.

Blender’s internal 32 bit float reference space is scene linear, zero to infinity.

Blender’s default byte buffers are assumed non-linear.
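A tiny illustration of that difference (a naive sketch of my own, not Blender’s actual quantization code):

```python
# Scene linear data is open ended: values above 1.0 are perfectly
# meaningful (a bright sky, a lamp filament). A display referred byte
# buffer, by contrast, clamps everything into 0.0..1.0 before
# quantizing to 0..255.

scene_linear = [0.0, 0.5, 1.0, 4.0, 16.0]

def to_byte(v):
    # Naive display referred quantization: clamp, then scale to 8 bit.
    return round(min(max(v, 0.0), 1.0) * 255)

print([to_byte(v) for v in scene_linear])
```

Everything above 1.0 collapses to 255, which is exactly why the two models have to be kept distinct.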

It is possible to bake a transform into the image using the View as Render option, and assuming a correct file format is chosen.

Not sure your question here, so I will ask you one back.

If Rec.709’s and sRGB’s primaries are identical, as well as their white points, are the color gamut volumes between the two the same? :wink:

Not necessarily. It can depend on resolution and other factors.

This is incorrect.

See http://en.wikipedia.org/w/index.php?title=H.264%2FMPEG-4_AVC for more information regarding color spaces and H264. Some wrappers also contain specific atoms for color space information.

Video encoding is another issue entirely. While it has color management aspects, it is a subject that requires specific attention and detail.

With respect,
TJS

This is a great resource I will bookmark - read it all, but need to digest a bit :smiley:

Video encoding is another issue entirely

Video is what I was talking about, not just still image output.

REC709 and sRGB are not the same at all. REC709 was invented in the 1990s, and sRGB came many years later with the invention of digital photography. That is why the gamma shift occurred worldwide. The new digital cameras used sRGB. Prior to that, we were flatbed scanning photo prints.

It all comes back to the player.

I can take an sRGB image and display it on the screen with its color space honored, but as soon as I turn that into a video, a color shift will occur, because the sRGB color space cannot be embedded into a video. The pixels are there, but not all players know how to interpret them correctly, so many players just fall back to REC709 as the color space.

My observation is that color correction runs from image input to screen, but it is not a full pipeline to output, because different output containers perform differently.

This is all easy to test by conducting your own experiments.
Here is an example. The Bee in the upper part of the image was rendered with the H264 preset. A color shift has occurred. You can visually see this when compared to the original sRGB source image. There is no Blender setting that will fix this because the problem is in the deliverable container format.

Attachments: [image - the Bee rendered with the H264 preset (top) compared against the original sRGB source]
First, I should apologize to Willi for taking this thread into tangential territory. Sadly a little spurt of misinformation has appeared, and I am unable to leave it for fear it may confuse an artist trying to extend their color knowledge.

However the original poster’s question had nothing to do with this, and therefore I was attempting to keep the thread from being hijacked.

I have no idea why you appear so adversarial. Worse, you are more than willing to offer up absolutely incorrect information. This is problematic.

With specific regard to the gamut volume defined by the RGB primaries, ITU-R BT.709 and sRGB are identical. The white point is D65. The red, green, and blue primaries are identical.

The sole difference between REC.709 and sRGB is in their intensity curves. In other language, their transfer / tone response curves are different. This however, as per the three flashlight example offered above, only affects the intensity of the given color, much the same way that the color of a barn is no different given bright sunlight versus slightly dimmer sunlight passed through a neutral density filter.

I would hope that anyone that made this assertion would back it up with facts. To this end, let’s examine what Charles Poynton offers up for a 709 to absolute XYZ transform:

http://www.pasteall.org/pic/show.php?id=72883

If you look to the matrix in the middle of the page, you will see the transform into XYZ.

If you now compare the results to Bruce Lindbloom’s site, you will see that under sRGB D65 (http://www.brucelindbloom.com/index.html?Eqn_RGB_XYZ_Matrix.html) the values are:

0.4124564  0.3575761  0.1804375
0.2126729  0.7151522  0.0721750
0.0193339  0.1191920  0.9503041

Note how the values are identical to four decimal places at least?

This is because the primaries are identical, and assuming the primaries are accurate to within an equal degree of depth, the matrices to transform both D65 sRGB and D65 REC.709 are identical.

Another citation with additional decimal points:

0.412390799265959   0.357584339383878   0.180480788401834
0.21263900587151    0.715168678767756   0.0721923153607337
0.0193308187155918  0.119194779794626   0.950532152249661

Finally, let’s compare what the actual reference specifications have to say about the issue of primaries, or the color of each of the three color lights that produce the resulting color for sRGB and REC.709:

From http://www.color.org/chardata/rgb/srgb.xalter:

Chromaticity co-ordinates of primaries:
R: x=0.64, y=0.33, z=0.03;
G: x=0.30, y=0.60, z=0.10;
B: x=0.15, y=0.06, z=0.79.

Note: these are defined in ITU-R BT.709-3 (the Television Standard for HDTV).

Note how it states “defined in ITU-R BT.709-3”.

and the screenshot from the ITU-R.BT.709:
http://www.pasteall.org/pic/show.php?id=72885

I encourage you to validate and double check these values with your own sources.
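One way to double check: derive the matrix yourself from nothing but the chromaticities quoted above. The helper below is my own sketch; feed it the shared REC.709 / sRGB primaries and the D65 white point, and the familiar matrix falls out - identical primaries and white point can only produce an identical matrix.

```python
import numpy as np

def rgb_to_xyz_matrix(primaries_xy, white_xy):
    # Columns are the XYZ coordinates of each primary, scaled so that
    # RGB = (1, 1, 1) lands exactly on the white point.
    P = np.array([[x / y, 1.0, (1.0 - x - y) / y]
                  for x, y in primaries_xy]).T
    wx, wy = white_xy
    white_XYZ = np.array([wx / wy, 1.0, (1.0 - wx - wy) / wy])
    return P * np.linalg.solve(P, white_XYZ)

# REC.709 / sRGB primaries and the D65 white point:
M = rgb_to_xyz_matrix([(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)],
                      (0.3127, 0.3290))
print(np.round(M, 7))
```

Tiny differences past the fourth decimal come down to how precisely the D65 white point is specified.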

Now let’s consider the intensity / transfer / tone response curves.

They are different. REC.709 specifies an encoding curve that averages out to approximately a gamma of 1.9. It is a two part curve. sRGB, on the other hand, specifies an encoding curve that averages out to approximately a gamma of 2.2. It too is a two part curve.

If we take the entire gamut covered by the display referred REC.709 space and linearize it via an inversion of its transfer curve, and take sRGB and do the same, the two color spaces reference the identical color gamut volume.
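To make the difference concrete, here is a sketch of both curves straight from the published formulas (a toy comparison; the function names are mine):

```python
def srgb_encode(L):
    # IEC 61966-2-1 sRGB transfer curve (display oriented, two part).
    if L <= 0.0031308:
        return 12.92 * L
    return 1.055 * L ** (1 / 2.4) - 0.055

def rec709_encode(L):
    # ITU-R BT.709 OETF (camera oriented, two part).
    if L < 0.018:
        return 4.5 * L
    return 1.099 * L ** 0.45 - 0.099

# Same linear light in, different code values out:
for L in (0.1, 0.18, 0.5, 0.9):
    print(L, round(srgb_encode(L), 4), round(rec709_encode(L), 4))
```

Invert each curve and the identical linear value comes back out, which is why the gamut volumes are the same even though the encoded numbers differ.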

This too, is sadly incorrect.

If one abides by broadcast scaling for encoding, the REC.709 data will have its luma (Y’) scaled to (in 8 bit levels) 16-235. Chroma (Cb’ / Cr’) will be encoded to (in 8 bit levels) 16-240.

Upon viewing however, a non-buggy player will counter-stretch those values back to their proper full range values. Ultimately this is a net difference of imaging bit depth which may increase the chance of posterization in an image. It will not however, produce any shift in hue[1].

Even with all of that said, if you transform the scaled RGB values into xyY, the chroma, or resultant color, maintains precisely identical xy coordinates. The intensity will appear different if you view the raw, broadcast encoded and scaled values, but the color itself suffers absolutely zero hue shift.
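A toy sketch of that scale-and-counter-stretch round trip (luma only, my own functions; real encoders also quantize, which is where the bit depth loss creeps in):

```python
# 8-bit broadcast ("legal range") luma scaling: full range 0..255 is
# squeezed into 16..235 on encode, and a non-buggy player stretches
# it straight back on display.

def to_legal(code):
    return 16.0 + code * (235.0 - 16.0) / 255.0

def to_full(legal):
    return (legal - 16.0) * 255.0 / (235.0 - 16.0)

print(round(to_legal(0)), round(to_legal(255)))   # the legal endpoints
print(to_full(to_legal(128)))                     # round trips to mid grey
```

The intensity math shifts and shifts back; the chromaticity coordinates never move, so no hue shift can result from the scaling itself.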

Different containers will specify different protocols. The color spaces and scaling issues are entirely tangential, and correct implementations will perform identically.

Perhaps you should have phrased your observation in the form of a question, instead of forwarding your incorrect inference as fact?

Apologies if this seems rather snotty and adversarial, but I cannot help but think you approached it equally.

If you would like to sort out the issues with your video encoding, I would be more than happy to assist you. Perhaps we can even discover something interesting along the way. I suspect the difference in intensity is due to either:

  • Observing the pre-scaled REC.709 transform.
  • A malformed encode.

With respect,
TJS

[1] Assuming that one’s eye is adjusted to the ITU recommendation for viewing under sRGB and REC.1886 for sRGB and REC.709 respectively.

No need to apologize. I’m happy about any opportunity to clear things up. I just started reading everything again from the start (the color matching experiment, etc.) in order to at least be able to ask further questions. Thank you a lot for your time! It might take a while until I come back, due to the complexity of this subject.
BTW, here is another link that seems to be trustworthy for those interested in the matter: http://www.handprint.com/LS/CVS/color.html

Perhaps there’s a simpler answer as to why there is no Adobe RGB space in Blender…

It’s a proprietary standard, right?

Blender tries hard to stay away from proprietary anything. That’s why it’s no longer possible to open Photoshop files. And since the biggest user of Adobe RGB is Photoshop, well…

It is impossible to claim a reference to a color encoding scheme as proprietary. At least currently ;).

They may lay claim to their own trademarks of name however.

From http://www.adobe.com/digitalimag/adobergb.html

Legal note regarding color-space naming: Only the Adobe RGB (1998) ICC profile created by Adobe Systems Incorporated can accurately be referred to as “Adobe RGB (1998).” ICC profiles created by other vendors, even if they conform to the color image encoding described in the Adobe RGB (1998) color image encoding document, cannot be referred to as “Adobe RGB (1998).” If vendors choose to create their own profile according to this specification, and they want to indicate to their customers that this profile was written in accordance with Adobe’s specification, then an alternate phrasing is required, such as “compatible with Adobe RGB (1998).”

Yeah, doesn’t look good when they start making rules about how people refer to a thing, though. Eventually, they’ll get around to claiming they have a patent on the sun and therefore we need to start paying royalties every time we perceive colour. :spin:

There are patents for specific commercial colours however. Should blender remove access to those? :stuck_out_tongue: