# Blackbody, Kelvin, sRGB, physical lights

So I recently understood that a light in Blender with a blackbody temperature of 6500 K is pure white. This corresponds directly to the sRGB white point of D65.

But from my experience in interior design, I know that a light temperature of 6500 would be very blue and cold looking. Typically we call lights that are around 3000-3500K “neutral white”.

How do these scales relate? Where does the difference come from?

I’m very interested in simulating the real world by using measured values and product specifications.

I think it’s the difference between color temperature as a physical quantity and the color temperature that light bulb manufacturers print on the box.

Use an IES library file if you want lighting similar to the actual light.

IES textures don’t contain color info, but they’re definitely useful for a more realistic look!

I read that wiki entry before, but I just realised they see 5000K as closest to white. Some searching of the web shows that D50 is a thing. I have more reading to do.

It all depends on your white point balance. 6500 K corresponds to white under a clear sky (which is blue).
In interiors where most light comes from artificial sources a different white balance might be needed or else 3500 K would look too orange.

The IES file was supplied by the lighting manufacturer.
Of course, the color temperature and output of the light are also listed.

However, for convenience we only use the lighting patterns that come up often, and we organize them as shown in the image below.

The point of this thread is that the blackbody node uses D65 as the white point, but lighting manufacturers use something else (D50?, still not sure). So to simulate light color correctly, you can’t just do what you showed in your screenshot. I don’t care about light distribution - yet - so IES is irrelevant here.

I don’t know if 3000-3500K is considered white in interior design, because whenever I see a light at 3200K, it always looks orange to my eyes. And whenever I buy a light bulb, anything over 6000K is labeled “cool white/bluish white”. So I think the “colour” of a lamp is measured relative to the white point used in photography, because…

As a photographer, I consider 5200-5600K white, because daylight film stock is calibrated to 5400K and tungsten film stock to around 3200K. That’s why I often set 5400/5600K as my white point in Blender (using the Photographer addon), so that I can set my lights as I would in real life: 6500K becomes blue-ish, and anything under 4000K becomes orange-ish.

AFAIK, BB temp uses D65 as white point because it is hard-coded to Rec.709/sRGB. That’s why, if you use a different colour space with a different white point (e.g. ACES), it’s recommended not to use BB temp and/or the Nishita sky (because those two are hard-coded).

I’m not sure why 5400/5600K is chosen as the white point of photography (daylight) film stock; maybe our eyes have a white point too, and it’s closer to those values.

So IMO, I agree that it comes down to the difference between the white point of our eyes (or what our eyes are used to seeing) and the white point of the colour space used in the software.
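To make that concrete: converting colours between white points is done with a chromatic adaptation transform. Here is a minimal Python sketch (pure stdlib, nothing Blender-specific) using the published Bradford matrix and the standard D50/D65 white points; the function names are mine, purely for illustration:

```python
# Bradford chromatic adaptation: re-express an XYZ colour relative to a
# D50 white point instead of D65. Illustrative sketch only; the matrix
# values are the standard published Bradford transform and its inverse.

BRADFORD = [
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
]
BRADFORD_INV = [
    [ 0.9869929, -0.1470543,  0.1599627],
    [ 0.4323053,  0.5183603,  0.0492912],
    [-0.0085287,  0.0400428,  0.9684867],
]
D65 = (0.95047, 1.00000, 1.08883)  # XYZ of the D65 white point
D50 = (0.96422, 1.00000, 0.82521)  # XYZ of the D50 white point

def mat_vec(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def adapt_d65_to_d50(xyz):
    src = mat_vec(BRADFORD, D65)   # source white in "cone" space
    dst = mat_vec(BRADFORD, D50)   # destination white in "cone" space
    cone = mat_vec(BRADFORD, xyz)
    # von Kries-style per-channel scaling, then back out of cone space
    scaled = tuple(c * d / s for c, d, s in zip(cone, dst, src))
    return mat_vec(BRADFORD_INV, scaled)
```

By construction, adapting the D65 white itself lands exactly on the D50 white; every other colour shifts accordingly.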


So now the question is: what value on whatever scale lamp manufacturers use corresponds to what value of the D65-based blackbody temperature?

That’s because that range is considered pleasant, as we’re used to comparing indoor lighting with evenings, which typically feature a fairly low sun. In order to make true 6500K lights “look natural” you’d also need tons of lux, around 1000 lux, which is quite a bit above what you would use for a normal work surface. 5000K+ being referred to as “cool”, even though it’s below the 5500K-6500K range, is because it’s cooler than that low-ish sun we consider normal and pleasant.


Ah, that makes sense: building and interior lamps are turned on in the evening, so our “reference white” at that time is the low-altitude sun (around 3800-4200K). Thanks!

I’m not sure if I understand your question correctly. The lamp temperature and the temperature in the BB temp node must be on the same scale, because Kelvin is the unit of measurement for blackbody temperature; the question is what our reference white is when looking at that scene.

We can’t say that we want our scene to look like what our eyes see, because our eyes are great at adapting to different temperatures. That’s not the case for a photographic image or a 3D render: an image can only have one reference white, which is why we set the white balance or choose the film stock on a real camera.

IMO, as long as the relationship between the different light temperatures in the scene is preserved, whatever our reference white is doesn’t really matter, because sometimes we use a “wrong” white reference as an artistic choice.

I hope that makes sense.

Human color perception is a weird thing. What counts as “white” is essentially arbitrary (not entirely, but a lot more than you might think), and different white points are conventional for different use cases. That’s the difference you observe: what is your white point? In your case, it’s basically indoors vs. outdoors.

IRL, this is usually not a problem: there is typically a single obvious choice for the white point in any given context, and everyone naturally just picks that and “color corrects” in their head, entirely subconsciously. So people don’t tend to disagree about the color of the dress of the person standing next to them, but when you’re looking at images, all bets are off. This allows for a variety of fun optical illusions of the sort I’m sure you’ve seen: identical pixels appear to be different colors, gray strawberries appear to be red, and a whole bunch of others.

So from a reproducibility perspective, you need to use something like the CIE XYZ color model, which is independent of this, and instead quantifies how much each of the three kinds of cones in the eye are stimulated.
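As a concrete illustration of going from display colours to that device-independent space, here is a minimal Python sketch (my own example, using the published IEC 61966-2-1 sRGB transfer function and matrix, not anything from the post above):

```python
# Convert an sRGB colour (components in 0..1) to CIE XYZ.
# Uses the standard IEC 61966-2-1 transfer function and matrix.

SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def srgb_to_linear(c):
    # Undo the sRGB "gamma" encoding to get linear light.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb_to_xyz(rgb):
    lin = [srgb_to_linear(c) for c in rgb]
    return tuple(sum(row[j] * lin[j] for j in range(3)) for row in SRGB_TO_XYZ)

# sRGB white (1, 1, 1) lands at roughly (0.9505, 1.0, 1.089) in XYZ,
# which is the D65 white point -- consistent with D65 being sRGB's white.
```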

Saw your post on the Cycles thread and wanted to mention that Blender’s bb node doesn’t correspond to values lamp manufacturers use.

So in color spaces, the white point basically sets the scaling of your colors.

A full (closed domain) color space looks something like this in XYZ space:

When picking primaries, you typically do so in the slice of XYZ space where X + Y + Z = 1, meaning the length of your primary vectors (equivalently, the intensity of the corresponding light sources) is not yet defined:

At this point only the directions are known. You can freely choose the lengths of the vectors. All this changes is the maximum intensity of each of your lights. The colors (as in, the lamps’ emission spectra) stay the same.

By convention, the white point is normalized such that Y = 1, making this the Luminance axis. But after that choice, you can still pick how bluish or reddish the white point for this color space is by changing what all three primary vectors sum to (i.e. what happens if you maximally turn on all three light sources) in the X and Z directions.

For the completely arbitrary choices I made here, the white point ends up having the coordinates

```
X = 1.1034
Y = 1.0000  # exact by construction, as is convention
Z = 1.3448
```

Which doesn’t correspond to anything meaningful I think, but just about any white point is possible.
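The “pick directions, then solve for lengths so everything sums to the white point” step can be sketched in a few lines of Python. This is my own illustration, using the sRGB primaries and D65 white as example inputs; the procedure recovers the familiar sRGB-to-XYZ matrix:

```python
# Given the (x, y) chromaticities of three primaries and a white point,
# solve for the length of each primary vector so that R + G + B = white.
# The scaled columns form the RGB -> XYZ matrix.
# Example inputs: sRGB primaries with a D65 white point.

def xy_to_xyz(x, y):
    # A chromaticity fixes only the direction; give it Y = 1 for now.
    return (x / y, 1.0, (1.0 - x - y) / y)

def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(m, v):
    # Cramer's rule: solve m @ s = v for s.
    d = det3(m)
    return [det3([[v[i] if k == j else m[i][k] for k in range(3)]
                  for i in range(3)]) / d
            for j in range(3)]

def rgb_to_xyz_matrix(r_xy, g_xy, b_xy, white_xy):
    prims = [xy_to_xyz(*c) for c in (r_xy, g_xy, b_xy)]      # directions
    white = xy_to_xyz(*white_xy)                              # Y normalized to 1
    m = [[prims[j][i] for j in range(3)] for i in range(3)]   # columns = primaries
    s = solve3(m, white)                                      # the vector lengths
    return [[m[i][j] * s[j] for j in range(3)] for i in range(3)]

M = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60), (0.15, 0.06),
                      (0.3127, 0.3290))
```

Change the white point chromaticity and the three lengths (and hence the whole matrix) change with it, while the primaries’ directions stay put.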

The conventions are typically related to blackbody radiation, i.e. the spectra you get if you heat up an object to some specific temperature. D65, for instance, roughly corresponds to an object that has a surface temperature of 6500 Kelvin (6227°C / 11240°F) and spectrally it has quite a bit of blue.
The sun’s surface is about 5777K; a candle flame might be around 1500K.
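Those spectra come straight from Planck’s law. A quick Python check (my own sketch; the “blue” 450 nm and “red” 650 nm sample wavelengths are chosen arbitrarily) confirms that a 6500 K blackbody has proportionally much more blue in it than a 3000 K one:

```python
import math

# Planck's law: spectral radiance of a blackbody at wavelength lam (m)
# and temperature t (K). Constants: h (J*s), c (m/s), k (J/K).
H, C, K = 6.62607e-34, 2.99792e8, 1.38065e-23

def planck(lam, t):
    return (2.0 * H * C**2 / lam**5) / math.expm1(H * C / (lam * K * t))

def blue_red_ratio(t, blue=450e-9, red=650e-9):
    # How "blue" the spectrum is, relative to its red content.
    return planck(blue, t) / planck(red, t)

# A hotter blackbody emits proportionally more blue:
# blue_red_ratio(6500) comes out well above 1, blue_red_ratio(3000) well below.
```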

In this graph (note this was with Filmic) you can see a few reference temperatures marked, along with what they correspond to spectrally. The line on the leftmost edge is the `E` (equal energy) spectrum where our L/M/S cones get activated equally. If you look at the stripes of the frame, the point where the dark and light grey stripes meet for the first time (not counting the corners) is the regular blackbody radiation color. Towards the right is what happens if the colors get filtered more and more until they reveal, on the right edge, the color of the most dominant wavelength of the corresponding spectra.
Each vertical bar corresponds to a shift of 500 Kelvin.
As you can see here, the dominant color of 4000K is actually in the infrared, whereas the sun’s 5777K lands in a turquoise-ish green, and 6500K is dominated by violet. (Though the exact hues would certainly shift when using AgX, so don’t read too much into this. Beware the Notorious Six.)

EDIT: here is an AgX version

When it comes to vision, the white point is very much dynamic and subject to change basically from moment to moment. Though I think it’s actually better to talk about the achromatic point which is just the white point scaled down from Luminance `Y = 1` to the quite a bit darker `X = Y = Z` plane. It basically defines what currently is perceived as “neutral grey” - a grey that’s neither reddish nor bluish. The color that you perceive to be zero saturation.

As our eyes and brains constantly adjust to what we see, filtering our perception down to what’s meaningfully different, this achromatic point changes too: In a room lit by candles, you’ll judge a piece of paper white the same as you would in an overcast sky or in broad daylight. Therefore there is no “one true white point”.
As with just about everything else in vision (and really, in perception in general), everything is relative.

AFAIK there are limits to how far our eyes will adapt, but I’m not sure what those limits are.
Like, if you are in a room lit by teal laser light, you won’t forget that it is teal laser light you are looking at. The color will remain apparent. But there is a good range of colors (as defined by their spectra) that could be white to you (as defined by your perception).

This is one of many reasons why you have to be very careful when talking about color, to make absolutely clear what you mean. A physicist and a psychologist mean completely different things by that term. - Physicists don’t care about weird perceptual effects. They measure how much light of what kind is present and that’s it.
Psychologists must contend with myriads of complicated, often still poorly understood filtering mechanisms before they arrive at something that they consider to be a color.
Unfortunately, when we attempt to make screens form images in our minds, we have to deal with both definitions.


These articles will help. My thought is that render engines consider reflected light rather than emitted light. You may extend your studies to BRDFs and how render engines work.

https://www.waveformlighting.com/home-residential/what-is-the-difference-between-cct-and-cri