When to apply gamma correction?

Hey everyone

I’m trying to learn about linear workflow to improve my renders, but I’m uncertain about when and where gamma correction actually needs to be applied.

I’ve read through the wealth of information on Yves Poissant’s website and searched through the corresponding posts on this forum, and I’ve learnt a lot of things I knew nothing of before, so thanks a lot for such in-depth explanations of the process.

Unfortunately there are some areas I still can’t quite grasp. :frowning:

For instance… if you are only using procedural textures and shaders (which obviously haven’t been photographed) in a scene, does this still require you to use a 0.45 gamma correction node in the compositor when working on your final image?

At the moment my workflow is to set up my materials/shaders and lighting so everything appears roughly how I want when I render. If I then turn on the compositor and use a 0.45 gamma correction everything is going to appear washed out. Should I be setting up my initial render with darker materials and lighting to compensate for the gamma correction?

Also, if I have produced some ‘full render’ baked textures just from a material, will this texture require altering in photoshop/gimp the same way that a photo taken with a camera needs to be?

Sorry for the naive questions

All textures (including procedural ones) going into the renderer should (be tone corrected where needed to) have a gamma of 1.0. Then you apply a 2.2 (0.45 in the gamma node) gamma correction only to the final output.
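A minimal sketch of that rule in plain Python, assuming a simple 2.2 power curve rather than the exact sRGB transfer function (the helper names `to_linear` and `to_display` are just for illustration):

```python
def to_linear(c, gamma=2.2):
    """Undo display gamma on a 0..1 color channel before it goes to the renderer."""
    return c ** gamma

def to_display(c, gamma=2.2):
    """Gamma correct a 0..1 linear render value for display (the 0.45 node: 1/2.2)."""
    return c ** (1.0 / gamma)

# Round-tripping returns the original value:
mid = 0.5
linear = to_linear(mid)    # ~0.218 -- what the renderer should work with
back = to_display(linear)  # ~0.5   -- what ends up on screen
```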

Only if the texture is going to be used for a game (depending on the engine you use).

I’ve had a few discussions about gamma correction lately, and if one thing is clear, it is that this whole gamma correction business is not clear to a lot of people, so don’t feel bad.

For instance… if you are only using procedural textures and shaders (which obviously haven’t been photographed) in a scene, does this still require you to use a 0.45 gamma correction node in the compositor when working on your final image?

Yes. The reason for applying gamma correction to the final render is to correct lighting issues: light attenuation with distance, light falloff at terminators, and the superposition of lights and shadows. Simply think of the renderer as a virtual camera. By applying a gamma correction to your render, you are just replicating what digital cameras do with their photos. Digital cameras gamma correct their photos, so you do the same thing. The gamma correction is, indeed, 0.45, not 2.2.

Of course, since you apply a gamma correction to your final render, all your textures must be gamma compensated as a consequence. In this case, the gamma compensation is 2.2. But that consequence is not the only reason: all the colors you see on your computer monitor are gamma corrected too, so you should revert that gamma correction before using them in a renderer. Personally, I have set up an Excel spreadsheet where I can enter RGB values and get the corresponding gamma-corrected values. It is a real pain to have to do that, though.
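For what it’s worth, a small Python helper could stand in for such a spreadsheet. This is a sketch assuming a plain 2.2 power curve (the function name `display_rgb_to_linear` is invented for the example):

```python
def display_rgb_to_linear(rgb, gamma=2.2):
    """Convert a picked 0-255 display RGB triple to linear 0..1 values."""
    return tuple((v / 255.0) ** gamma for v in rgb)

# A 50% grey as picked on screen is only ~22% reflectance in linear terms:
display_rgb_to_linear((128, 128, 128))  # roughly (0.22, 0.22, 0.22)
```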

The basic principle playing here is that the computer monitor is setup so a gamma corrected photo looks good on it. That means that if the colors you see in a photo look good, then any other colors that look good on that same monitor must be considered as being gamma corrected too because they live in the same color space (the monitor color space) as the gamma corrected photo. It does not matter if the color is part of a photo, a color selection dialog or a procedural texture.

But the reverse gamma correction on the textures and colors has another very important consequence when you are using rendering techniques such as radiosity or GI. In the GI calculations, all textures and colors are taken to mean reflectances. If you do not reverse gamma correct your textures and colors, the GI render will look way too bright because the reflectances are all way too high, and thus a lot more light bounces around than it should.

At the moment my workflow is to set up my materials/shaders and lighting so everything appears roughly how I want when I render. If I then turn on the compositor and use a 0.45 gamma correction everything is going to appear washed out. Should I be setting up my initial render with darker materials and lighting to compensate for the gamma correction?

Unfortunately, yes. That is what I do in Blender and, frankly, I hate it. Until a linear workflow is an integral part of Blender, you will need to work that way or forget about linear workflow altogether. It should not be the artist’s job to take care of all that and reverse gamma correct the textures and procedural colors, etc. The application should do the reverse gamma correction on the fly whenever a hypothetical “Linear Workflow” option is selected by the user. To be fair, Blender is not the only 3D application that cannot handle a linear workflow properly. But that is quickly changing.

Also, if I have produced some ‘full render’ baked textures just from a material, will this texture require altering in photoshop/gimp the same way that a photo taken with a camera needs to be?

No. If you haven’t applied a gamma correction to the render, then you don’t need to reverse that gamma correction.

Thank you so much ypoissant for answering my specific questions with such detail. It’s much appreciated. I think I’ve almost got my head around the basic principles of gamma correction now. I shall keep re-reading in the hope it sinks in thoroughly.

With regard to correcting procedural colours… would it work if I applied a 0.45 gamma correction in the compositor before choosing the colour? Essentially working backwards I guess. If not, I’d be interested to know how you set up an Excel spreadsheet to give you gamma corrected values. Maybe I’m asking too much though.

Unfortunately, saving 8-bit/channel images in linear gamma kills quality; they end up looking like crap when converted back to display gamma. That’s the whole reason colors for display are not stored in linear gamma: the human eye just does not perceive brightness linearly, but roughly according to the gamma curve, so quantizing with only 8 bits has to respect that.

Unless the renderer has internal support for converting integer images back to linear gamma, the only way to do so without quality problems is to use float image formats in linear gamma, e.g. HDR or EXR images. But when using them for material reflectance etc., you should be careful not to have a wrong scaling where values go higher than 1.0.
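The 8-bit quantization loss is easy to demonstrate numerically. This sketch (assuming a plain 2.2 curve) encodes all 256 display levels into linear 8-bit storage and decodes them back:

```python
def roundtrip(v, gamma=2.2):
    """Store a 0-255 display level linearly in 8 bits, then convert back."""
    linear8 = round((v / 255.0) ** gamma * 255)           # linear 8-bit storage
    return round((linear8 / 255.0) ** (1 / gamma) * 255)  # back to display

levels = [roundtrip(v) for v in range(256)]
surviving = len(set(levels))  # far fewer than 256 distinct levels survive
# The darkest display values all collapse to black -- visible shadow banding.
```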

Oh, and yes, converting every color setting by hand to linear gamma so that colors are correct after setting gamma to 2.2 after rendering is definitely a pain too… another thing the renderer really should handle for you.

Can’t tell you much about compositing though…

I had to check if this could be done with pynodes — two minutes later I had this… :wink:

from Blender import Node

class GammaNode(Node.Scripted):
    def __init__(self, sockets):
        col = Node.Socket("Color", val = 4*[1.0])
        gamma = Node.Socket("Gamma", val = 2.2, min = 0.0, max = 10)
        sockets.input = [col, gamma]
        sockets.output = [col]

    def __call__(self):
        # Raise each channel to the given power (2.2 reverses the display gamma).
        self.output.Color = map(lambda x: x ** self.input.Gamma, self.input.Color)

__node__ = GammaNode

I’ve never used pynodes, so I’m not sure how to use your script N30N. If you could give a little explanation it would be really useful.

Thanks a lot

It’s ok. I think I have it working now.

Thanks again for the script N30N

EDIT: However, from reading a previous thread on this forum started by Orinoco, I suspect that this may not be a good solution as it affects the shading as well, which is undesirable. I may be wrong though.

Attachments


Perhaps I should have set it up the way you can see in the example at the bottom?

Attachments


Thanks to Yves Poissant and all the other experts contributing to these discussions for bringing this issue to the community’s attention; you thus raise the quality of our output. This thread is about to clear (I think) the last obscure point before I switch to a linear workflow:
To me, it would (now) seem logical to use the bottom set of nodes in the above picture if you just picked colors while looking at a non-gamma-corrected output or the interface. But now that we are supposed to have a 0.45 gamma node applied in the compositor, picking your colors by looking at the final render through trial and error should be enough, right?

thanks yves! wiki updated http://wiki.blender.org/index.php/Manual/Compositing_Nodes_Color#Gamma

Yes. At least, with this method, we don’t have to gamma correct a priori (with an Excel spreadsheet for example, or worse, with a calculator).

Still, the material previews will look too dark, as will the shaded display in the 3D views. So the color selection in the color palette will look right, but the material preview itself will look too dark, making it difficult to balance the different nodes of a complex material. Ideally, the reverse gamma correction on the colors (textures or materials) should be applied just before passing them to the renderer, or the material preview and the 3D shaded view should themselves be gamma corrected. Can anyone think of a python script for doing one or the other?

Thanks PapaSmurf, I corrected a little error in the wiki post: replaced an occurrence of “radiosity” with “reflectance”. Reflectance is what tells how much light should be reflected from a given surface.

In Blender, we gamma correct the render with a gamma node before outputting to an 8-bit format.

But when using them for material reflectance etc. you should be careful to not have a wrong scaling where values go higher than 1.0.

As a rule, after reverse gamma correcting the colors, I also add a scaling and translation to make sure the resulting reflectance values are not lower than 0.031 and not higher than 0.9. Those values are rules of thumb that I determined from observing measured reflectances of the Gretag Macbeth color charts used in photography. I find this conversion gives me the most realistic GI scenes. So:
reflectance = pow(color, 2.2) * (0.9 - 0.031) + 0.031;
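In plain Python, that rule of thumb could be sketched like this (the function name `color_to_reflectance` is invented for the example; the bounds default to the rough values quoted above):

```python
def color_to_reflectance(c, gamma=2.2, ref_low=0.031, ref_high=0.9):
    """Map a 0..1 display color channel to a plausible linear reflectance."""
    return c ** gamma * (ref_high - ref_low) + ref_low

# Even pure black and pure white map to physically plausible reflectances:
color_to_reflectance(0.0)  # 0.031 -- no real surface absorbs everything
color_to_reflectance(1.0)  # 0.9   -- no real surface reflects everything
```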

The problem is not the output, but the input images used for materials etc. Storing 8-bit images in linear gamma is a bad choice, and I don’t see any feature in Blender to make them linear for rendering so that colors look correct after rendering and gamma correction.

Oh. Sorry. I misread. Yeah, you are right. And you are right too that reversing the gamma from an 8-bit image and storing the result back in another 8-bit image will incur loss of image data through quantization. The best approach, when doing the reverse gamma correction in a paint application, is to store the result in at least a 16-bit file format or, better, a floating point format such as OpenEXR.

I was going to add a “plausible reflectance” button originally, but the API does not currently allow buttons so I didn’t bother (thinking you could just use the RGB curve). For those who don’t know enough python to add it themselves…

from Blender import Node

class GammaCorrection(Node.Scripted):
    def __init__(self, sockets):
        col = Node.Socket("Color", val = 4*[1.0])
        gamma = Node.Socket("Gamma", val = 2.2, min = 0.0, max = 10)
        refLow = Node.Socket("RefLow", val = 0.03, min = 0.0, max = 1)
        refHigh = Node.Socket("RefHigh", val = 0.9, min = 0.0, max = 1)
        sockets.input = [col, gamma, refLow, refHigh]
        sockets.output = [col]

    def __call__(self):
        # reflectance = color ** gamma, rescaled into the [RefLow, RefHigh] range
        processor = lambda x: x ** self.input.Gamma * (self.input.RefHigh - self.input.RefLow) + self.input.RefLow
        self.output.Color = map(processor, self.input.Color)

__node__ = GammaCorrection

What’s wrong with the pynode I posted? It works on images, you know. Alternatively it can be done with a bunch of math nodes (see attached images).

Attachments



I just did a quick test and it’s possible to enable the pynode only at render time.

from Blender import Node, Registry, Text, Scene

class GammaCorrection(Node.Scripted):
	def __init__(self, sockets):
		col = Node.Socket("Color", val = 4*[1.0])
		gamma = Node.Socket("Gamma", val = 2.2, min = 0, max = 10)
		refLow = Node.Socket("RefLow", val = 0.03, min = 0, max = 1)
		refHigh = Node.Socket("RefHigh", val = 0.9, min = 0, max = 1)
		render = Node.Socket("RenderOnly", val = 1, min = 0, max = 1)
		sockets.input = [col, gamma, refLow, refHigh, render]
		sockets.output = [col]

		sce = Scene.getCurrent()
		txt = """
from Blender import Registry

if Registry.GetKey("gammaNode"):
	Registry.RemoveKey("gammaNode")
else:
	Registry.SetKey("gammaNode", {"rendering": True})
"""
		try:
			Text.Get("renderGamma")
		except:
			Text.New("renderGamma").write(txt)
		
		if not "renderGamma" in sce.getScriptLinks("Render"):
			sce.addScriptLink("renderGamma", "Render")

	def __call__(self):
		# reflectance = color ** gamma, rescaled into the [RefLow, RefHigh] range
		processor = lambda x: x ** self.input.Gamma * (self.input.RefHigh - self.input.RefLow) + self.input.RefLow

		if not self.input.RenderOnly or Registry.GetKey("gammaNode"):
			self.output.Color = map(processor, self.input.Color)
		else:
			self.output.Color = self.input.Color

__node__ = GammaCorrection

Thanks a lot N30N for the scripts. They are really helpful. Would it be possible to make a couple of screen grabs showing how to set up the nodes in your last post, as I’m not advanced enough to know how to do it correctly.

Also, what is the best way to save scripts for use in a blend file? At the moment I’m just copying the text from here and saving it in a .txt file and then loading that file into a blender text window when I need to run the script. Is there a better way of doing it?

Thanks

Hey! That’s really cool! Thanks a lot.