Normalize any value to 0-1

Hi, I need to be able to map any value to 0-1 in Cycles nodes, in order to feed ramps and mix nodes, for example.
We have a Normalize node in the compositor that does exactly this job, but there is none in Cycles.

The math should be:
X = X + (0 - Y), where Y is the lowest value in the range
(now I have a positive range starting from 0)
then
X = X * (1 / Z), where Z is the highest value in the now-shifted range.
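
In plain Python that would be something like this (the function name and arguments are just illustrative, not anything built into Blender):

def normalize(x, lo, hi):
    # Shift so the range starts at 0: X = X + (0 - Y)
    x = x - lo
    # Scale so the top of the shifted range becomes 1: X = X * (1 / Z), with Z = hi - lo
    return x * (1.0 / (hi - lo))

# e.g. with pixel values from 0.1 to 0.8:
# normalize(0.1, 0.1, 0.8) -> 0.0, normalize(0.8, 0.1, 0.8) -> 1.0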

The problem is, how do I find the min/max values of a texture? The Math node is no use, since it only compares its two inputs.
Let's say I have an image in which pixel values range from 0.1 to 0.8. I just need two values: the darkest (Y) and the brightest (Z) pixel.
Is there a way to find them?

edit:
contextually, this can be seen as a feature request: bring Normalize node into Cycles!!


The problem is that we cannot know the full range of samples until the render is finished. :frowning:
If for example after 1000 samples going from 0 to 1, we get a sample of 10, then we could assume the range is from 0 to 10. But then if right after we get 100 or -10, we need to reset the range limits… and so on.

Unless we clamp the values, I don't see any other way.

You could always apply an S-curve function,
but this might change some values depending on the max value you use!
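
For example, a logistic sigmoid (just one possible S-curve, sketched in Python) squashes any input into the 0-1 range, but non-linearly:

import math

def s_curve(x, steepness=1.0):
    # Logistic sigmoid: maps (-inf, +inf) into (0, 1), but non-linearly,
    # so relative differences between values are not preserved.
    return 1.0 / (1.0 + math.exp(-steepness * x))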

happy bl

Maybe reading the node graph from left to right at the beginning could do the job?

I'm using the node group below to normalize "any" value. As Secrop said, during render you don't know what the min and max will be, so preview the value with preview activated and adjust the min and max output until you get only a grey value, then turn off preview. With preview activated it shows black for values below zero and white for values above one. I made it mainly for Musgrave, because it can turn out some rather extreme values if you're not careful, but I sometimes use it when I lose track of the maths as well.
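
In plain Python the logic is roughly like this (just a sketch of the idea, not the actual node group; the names are illustrative):

def normalize_with_preview(value, range_min, range_max, preview=False):
    # Remap [range_min, range_max] to [0, 1]
    t = (value - range_min) / (range_max - range_min)
    if not preview:
        return t
    # Preview mode: flag clipping so you can tune range_min/range_max by eye.
    if t < 0.0:
        return 0.0   # shown as black: value fell below the chosen min
    if t > 1.0:
        return 1.0   # shown as white: value exceeded the chosen max
    return 0.5       # flat grey: value is inside the chosen range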

It has to be done with python (or C)… and it’s not an easy algorithm either, although doable. :thinking:

The only problem is that you'd get the theoretical limits of some value but not the exact min/max values that are being used (though the min/max values are inside those limits).

Getting the min/max values of a channel of an image is something you have to do by looking at the first pixel, then the second pixel, and so on through the last pixel. You really don't want to be doing that in a Cycles node graph for every single sample. (It is technically possible, but the absence of any kind of loop structure in the nodes would make it very, very tedious.)

I’m also having trouble seeing where it would be useful, outside of some tone mapping stuff that’s better done in compositing anyways.

Really, if you’re not using the full, 0 to 1 range of your textures to begin with, you’re losing potential color depth. It wouldn’t be crazy to auto-normalize all of your images in something like GIMP. Write a 0.5 pixel someplace out of the way to calibrate and use that to remap your image values. Then, you know exactly what the range of every single image is: it’s 0-1. Untested for color data, but it’s basically what I do with things like roughness maps: paint them in the complete 0,1 range and remap that in nodes.

I'm using my normalizer node frequently; I find it very useful. Even if Musgrave (even its default values clip at both ends) is great for bump maps on its own, where clipping isn't a thing, it becomes very important to normalize it if you want to have control over its values down the chain. E.g. trying to smoothly flatten its top using a contrast group node (Perlin version, not the built-in contrast node) rather than clipping it for a sharp transition. Plus some nodes clip or do completely unexpected things if the data is not in the 0-1 range.

Example of Musgrave going out of control - something it can easily do when experimenting with its not-so-intuitive settings (I used object coords on a floor plane; a bpy sketch of these settings follows the list):
Type: Ridged Multifractal
Scale: 5
Detail: 16
Dimension: 7
Lacunarity: 0.99
Offset: 1.9
Gain: 1
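
If you want to reproduce those settings from a script, something along these lines should do it in Blender 2.8x (the material name is just a placeholder for any material that uses nodes; node and input names are as exposed in 2.8x):

import bpy

mat = bpy.data.materials['Material']      # placeholder: any material with use_nodes enabled
nodes = mat.node_tree.nodes
links = mat.node_tree.links

tex = nodes.new('ShaderNodeTexMusgrave')  # Musgrave Texture node
tex.musgrave_type = 'RIDGED_MULTIFRACTAL'
tex.inputs['Scale'].default_value = 5.0
tex.inputs['Detail'].default_value = 16.0
tex.inputs['Dimension'].default_value = 7.0
tex.inputs['Lacunarity'].default_value = 0.99
tex.inputs['Offset'].default_value = 1.9
tex.inputs['Gain'].default_value = 1.0

coord = nodes.new('ShaderNodeTexCoord')   # object coordinates, as in the example
links.new(coord.outputs['Object'], tex.inputs['Vector'])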

Use that as a diffuse shader color or add contrast to it. And people make tutorials hooking this generator directly up to a shader's color input? Yeah, I know this is a rather extreme example. Using the normalizer, I find the values at approx 40 and 104; turning off preview, I now have something useful (and very unique - I used object coords for a reason, a reason I have not seen any tutorial mention).

I don't see the purpose of normalizing an existing albedo image, although you could use the same node group. Doing it in GIMP you would just introduce stepping (as would the node group, of course), so you should probably scale it up first and then apply a blur, probably losing some detail in the process. For non-color textures within the legal range, the color ramp will suffice. Only if doing spatial modifications (more contrast in this area, controlled by another input) would you need something normalized.

I didn't say that remapping a value isn't useful-- I gave an example of a place where I regularly remap values. My "remap" node group is my most re-used node group out of all that I've created. If I could encourage the devs to add any node to out-of-the-box Cycles, it would be a linear remap value node. (Maybe it would be smarter of them to rework their curves node instead: make it more intuitive outside the 0-1 range/domain, allow handle coord inputs, and don't make it so explicitly RGB or vector when it really doesn't matter; after all, it's all just numbers.) And normalizing a value with known(ish) limits is just remapping values from min,max to 0,1.

But identifying the min and max values of an image-- that's where I'm having trouble seeing the point. You're not doing that in your Musgrave example. Maybe on .hdrs? But even then, they're typically already normalized, at least roughly.

What if you normalize the input in advance, and then potentially re-scale it back to its original range if needed when rendering? That at least moves the problem around.

Yes, normalize/remap, very similar. So far I’m mostly remapping to 0-1 for previewable (viewer node) values, but I guess remap to ±pi for trig or 1-something for log could be useful. But the preview clipping thing is very important. Although I guess that could be a mutable separate node group. Also, I think clip highlighting should be available in image editor and compositor as well.

I am doing it to the musgrave example. With those values, I use the normalize node I posted above with the preview slider set to 1, and tweaked the numbers until I got no clip shown at 40 (lowest value) and 104 (highest value). Would be same as remap 40 to 0 and 104 to 1. But I wouldn’t be able to “find” those values without clip preview (black = less than zero, white = above one, fixed grey = between zero and one). Previewing that node group with preview enabled, you can tweak those numbers until everything is fixed grey.

Curves could be improved, yes. But it couldn’t possibly replace the ease of use of a builtin remap function, preferably with clipping preview mode (possibly as a separate output - I have several utility node groups with preview functionality, ref some utility nodes).

I don’t know about HDRs, but they sure tend to be brighter than white, otherwise what would be the point? :smiley: You wouldn’t remap or normalize an HDR, but as a quick hack for clipped sun you can remap a top range of it to get sunlight like strengths from it. Mostly as a temporary preview as the “sun disk” would be the full corona (everything within is clipped). And that will not be for all HDRs.

Was remap part of the GSoC node upgrade that may come? Can't remember.

I just realized this is a thread from the grave.

Lol, yeah, but what’s it matter…

My own remap looks pretty different than yours. Yes, when I need to know the range of something, I’m generally doing a greater than->emission or something like that. But I don’t think musgrave actually has any hard ceiling. Maybe I’m wrong, but I think higher values are just increasingly uncommon. So literally grabbing the maximum of the musgrave function over its entire domain would, I think, just give you infinity. Well, in math world, not necessarily in 32-bit-float computer world.

My remap has inputs for value, sourceMin, sourceMax, destMin, and destMax. Yeah, 90% of the time, I’m remapping from 0,1 or to 0,1. But also, -1,1 for dot/sin/cos, 0 to 2*pi for radians, stuff like that. Sometimes weird numbers like 3^0.5, for mapping UVs into a space appropriate for creating hexagons procedurally. And sometimes even for variable source/destination min/maxes, which is occasionally pretty cool, although I’m not sure of an example. Of course, all of this stuff is doable with math nodes-- that’s how I made the group-- but thinking about things in terms of remaps is a lot more intuitive for me than thinking in terms of, okay, add this, now multiply by this…
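
The math inside such a group boils down to one line; here it is sketched in Python rather than nodes, with the same input names (illustrative only, not the node group itself):

def remap(value, sourceMin, sourceMax, destMin, destMax):
    # Normalize relative to the source range, then scale/shift into the
    # destination range. Purely linear, no clamping.
    t = (value - sourceMin) / (sourceMax - sourceMin)
    return destMin + t * (destMax - destMin)

# e.g. remap(0.5, -1.0, 1.0, 0.0, 1.0) -> 0.75 (a dot product mapped into 0-1)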

I don’t really know the best way to get inputs onto things that don’t have them-- the UI of it all. That’s an issue for curve handles, but also for mapping nodes (where I also tend to use my own separate rotation node groups, and do my scales with wrong-feeling mixRGB multiplies and my translates with vector adds, just so I can use variable inputs.) But if somebody has a good UI solution, a remap is just an RGB curves node: f(sourceMin)=destMin, f(sourceMax)=destMax, f(x0)=y0 and f(x1)=y1, with the added advantage that it can be a linear remap if you want but it doesn’t have to be linear. Yeah, a clamp option, including clamp range, would be nice too, but that’s a pretty easy node group to make. I don’t think a preview range is essential-- most of my remaps, I know exactly what range of inputs I can expect, so I know exactly what range of outputs I can expect.

The HDRs that I’ve looked at, looking at emission for value>1.0 or R>1.0 or whatever, are pretty much black :slight_smile: It looks like what happens is that people rely on the increased color depth to encode more into the lower end of the values, rather than expanding into the >1.0 range. It doesn’t seem like there’s any kind of standardized units used, like 1.0 = 1000 nits. Although, yes, sometimes a little bit of >1.0 range is used. And I haven’t done an exhaustive study or anything, just what I’ve noticed when I’ve bothered looking.
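
If you'd rather check an HDR numerically than through an emission setup, a quick sketch with numpy (the image name is a placeholder):

import bpy
import numpy as np

px = np.array(bpy.data.images['YourHDR.hdr'].pixels)   # flat RGBA float list
rgb = px.reshape(-1, 4)[:, :3]                          # drop the alpha channel
print("max channel value:", rgb.max())
print("fraction of channel values above 1.0:", (rgb > 1.0).mean())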

I guess I can see how knowing a literal min/max might be useful. It is useful for debugging and learning, after all.

There’s a min() and max() function that can be used for this…

import bpy
import numpy as np

# image.pixels is a flat list of RGBA floats, so every 4th element belongs to the same channel
imagedata = np.array(bpy.data.images['YourImage'].pixels)

max_red = max(imagedata[::4])
min_red = min(imagedata[::4])
max_green = max(imagedata[1::4])
.....
max_alpha = max(imagedata[3::4])
min_alpha = min(imagedata[3::4])

Then it's just a matter of applying the node functions to these limits… for example, if you add [0.5, 0, 0] to the image color, the resulting min/max of the red channel will be r_min, r_max = min_red + 0.5, max_red + 0.5.
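
For example, to wire those limits into the normalization from the top of the thread (subtract the min, then divide by max minus min), something like this should work, continuing from the snippet above (the material name is a placeholder):

mat = bpy.data.materials['Material']             # placeholder material, must use nodes
nodes = mat.node_tree.nodes
links = mat.node_tree.links

sub = nodes.new('ShaderNodeMath')
sub.operation = 'SUBTRACT'
sub.inputs[1].default_value = min_red            # Y, the lowest value found above

div = nodes.new('ShaderNodeMath')
div.operation = 'DIVIDE'
div.inputs[1].default_value = max_red - min_red  # Z, the highest value after the shift

links.new(sub.outputs['Value'], div.inputs[0])
# link the image texture's red channel into sub.inputs[0]; div's output is now in 0-1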

Yeah, the remap is in there.. I really really REALLY hope this makes it. If there was ever a thread that could make me cry, this is it :smiley:

Thanks, that might be handy to know. (Thought of one potential use, which is normalizing depth maps, although it’s probably still something I’d do in compositing.)

If all you need is the min and max value of a texture, you could do this with a Math node.

In Blender 2.8, the Math Node comes with some new operations. It includes Floor and Ceil operations which give you the min and max value respectively. When you use these operations, it doesn’t matter what value you have on the second input. It always operates on the first input.

Similarly, 'Sine' and a few other trigonometric functions ignore the second input, so changing its slider has no effect on the output. In the GSoC 2019 Cycles Procedural branch, 'dynamic socket hiding' is being implemented, so whenever you use an operation that requires only one input, the second input socket is automatically hidden.

And by the way, if you want to normalize any set of values to the 0-1 range, you could just plug it into a Color Ramp node. The output of a Color Ramp is by default from 0 to 1, so you wouldn't need a Normalize node for this. If you want to remap values to a custom range, you could use the Remap node in the above-mentioned GSoC Cycles Procedural branch.

Nope! The Color Ramp node is one of the most misused nodes in tutorials. In its default linear mode, people drag the left and right sliders to "increase contrast", when all that happens is that the details get clipped away. There are better ways to do contrast, although the contrast node is not one of them, as it has its own problems. I'm using bias and gain, but that requires normalized values.
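
One common formulation of bias and gain (Schlick's approximations, which indeed assume the input is already normalized to 0-1) looks roughly like this, sketched in Python:

def bias(t, b):
    # Schlick's bias: identity at b = 0.5, pushes values up for b > 0.5.
    return t / ((1.0 / b - 2.0) * (1.0 - t) + 1.0)

def gain(t, g):
    # Schlick's gain: two mirrored biases around 0.5; identity at g = 0.5.
    if t < 0.5:
        return bias(2.0 * t, g) / 2.0
    return 1.0 - bias(2.0 - 2.0 * t, g) / 2.0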

Try the Musgrave settings I showed further up. I'm betting you can't get anything useful from it using a ramp. You can change a color to high numbers but not negative ones, and there is no way of knowing that the white is actually 50,50,50, for instance. Properly normalized, I can apply bias and gain.

Example of default Musgrave settings (only scale increased), which, due to its values going below zero and above one, will clip just from passing it through a color ramp:

Yes, you can achieve this using curves, but curves won't let you modulate points on a per-pixel basis, which can be useful if you want to vary the mortar on a brick texture, for instance. Edit: Not strictly true, I guess; you can modulate the fac, same thing. For modulating more pointwise I use smoothstep/smootherstep, and that you cannot do with curves alone.
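
For reference, the standard smoothstep/smootherstep formulas, sketched in Python (easy to rebuild from math nodes):

def smoothstep(edge0, edge1, x):
    # Hermite interpolation: 0 at edge0, 1 at edge1, zero slope at both ends.
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def smootherstep(edge0, edge1, x):
    # Perlin's variant: zero first and second derivatives at both ends.
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * t * (t * (t * 6.0 - 15.0) + 10.0)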
