What is Vector Normalize?

I am rendering a brush texture to be used in Sculpt Mode to detail the character. While configuring the node editor I wanted to know: what is the Normalize node used for? What is the function of this normalize?

A normalized vector is a vector that has the same direction and orientation as the original, but a length of one. Basically, you take every component of the vector and divide it by the vector’s length.
As for usage: when you have two normalized vectors, their dot product equals the cosine of the angle between them. It’s a very basic calculation that rendering engines perform all the time.
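Both ideas fit in a few lines of plain Python (a sketch with my own function names, not any particular engine’s API):

```python
import math

def normalize(v):
    """Divide each component by the vector's length, giving a unit vector."""
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

def dot(a, b):
    """Sum of component-wise products."""
    return sum(x * y for x, y in zip(a, b))

# Two normalized vectors: their dot product is the cosine of the angle between them.
a = normalize([3.0, 4.0, 0.0])   # -> [0.6, 0.8, 0.0], length 1.0
b = normalize([0.0, 5.0, 0.0])   # -> [0.0, 1.0, 0.0]
cos_angle = dot(a, b)            # 0.8
angle_deg = math.degrees(math.acos(cos_angle))
```

This is exactly the calculation behind diffuse shading: the dot product of a normalized surface normal and a normalized light direction gives how directly the light hits the surface.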

It divides each number in an array of numbers by the biggest value inside that array.
[10, 40, 75, 100], divided by 100, becomes [0.1, 0.4, 0.75, 1.0].
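As a quick sketch of that arithmetic, with a second variant alongside it: as described further down the thread, the compositor’s Normalize node actually remaps the minimum…maximum range to 0…1, which only coincides with divide-by-max when the minimum is 0.

```python
def divide_by_max(values):
    """Scale so the biggest value becomes 1.0 (what this post describes)."""
    peak = max(values)
    return [v / peak for v in values]

def remap_min_max(values):
    """Remap the darkest..brightest range to 0..1."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

divide_by_max([10, 40, 75, 100])   # [0.1, 0.4, 0.75, 1.0]
remap_min_max([10, 40, 75, 100])   # [0.0, 0.333..., 0.722..., 1.0]
```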

Let me explain my doubt. I have a setup in the node editor: Render Layers → Normalize → Invert → Color Ramp (RGB, linear, from black to white) → Composite, with that sequence of links rendering out a PNG image. How does Normalize act in all of this? Could you explain it to me in a simpler way?

. . . “so that …” the resulting vectors can be meaningfully compared or mathematically processed.

For instance, consider the statistical concept of a “standard deviation.” If you were presented with two data-sets, one consisting of the weights of highway trucks (more-or-less 80,000 pounds each), and the second consisting of the weights of golf-carts, it would be impossible to meaningfully compare the distribution of their weights. (“Truck #1” might weigh 18,500 pounds more than “Truck #2.” Whereas, “Golf-Cart #1” might weigh 28 pounds more than “Cart #2.”)

So, what can you do? “Standardize the deviations.” Replace the raw-number that the scale gave you with a calculated number that can be compared. You have now created a synthetic value that you can work with: a standardized expression of each value’s deviation within the distribution from whence it came (“trucks”), specifically engineered to be comparable to values computed from other distributions (“golf carts”).
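A tiny sketch of that standardization (the classic z-score), with made-up weights:

```python
import statistics

def z_scores(values):
    """Express each value as deviations from its own distribution's mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    return [(v - mean) / stdev for v in values]

trucks = [80000, 98500, 76000, 81500]   # pounds, hypothetical
carts  = [610, 638, 595, 621]           # pounds, hypothetical

# Raw differences are incomparable (18,500 lb vs. 28 lb), but z-scores live
# on one common scale, so "how unusual is this truck within its distribution?"
# can be compared with "how unusual is this cart within its distribution?".
z_scores(trucks)
z_scores(carts)
```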

Likewise, a “standardized” vector. Whether the original vector was 80,000 units long or 80, the length of the normalized vector now occupies a consistent, agreed-upon numeric range. (Obviously, my analogy to “standard deviation” no longer holds, but the essential concept does: “you can’t mix apples and oranges, therefore sometimes you must contrive a suitable form of mathematical fruit.”)

Mathematically speaking, all of the now-normalized vectors have one thing in common: their length is known to be “1.0.”

The Normalize node will look for the darkest pixel value and the brightest in the image, then it will remap all the values from 0 to 1.
It’s different from a Clamp, which simply cuts off values above 1.
You can use it to remap the Z pass, which generally has very high values, so it looks totally white in the viewer; with a Normalize node everything is remapped and you can actually see the depth.
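To make the clamp/normalize difference concrete, here is a plain-Python sketch with hypothetical depth values (a real Z pass is a per-pixel image, but the arithmetic per value is the same):

```python
def clamp01(values):
    """Clamp: anything above 1 is cut to 1 -- all detail above 1 is lost."""
    return [min(max(v, 0.0), 1.0) for v in values]

def normalize_pass(values):
    """Normalize: remap darkest..brightest to 0..1, keeping relative depth."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

depth = [12.0, 55.0, 310.0, 990.0]   # hypothetical Z-pass samples, all > 1

clamp01(depth)         # [1.0, 1.0, 1.0, 1.0]  -> solid white in the viewer
normalize_pass(depth)  # a gradient from 0.0 to 1.0 -> visible depth
```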

Hope that makes sense !

And of course it’s used in vector maths, but you don’t need to understand these to understand what it does in the compositor.

Good explanation. Even though the values might not have come from the same distribution … “highway trucks,” “golf carts” … the standardized values will fall within a consistent mathematical range: [0.0 … 1.0].

They will have been reduced to fit this range, proportionately. (Which necessarily means that a distribution which was initially “very wide” got squeezed a lot, while one that was “always narrow” got squeezed very little.)

(Also note, I believe, that if the initial range of values was smaller than [0.0 … 1.0], it will be enlarged. It’s a “Procrustean bed.” https://en.wikipedia.org/wiki/Procrustes.)

But now, if you want to, say, “multiply something” by one of these values, you know that the multiplier will be in this inclusive range: [0.0 … 1.0].