It should be possible by connecting an Image Texture node to a Vector Math node set to Normalize. However, it doesn’t work.

Normalize should scale the input to 0-1 values, which would be ideal for GPU autolevels. It could save a lot of time if I could get it to work.

Normalizing vectors doesn’t do what you expect with colors. The function divides the input vector by its length (magnitude), so that the result is a unit vector. With colors the result is meaningless, since the RGB values don’t represent any length… Colors don’t have lengths, and whatever comes out of the normalization will only have the following characteristic: sqrt(R*R + G*G + B*B) = 1.

So, how do I calculate the length for a value?

You need to define its scale. A value of 1.5 can mean 0.5 in the interval [0, 3], so you need to know the limits before performing any conversion. Normally, you have to read every pixel in the texture, store the lowest and highest values, and then perform the conversion.
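The two-pass idea above can be sketched in a few lines of Python; the function name and the flat list of pixel values are my own assumptions for illustration:

```python
def autolevels(pixels):
    """Remap a flat list of values to [0, 1] using the observed
    minimum and maximum (the two-pass approach: scan for limits,
    then convert). Hypothetical helper, not a Blender node."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        # A constant image has no range to stretch.
        return [0.0 for _ in pixels]
    return [(p - lo) / (hi - lo) for p in pixels]

# The 1.5-in-[0, 3] example from above:
print(autolevels([0.0, 1.5, 3.0]))  # -> [0.0, 0.5, 1.0]
```

The same value 1.5 would map to something else entirely if the observed limits were different, which is the whole point: the conversion is undefined until you know the limits.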

But for that I would need millions of nodes! I can compute the average value by scaling, but the lowest and highest… Is there any easier way?

Well, maybe not millions. Say: scale the texture up by 2, shift it by a half, take the minimum (or maximum) of the first and second copies, and repeat. Each step halves the data, so you only need about log2(n) steps.
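The repeated halve-and-combine trick can be sketched as a pairwise reduction in Python (my own illustrative function, not an actual node setup):

```python
def reduce_pairwise(values, op):
    """Repeatedly combine the first half of the list with the
    second half using op (min or max), keeping any unpaired
    leftover, until one value remains. A sketch of the
    'shift, min/max, repeat' trick: roughly log2(n) passes
    instead of one node per pixel."""
    while len(values) > 1:
        half = len(values) // 2
        tail = values[half * 2:]  # leftover element when length is odd
        values = [op(a, b)
                  for a, b in zip(values[:half], values[half:half * 2])]
        values += tail
    return values[0]

pixels = [3.0, 1.0, 4.0, 1.0, 5.0]
print(reduce_pairwise(pixels, max))  # -> 5.0
print(reduce_pairwise(pixels, min))  # -> 1.0
```

In node terms, each pass corresponds to overlaying the texture with a shifted copy of itself and keeping the per-pixel minimum or maximum, so the node count grows with the number of passes, not with the number of pixels.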

I think you’re not understanding the problem itself. The problem is the same whether you use 1 node or ‘millions’…

If you don’t have your set of end values, you cannot find its limits. To do that, you need either calculus to analyze your equation, or to build a large set of results from the equation.

So my advice is: take a bit of time to read some books on calculus.