Cycles Development Updates

Yep - that is my suspicion too. Interestingly the RGB curves node doesn’t appear to clamp the input value.

If the colour ramp node didn’t clamp the input value - in linear mode it would in fact act just like a math ‘normalise’ function.

Yes, could be. I hadn’t noticed this colour ramp behaviour before. The best option would be to make the clamp optional, like in the other nodes where we have the choice between clamped and unclamped.
Or, as said, a normalize math node would be nice for every case.

Normalization seems like it would be a complicated function for a shader. @brecht, do you have any info on the feasibility of this?

I’ll look into making the output of the voronoi texture normalized.

Adding an automatic normalize operation is not possible to do efficiently, there is no way to know what the minimum/maximum values of the node network linked into it will be.


Perhaps add a lerp node and leave it up to the user to select appropriate min/max values? Sure, you can do it right now with a couple of math nodes, but it’s not super intuitive for artists.


Yes, we could support the Map Range compositing node for shading.


Please do. Anything to reduce the number of node groups we need for trivial things is a positive!


What about treating the data like bitmaps/images etc.? Their datablocks could be used for the min/max and the function.
Something simple like this?

In theory you could feed in every datablock you want: first get the min and max values from the datablock, and then run the function.

But when path tracing, ‘every datablock’ is millions of paths, bouncing millions of times, shaded by any possible node tree. You’d essentially have to render the entire scene, then do the normalization, and even then it wouldn’t work, since you would need the shaders that use that normalization function to contribute to the scene itself.

As far as I understand it, normalizing is functionally impossible in a path tracer shader tree.

Hm, not sure about this.

I have another idea: a normalize function with input value slots, like in the other math nodes, where you could put in the min and max values you want yourself. This way you don’t need a datablock and you could normalize right now, with the simple formula.

You only have to get the values with the Node Wrangler preview etc.
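For reference, the “simple formula” with user-supplied min/max is the usual min/max normalization. A minimal Python sketch (the function name is mine, not anything in Blender):

```python
def normalize(value, vmin, vmax):
    """Map value from [vmin, vmax] to [0, 1] - the 'simple formula'."""
    return (value - vmin) / (vmax - vmin)

# A value of 5 in a known range of 0-10 maps to 0.5:
print(normalize(5.0, 0.0, 10.0))  # 0.5
```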

Sounds like @Brecht is aware of it. This is the Map Range node he referenced in post #300 (Cycles Development Updates):

You can set the max and min, for both the input and the output.
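In linear interpolation mode, what the Map Range node computes amounts to this (a sketch of the formula, not the actual Blender code; the optional clamp is my addition to mirror the clamped/unclamped discussion above):

```python
def map_range(value, from_min, from_max, to_min, to_max, clamp=False):
    """Remap value from [from_min, from_max] to [to_min, to_max]."""
    t = (value - from_min) / (from_max - from_min)
    if clamp:
        t = min(max(t, 0.0), 1.0)  # optionally clamp to the output range
    return to_min + t * (to_max - to_min)

# Remap 50 from [0, 100] into [-1, 1]:
print(map_range(50.0, 0.0, 100.0, -1.0, 1.0))  # 0.0
```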


Yes, nice. This should work for most cases, if you know the min/max values.


Brecht, I noticed that when I try to mix two SSS shaders (Random Walk with the other one) the end result tends to be too dark, much darker than either shader separately or their average. Is it a bug, or a to-do thing?

Mixing the shaders will work properly if you use the Branched Path Tracing instead.

Though Brecht wants to unify the integrators eventually; when that happens, it should hopefully not include that energy-loss issue (which also occurs when two Random Walk shader nodes are mixed together).


Thanks for the tip. Will it still work with one sample per branch? While I use BPT quite often (it converges faster with fewer samples per pixel), it tends to slow down a lot as the scene gets very complex. My Seahorse scene won’t even start rendering with BPT; it gives a CUDA error right away.

Surely that doesn’t matter - at least for value or colour inputs.

All you do is take the maximum input value (x) - perform the calculation y=1/x - then multiply the entire input by y.

You then end up with all input values normalised in the range 0-1.
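As a sketch of that procedure (note it only lands everything in 0-1 if the minimum input is at least 0; negative inputs would stay negative):

```python
def scale_by_max(values):
    """Scale values so the maximum becomes 1.0, i.e. multiply by 1/max.
    Only yields the 0-1 range if the minimum value is >= 0."""
    peak = max(values)       # the maximum input value (x)
    factor = 1.0 / peak      # y = 1/x
    return [v * factor for v in values]  # multiply the entire input by y

print(scale_by_max([0.5, 2.0, 4.0]))  # [0.125, 0.5, 1.0]
```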

I’m pretty sure it wouldn’t be that easy for any type of data that isn’t an image.

In an image file, all of the values are fixed and you can find the brightest pixel. Procedural textures, meanwhile, are generated by math and their resolution is infinite. On top of that, the data could have been operated on and mixed with other data before it gets to any normalize node.

I’m pretty sure we can count on Brecht to have enough in-depth knowledge to determine the viability of things.


In my own custom normalize I have measured min, measured max, and preview on/off (I don’t have output scaler, but would probably be nice). When preview is enabled, output < 0 is black, output > 1 is white, and anything between is gray. Preview is very important when tweaking or even when bugfixing a complex node setup.
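One possible reading of that preview mode (my interpretation of the description, not the actual node group):

```python
def preview(value):
    """Debug preview: flag out-of-range values so they stand out."""
    if value < 0.0:
        return 0.0   # below range -> pure black
    if value > 1.0:
        return 1.0   # above range -> pure white
    return 0.5       # in range -> flat gray
```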

Whilst that is true - a procedural texture ultimately outputs either colour data or a fac.

At the point it does this you know what the output value is, and hence can normalise it prior to passing it to another node (or you could pass it through a node that performs the normalisation - like the math or colour ramp nodes).

That would be good as it would make it backwards compatible with the old Voronoi node.

However another solution would be to remove the apparent clamping that is happening in the colour ramp node (or make it a tick box option like in other nodes).

This might actually be more useful since it’s not intuitive that the colour ramp node should clamp the input value. The fact that it appears to would also affect other inputs - some of which might legitimately be outside the 0-1 range.

Until this issue - I was always under the impression the colour ramp node was unclamped and simply remapped any input values to the specified colour ramp.
