So in my checker texture example above - Blender doesn’t have any clue about the max and min values until the entire image is rendered - even though they are explicitly stated in the texture node itself (i.e. 0.3 and 0.7)?
As I said, sometimes it’s pretty easy to find the limits, even without a calculator. But I repeat: things get nasty pretty quickly, and it’s difficult to prevent the opening of Pandora’s box.
Substance Designer has an auto levels node that automatically takes an input image and scales it to 0 - 1. Not sure how it’s done but it never takes more than half a second to compute so I assume it’s not that computationally intensive.
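For what it’s worth, a naive auto-levels pass is just a min/max scan plus a linear rescale, which is cheap compared to generating the texture itself. A rough sketch in Python (the function name and the single-channel-list input are my own simplification; Substance’s actual node may well do something more sophisticated):

```python
def auto_levels(pixels):
    """Linearly rescale a list of channel values to the 0-1 range."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        # Flat image: nothing to stretch, avoid dividing by zero.
        return [0.0 for _ in pixels]
    return [(p - lo) / (hi - lo) for p in pixels]

print(auto_levels([1.0, 2.0, 3.0]))  # -> [0.0, 0.5, 1.0]
```

Two passes over the image, so it scales linearly with pixel count - which would explain why it feels instant.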
Try feeding two Musgrave textures into the checker colors. It’s stated above that Musgrave sits between -1 and 1 - that is also not true. It varies greatly (I’ve seen ±40000?) depending on the settings you give it. It drives me crazy when I see it plugged into diffuse untreated. A color ramp will effectively clip it, but you’ll lose the details you’re clipping, of course.
As I said, you really want to adjust min and max while observing the output, so you can adjust it to safe values with respect to clipping. I had a preview mode in my personal one (although it only did normalize, not remap). Normalize forces the 0-1 range, so you’d have to do an fLerp afterwards for rescaling. Remap does it all in one go, and reduces to a plain normalize when the output range is 0-1.
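To make the normalize/remap distinction concrete, here is the underlying math in plain Python (the names `normalize`, `flerp`, and `remap` are mine, not Blender’s):

```python
def normalize(x, in_min, in_max):
    # Force the input range onto 0-1.
    return (x - in_min) / (in_max - in_min)

def flerp(t, out_min, out_max):
    # Linear interpolation: rescale a 0-1 value to a new range.
    return out_min + t * (out_max - out_min)

def remap(x, in_min, in_max, out_min, out_max):
    # Remap = normalize + fLerp in one go; with a 0-1 output
    # range it degenerates to plain normalize.
    return flerp(normalize(x, in_min, in_max), out_min, out_max)
```

For example, `remap(0.5, 0.0, 1.0, -1.0, 1.0)` gives `0.0`, the midpoint of the new range.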
It is also useful when doing math operations; maybe you lose track of what is going on and need to rescale the value. Of course, a clipping preview node group is required to find those values. Another case is doing log/trig work; you may be working in a certain log scale, reduce it to 0-1, then remap that to ±pi/2 or ±pi for trig operators. Once done, you can rescale the -1..+1 range back to 0-1 for further visual processing.
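That log/trig chain can be sketched end to end - this is a hypothetical pipeline of my own, not an actual node setup: normalize in log space, remap 0-1 to ±pi/2, run a trig operator, then fold the -1..+1 result back into 0-1:

```python
import math

def log_trig_pipeline(x, lo, hi):
    # Normalize x to 0-1 in log space (assumes 0 < lo <= x <= hi).
    t = (math.log(x) - math.log(lo)) / (math.log(hi) - math.log(lo))
    # Remap 0-1 to -pi/2..+pi/2 for the trig operator.
    angle = -math.pi / 2 + t * math.pi
    # The trig operator outputs -1..+1; rescale back to 0-1.
    return (math.sin(angle) + 1.0) / 2.0
```

The endpoints map cleanly: `lo` comes out as 0, `hi` as 1, and the logarithmic midpoint as 0.5.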
It is an immensely useful node, and I’ve used normalize tons of times. My only wish is that it had a show-clipping switch (or a dedicated preview output) where it only outputs three values: 0 for n < 0, 0.1 for 0 ≤ n ≤ 1, and 1 for n > 1. I chose 0.1 for being easy to see in display transforms, but you get the idea.
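The proposed three-value preview is trivial to express; a sketch of the behaviour described above (function name is mine):

```python
def clip_preview(n):
    """Collapse a value into a three-level clipping preview:
    0 for negative, 1 for over-range, 0.1 for anything in 0-1."""
    if n < 0.0:
        return 0.0
    if n > 1.0:
        return 1.0
    return 0.1  # in range: dim but clearly visible in display transforms
```

Plugged into an emission or diffuse color, out-of-range regions would read as pure black or pure white against a uniform dark grey.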
It’s not doing it in a shader, it’s doing it to textures.
Remember that everything that happens in a shader node is executed millions of times. Even if it only takes 1/100th of a second, doing it 1,000,000 times adds up to 10,000 seconds - almost 3 hours.
Conversely, texture nodes can grind through whatever they are calculating, output their image files, and be done with it. Even if that takes 20 minutes, you get your textures and then you don’t need to do it again for the next pixel or sample.
Yes, I’d like to know that too. Reading the topics about E-Cycles I read about some differences here and there.
Sorry. Didn’t realize I was in the wrong thread. Delete your replies as well to avoid confusion. Cheers.
The new Vector Math operations from Omar are now in master:
Multiply, Divide, Project, Reflect, Distance, Length, Scale, Snap,
Floor, Ceil, Modulo, Fraction, Absolute, Minimum, and Maximum
There is a compatibility-related change through the removal of the value output from vector nodes, but that is handled in versioning code.
White Noise Node (from Omar)
https://developer.blender.org/rB133dfdd704b6a2a4d46337696773b331a44304ea
Volume Info node (also from Omar). This node primarily deals with accessing information from things like smoke simulations:
https://developer.blender.org/rBe83f0922011243a0085975fe41930ed34bb6c009
With all the nice node additions to cycles here, can we finally have the SVM stack usage of the current material displayed within the node editor?
Oh, does the white noise node mean that we can finally easily create random values?
The volume info is simply an easier way to get what used to be accessed via the attribute node right?
Yep. Feels less strange.
We could in the past. The problem was you had to run the input through a Noise texture (42.5, 0, 42.5 were my defaults), a Separate HSV, and use the H output. The other components can be used too, but they’re hard to control accurately.
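Conceptually, a white noise node just hashes its input into a deterministic but uncorrelated 0-1 value - which is exactly the effect the Noise-plus-Separate-HSV trick approximated. A toy Python stand-in to illustrate the idea (Cycles uses a fast integer hash internally, certainly not MD5; the function name is mine):

```python
import hashlib

def white_noise(vec):
    """Hash a tuple of coordinates to a repeatable 0-1 value."""
    digest = hashlib.md5(repr(vec).encode()).digest()
    return int.from_bytes(digest[:8], "big") / 2.0**64
```

The same input always yields the same value, while nearby inputs yield unrelated values - which is what makes it useful for per-object or per-instance randomization.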
Another one of Omar’s patches: the Object Info node now allows you to access the object color attribute.
https://developer.blender.org/rB08ab3cbcce1eb9c2de4953a83b50cabc44479d3c
Unlike vertex colors, this is far easier to use as a property that is global to the entire object, and it can be used to manually add variation to any object’s shading.
This sounds very useful! Fewer materials, more variation!
I’m wondering: why is the default Cycles tile size still 64 x 64? Many tests on this forum point out that a tile size of 32 x 32 seems to yield a generally optimal rendering speed.
That really varies from scene to scene. Sometimes 16 or even 8 is better; sometimes 128 or 256 is better, even on the CPU.
GPU rendering still suffers a bit with very small tile sizes, mainly when denoising is enabled.
The number of materials is the same. You just have one more way to introduce some variance, which is very welcome. Especially because with color you get three float values, unlike the object ID, which only gives you one integer.