I’m embarrassed to say that I’m just getting around to learning nodes in Blender. They are great. I can’t believe I ever made anything without them.
The question:
The material nodes seem to be able to assign a color and alpha value only. That value goes on to be shaded by the lights for the final image. I came from a Renderman background, where the programmable shader determined the color and the brightness based on the lights and whatever else you saw fit. I don’t see a way to do that with Blender nodes. Am I just overlooking something, or does this functionality not exist? If it doesn’t exist, are there plans for such a thing?
I imagine there would be nodes for lights and shaders. The light nodes could take inputs such as direction and distance and return an intensity/color. The shader nodes would have inputs very similar to the current material nodes with the addition of the light location/direction and color/intensity. The nodes would then return the final shaded pixel values. Actually, the total functionality I’m thinking of overlaps extensively with what the material nodes already do. The difference is the addition of a light loop to calculate shading rather than base color only.
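To make the idea concrete, here is a rough sketch of the light loop such a shader node might run. This is not Blender code, just a hypothetical Python illustration: `shade_lambert`, the light dictionaries, and simple Lambert falloff are all my own stand-ins for whatever the real node system would provide.

```python
import math

def normalize(v):
    """Return v scaled to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade_lambert(point, normal, base_color, lights):
    """Hypothetical shader-node body: loop over the lights and
    accumulate a diffuse (Lambert) contribution for one surface point."""
    result = [0.0, 0.0, 0.0]
    for light in lights:
        # Incident direction from the surface point toward the light
        to_light = normalize(tuple(lp - p for lp, p in zip(light["position"], point)))
        # Cosine falloff; clamp so back-facing lights contribute nothing
        intensity = max(0.0, dot(normal, to_light)) * light["energy"]
        for i in range(3):
            result[i] += base_color[i] * light["color"][i] * intensity
    return tuple(result)

# A single white light directly above an upward-facing surface point
lights = [{"position": (0.0, 0.0, 1.0), "color": (1.0, 1.0, 1.0), "energy": 1.0}]
color = shade_lambert((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (1.0, 0.5, 0.0), lights)
```

The point is exactly the difference described above: the node returns final shaded pixel values rather than a base color, because the light loop lives inside it.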
The addition would be an enormous benefit. Any thoughts?
This is a problem I’ve been trying to figure out myself.
At the moment (in my off hours from Peach) I’ve been working on a new shader node that simply provides direct access to Blender’s shaders (Blinn, Phong, Lambert, Oren-Nayar, etc.). Its inputs are all the parameters for the selected shader, and the output is the resulting color after looping through all the lights. I’m planning on also adding a light group field to the node, so that you can limit the shader to using lights in a particular group.
Perhaps I could also make a light node where you enter the name of a single light and it provides the incident vector and light color/intensity?
Would these sorts of tools let you do what you want?
Ah, excellent! I was going to have a go at this myself, but it’s low on my list of priorities, so please go for it! I was also thinking about API-ifying some of the raytrace code to make raytrace nodes for things like ray reflection and refraction, and maybe providing some RSL-style things like an ‘occlusion’ node that takes a vector/solid-angle input and returns 1 or 0 depending on whether it’s occluded. Raytracing, especially refraction, within material nodes is currently pretty dodgy, like much else of material nodes right now…
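An occlusion query of that kind could be sketched like this. Again, this is only an illustrative Python mock-up: spheres stand in for arbitrary scene geometry, since a real node would call into Blender’s raytrace code rather than intersect shapes itself.

```python
import math

def occluded(origin, direction, spheres, max_dist=1e30):
    """Hypothetical 'occlusion' node: return 1 if a ray from `origin`
    along unit-length `direction` hits any sphere before max_dist, else 0.
    Each sphere is ((cx, cy, cz), radius); they stand in for whatever
    geometry a real raytrace API would test against."""
    dx, dy, dz = direction
    for (cx, cy, cz), radius in spheres:
        # Ray-sphere intersection with the ray origin shifted to the sphere
        ox = origin[0] - cx
        oy = origin[1] - cy
        oz = origin[2] - cz
        b = 2.0 * (dx * ox + dy * oy + dz * oz)
        c = ox * ox + oy * oy + oz * oz - radius * radius
        disc = b * b - 4.0 * c  # direction is unit length, so a == 1
        if disc < 0.0:
            continue  # ray misses this sphere
        t = (-b - math.sqrt(disc)) / 2.0
        # Small epsilon avoids self-intersection at the surface
        if 1e-6 < t < max_dist:
            return 1
    return 0
```

A 1/0 result like this is enough for hard shadow tests; extending it to a solid-angle input would mean firing several jittered rays and averaging, which is presumably where the cost comes in.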
Another thing that would be great in the Material nodes is a Time input node… just like in the Compositor. It would be a lot faster to animate a Material with a Time node…
Is that difficult to code? Impossible? (Please don’t say it’s impossible!)
If the shader nodes being discussed were added and you implemented the raytrace features you mentioned, Blender would take another huge step forward in terms of professional appeal. I think technical directors like to know that the features they need are supported, and that when they hit a wall they can whip up new “features” with programmable shaders if they need to.
It’s good to know I’m not the first one to think of this. That means it’s got merit. The description on the wiki sounds pretty good and seems well thought out. It’s also very promising that several people currently want this and are working toward it.
What you’re talking about is a true material node system, where you have nodes like surface base shaders to which you add properties like Blinn… and plug nodes like a raytrace node, for example, into the surface color input for reflection…