I'm trying to create a material where the transparency and the colour are determined by the normal. I do this by taking the normal, modifying it with a vector curve, and then sending the result either to a mix node or to the alpha input of the output node. The problem is that the only part of the normal that actually seems to do anything is the X component.
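To make it clearer what I'm after, here's a rough Python sketch of the scalar I'm effectively trying to derive from the normal. The curve shape and the way the vector is collapsed to one value are just placeholders, not what any particular node actually does:

```python
import math

def curve(x):
    # Placeholder for the vector-curve node: a simple smoothstep
    # mapping the component range [-1, 1] onto [0, 1].
    t = (x + 1.0) / 2.0
    return t * t * (3.0 - 2.0 * t)

def normal_to_alpha(normal):
    # Apply the curve to each component, then collapse the vector to a
    # single scalar (its length, clamped to 1), since the alpha input
    # expects one value rather than a vector.
    nx, ny, nz = (curve(c) for c in normal)
    return min(1.0, math.sqrt(nx * nx + ny * ny + nz * nz))

# A face pointing straight at the camera (camera-space normal (0, 0, 1)).
print(normal_to_alpha((0.0, 0.0, 1.0)))
```

The point is just that somewhere along the chain the three-component vector has to become one grey value, and I'd like all three components to contribute, not only X.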

Right now, the only way I can get it to work is to run the vector into the B&W converter... Is there a better way? Also, is there any way to map depth (distance from camera)?
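For the depth question, what I have in mind is something like this sketch: a linear remap of camera distance to a 0-1 grey value. The `near`/`far` values here are placeholder clip distances I made up for illustration, not names of anything in the node editor:

```python
def depth_to_gray(distance, near=0.1, far=100.0):
    # Linearly remap camera distance onto [0, 1], clamped at the
    # (hypothetical) near/far clip range.
    t = (distance - near) / (far - near)
    return max(0.0, min(1.0, t))

print(depth_to_gray(50.05))  # halfway between near and far -> 0.5
```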

Another thing I noticed: the material node can output an alpha value, but I don't see where you can actually feed one into the material.

Finally, is there any documentation on using nodes for post-render compositing? I understand it can be done; I just haven't found out how yet.