Hi all,
Does anyone know how to harness the material nodes to make a double-sided shader?

I know how to do it in principle (this also works with Maya’s Hypershade, Shaderman, and probably others):

You make a blue and a red material, for example, and use an if switch: if the normals of the geometry point one way, use the red material; if they point the other way, use the blue one. That’s the essence of it.

My knowledge of Blender’s nodes is limited, but I know there is no boolean/logic node (although there is a patch for one in the tracker), so I don’t quite know how to achieve this.

If anyone could help I’d be much obliged.

Thanks
DT

Double-sided objects aren’t really possible in Blender. That is to say, not double-sided with separately textured sides. There may be a way to work around it, but I haven’t tried, so you’ll have to experiment.

The idea is the same as in 3D Studio, but you have to do all the work yourself. Take a mesh and apply the backface-culling material found here. Then duplicate the mesh, flip the normals, and apply the backface-culling material to the opposite side.

Tricky, and maybe not even possible. That would be a nice feature to have coded into Blender.

I’m sure I did this in Renderman using the normals switch.

Basically, you’d take the dot product of the surface normal and the view vector (N.I), and this gives you the direction of the surface normal. Strictly, it gives you the magnitude of N times the magnitude of I times the cosine of the angle between them, but you’re only interested in whether the value is positive or negative. Then you use a conditional to color it one way or the other. You’d only need a node with two choices, because you can nest them. Also, we could do with absolute-value and square-root functions in the Maths node.
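The sign test described above can be sketched as a standalone Python function (a hypothetical per-sample helper for illustration, not Blender or Renderman API; it assumes Renderman’s convention that I points from the camera toward the surface, so a front-facing surface has N.I <= 0):

```python
def pick_side(normal, view, front, back):
    """Choose a color based on which way the surface faces.

    N.I = |N| * |I| * cos(angle); only the sign matters here.
    """
    d = sum(n * i for n, i in zip(normal, view))  # dot product N.I
    # N.I <= 0 means the surface faces the camera -> front color.
    return front if d <= 0 else back

# Normal pointing back toward the camera (opposing the view vector):
print(pick_side((0, 0, 1), (0, 0, -1), "red", "blue"))  # -> red
# Normal pointing away from the camera (along the view vector):
print(pick_side((0, 0, 1), (0, 0, 1), "red", "blue"))   # -> blue
```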

I was thinking, since we don’t have conditionals yet, you could do something like mix(color1 + N.I*color1, color2 - N.I*color2). But N.I isn’t +/-1 even if you normalize N and I. What this would have done is: when the dot product was positive, add one of the two colors to itself, and when negative, cancel it out. I’m not sure how you’d do the addition, though.
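A quick numeric check of why that additive trick falls short: with N and I normalized, N.I is the cosine of the angle between them, so it sweeps the whole range [-1, 1] rather than jumping between -1 and +1:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

n = (0.0, 0.0, 1.0)  # unit normal
for deg in (0, 60, 120, 180):
    rad = math.radians(deg)
    i = (math.sin(rad), 0.0, math.cos(rad))  # unit view vector at angle deg
    print(deg, round(dot(n, i), 3))
# 0 -> 1.0, 60 -> 0.5, 120 -> -0.5, 180 -> -1.0
```

At 60 degrees the dot product is only 0.5, so "color1 + N.I*color1" would give 1.5x the color rather than the intended 2x, and nothing fully cancels at grazing angles.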

With a conditional node, this should be doable.

OK, got it. Take a Geometry node and put the view and normal vectors into a dot product, then round the output value. I generally normalize the view and normal vectors through habit, but in this case it doesn’t matter, as the values get clamped anyway. Take the result of the rounding and feed it into a Mix color node. I can only assume that the rounding node always rounds either up or down. This way, if the dot product is negative it will round to 0, and if positive it will round to 1; values beyond that range are clamped to 0 or 1. This means that, based on the normal direction, you either get all of color two or none of it. Here is the node layout:

and here are renders with the colors one way and then switched:

The object I used was a cylinder so you could see both sides at once.
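The dot product -> round -> clamp -> Mix chain described above behaves roughly like this plain-Python sketch (for illustration only; the helper names are made up, not Blender’s node internals):

```python
def mix(c1, c2, fac):
    """Linear blend, like Blender's Mix node: fac = 0 gives c1, fac = 1 gives c2."""
    return tuple((1 - fac) * a + fac * b for a, b in zip(c1, c2))

def clamp01(x):
    return max(0.0, min(1.0, x))

def two_sided(dot_ni, front, back):
    # Round the dot product, then clamp to [0, 1]: negative values
    # end up at 0 (front color), positive values at 1 (back color).
    fac = clamp01(round(dot_ni))
    return mix(front, back, fac)

red, blue = (1.0, 0.0, 0.0), (0.0, 0.0, 1.0)
print(two_sided(-0.8, red, blue))  # -> (1.0, 0.0, 0.0)
print(two_sided(0.8, red, blue))   # -> (0.0, 0.0, 1.0)
```

Note that dot products between -0.5 and 0.5 all round to 0, so grazing angles collapse onto one side; that is the artifact the scaling trick later in the thread is meant to fix.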

See, now this is why I think there should be a collection of shaders included with blender. Very Nice. Even works with textured materials.

But it’s not really the two-sided effect Deep_Thought suggested. And I’m talking through my hat again here, so don’t look too closely. But since the effect is view-dependent, isn’t it inconsistent? A sphere should have only one material on the outside and one material on the inside. And if you put the material on a single plane, do you get the same color on both sides?

Or are you generating a backface-culling effect? That might work if, instead of a color, you could (magically) somehow use something transparent. That would be useful too. But again, I don’t think that’s exactly right either, since if I invert the normals on a sphere the material stays the same.

My review: Extra credit for the math. Extra credit for the insider knowledge of nodes. Extra credit for actually doing the work. Not so good for thread topic. I give three cool smileys out of five.

Marty_

It seems to be the case here. I guess the normal vectors are calculated at render time, so they don’t relate to the normals in the 3D view. The difference in colors on the forward-facing sides was just the rounding not working right; scaling the view and normal vectors eliminates this, and you get forward-facing one color and backward-facing the other. This is why I prefer programmable shading: I can actually output the values per pixel.

It’s weird though. I googled for a double-sided renderman shader and found one that pretty much does what I describe:

```
RSLFunction {
void pxslJackFlexibleDoubleSide (color f, b;
                                 float sw;
                                 output color res;)
{
    extern normal N;
    extern vector I;

    if (N.I <= 0 && sw == 0)
        res = f;
    else if (N.I <= 0 && sw == 1)
        res = b;
    else if (N.I > 0 && sw == 0)
        res = b;
    else
        res = f;
}
}
```

This just has a switch (sw) that lets you easily reverse which side each color is on, which is equivalent to swapping the node’s input colors. Maybe Renderman’s normal calculation differs from Blender’s. I didn’t think so, because I’m sure it uses du x dv at render time. I’ll need to check this in Renderman again.

edit: Hmmm, it definitely works in Renderman with a plane and cylinder and is independent of the viewing direction just using N.I.

AHHHAAAAA!!! You have to turn off vertex normal flipping. Stoopid obscure Blender options. Also, as I said, because there’s no conditional node, you should remove the normalization and instead use a vector Mapping node to scale the normal and view vectors really high, so that the rounding clamps to 0 and 1 cleanly and avoids artifacts.

Here’s a new image of the nodes:

I set the scale in the mapping node to 1000, 1000, 1000.
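As a sketch of why the big scale factor helps (plain Python for illustration, not the actual node code): scaling both vectors by 1000 multiplies the dot product by 1000 * 1000, pushing every non-zero value far past the rounding threshold so the round-then-clamp step always saturates to exactly 0 or 1:

```python
def clamp01(x):
    return max(0.0, min(1.0, x))

# Near-grazing and moderate angles, plain vs. with both vectors
# scaled by 1000 (which multiplies the dot product by 1000 * 1000):
for d in (-0.4, -0.001, 0.001, 0.4):
    plain = clamp01(round(d))                 # all four round to 0
    scaled = clamp01(round(d * 1000 * 1000))  # saturates to 0 or 1
    print(d, plain, scaled)
# plain is 0.0 for every sample, losing the sign; scaled gives
# 0.0, 0.0, 1.0, 1.0 -- the correct side each time.
```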

Also, the vertex-flipping button is in the Editing panel; it’s under the Double Sided button and called ‘No V.Normal Flip’.

I bow to your understanding. Care to be a wiki author? The material nodes are in need of an update and a practical example set…

All right, well here I am barely out of a comatose state using words again. You’d think.

Nice try, but still not quite there? I can’t seem to get the red side to show except in the shadow region; all else is just black. Maybe I’m missing something. Turn on ‘No V.Normal Flip’, change the Normal node to a Mapping node, set size to 1000 in XYZ. The only difference I can see between my setup and the image above is that my Geometry node has an input labeled Col: just below the input labeled UV:; in the image the Col: input is missing. I’m using a Blender built yesterday morning on Windows 2000.

There is one sunlight in this scene, from above and to the left.

And, because it’s 5 in the morning and my brain is crawling out of a pleasantly unconscious state, I double-check everything and still don’t see what is going on. Probably because I’m still trying to shake off that dreamy nymphette that refused to leave me alone all night.

If I can contribute something useful, sure, but remember this is not a very complex shader at all. I have already done a short modelling tutorial on one of the wikis. I tend not to visit the wikis much because I lost my bookmarks to them; there should be links from this site.

It seems the lighting calculations use the unflipped normal, so it looks like it shades the surface in the opposite direction from the light. Unfortunately, it uses the correct light direction for the other side. So, if you don’t flip normals, Blender can’t tell the difference between the two sides, and if you do, it messes up the lighting calculation. As far as I can see, the node layout is correct for a double-sided shader, but the renderer isn’t rendering things correctly. Sometimes I wish they’d deprecate the internal renderer and just use a renderer like Pixie or Aqsis. It’s not as if Renderman will ever stop being developed, and there are at least three free cross-platform renderers. Plus, they already support a rich shading language, of which nodes are just a subset, and you wouldn’t need YafRay; one renderer would do everything.

Then, whenever the nodes weren’t up to the task, there could be a .sl input node that let you use a programmable shader. The core development team could focus on fancy stuff like cloth, fluids, etc., and all they would have to do is make sure the output complied with the specification. You’d get true DoF, true motion and deformation blur, SSS, GI, caustics. People have been waiting for these things for so long, and it’s all readily available. Renderers like Pixie, Aqsis, 3Delight, POV-Ray, etc. are nothing without a good modelling and animation program, and Blender is not worth much without a solid and flexible renderer. Some people say the answer is not as simple as it looks, but it really is.