Normal Map Node

I was reading over this thread and decided to try to figure out the node system, so I converted the normal map node from a compositing node to a texture (shader) node.

Don’t know how useful it really is, but it kept me busy for a couple of hours.

If anyone cares I can upload the patch, but I need to figure out a better name for the input and output than ‘image’, since I don’t think any other texture nodes use that name.

Pretty sure I’m not using it right, but here are some pretty pictures…

With

http://www.trollwerks.org/NormalMapNode1.jpg

Without

http://www.trollwerks.org/NormalMapNode2.jpg
http://www.trollwerks.org/NormalMapNode3.jpg

And the ubiquitous Lenna both with

http://www.trollwerks.org/NormalMapNode4.jpg

And the nodes

http://www.trollwerks.org/NormalMapNode5.jpg

Man, the guys from this thread will go crazy about it!! Good job! This could mean a great improvement in texturing, as it could solve some serious issues that Blender has with procedural bump mapping. Can’t wait to try it out!

Thanks a lot!

Can you post a build online?

Cool! I couldn’t get some of the filter class composite nodes to work with tex nodes,
but I read that Frr fixed that. Have you tried to use a texture you’re
painting on with it? Does the painting update?
If so then you could probably sell those builds :smiley:

It would be great to have a blur material node, if it’s possible.

This would be invaluable, especially if we could use the output of a texture node tree as the input to the normal map node; then any combination of textures we want could be combined into a single normal map.

All that’s really needed is an option to adjust the normal mapping strength.
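A strength control would presumably just be a scale factor applied to the height gradients before they get normalized into a vector. A rough sketch of the idea in plain C (the function and parameter names are hypothetical, nothing here is from the actual node):

#include <math.h>

/* hypothetical sketch: fold a user-set strength into the height-to-normal
 * conversion by scaling the height gradients before normalizing */
static void gradients_to_normal(float dx, float dy, float strength, float n[3])
{
    float len;

    dx *= strength;   /* bigger strength -> steeper-looking bumps */
    dy *= strength;

    len = sqrtf(dx * dx + dy * dy + 1.0f);
    n[0] = -dx / len;
    n[1] = -dy / len;
    n[2] = 1.0f / len;
}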

Thank you for taking the time to develop Blender and adding needed features like this.

Have you tried to use a texture you’re
painting on with it? Does the painting update?
If so then you could probably sell those builds

To do that he would need to look at the GLSL code and make that node supported by GLSL and the BGE. If he did that then BGE users would have a treat as well.

I don’t really know how nodes work, but this seems to make its own copy of the converted image each run, so it would probably be pretty slow if used in realtime.

If it could be turned into a pixel filter that works like *nix pipes then it should be speedy enough. I’ll look at some of the other shader nodes to see if I can figure out how they work.
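Something like this is what I picture for the pipe-style approach, as a very rough sketch (none of these names exist in Blender, it’s just to illustrate chaining per-pixel stages instead of copying whole image buffers):

/* rough sketch of a "pixel filter pipeline": each stage maps one RGBA pixel
 * to one RGBA pixel, and stages are chained like shell pipes */
typedef void (*PixelFilter)(const float in[4], float out[4]);

static void run_pipeline(const PixelFilter *stages, int num_stages,
                         const float in[4], float out[4])
{
    float tmp[4] = { in[0], in[1], in[2], in[3] };
    int i, c;

    for (i = 0; i < num_stages; i++) {
        float next[4];
        stages[i](tmp, next);   /* feed each stage the previous stage's output */
        for (c = 0; c < 4; c++)
            tmp[c] = next[c];
    }
    for (c = 0; c < 4; c++)
        out[c] = tmp[c];
}

The catch, which comes up further down the thread, is that a normal map (or blur) filter needs neighbouring pixels, which a strictly per-pixel pipe like this can’t provide on its own.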

Anyway, here’s the code. I had to cheat a little and include CMP_util.h as an easy way to get the struct CompBuf that it uses to store the image data. All I really did was to change around some names so it shows up as a material node; my C coding skills are about as good as my Spanish: I know just enough to get by.

Index: source/blender/blenkernel/BKE_node.h
===================================================================
--- source/blender/blenkernel/BKE_node.h    (revision 19002)
+++ source/blender/blenkernel/BKE_node.h    (working copy)
@@ -233,6 +233,7 @@
 #define SH_NODE_COMBRGB        121
 #define SH_NODE_HUE_SAT        122
 #define NODE_DYNAMIC        123
+#define SH_NODE_NORMAL_MAP    124
 
 /* custom defines options for Material node */
 #define SH_NODE_MAT_DIFF   1
Index: source/blender/blenkernel/intern/node.c
===================================================================
--- source/blender/blenkernel/intern/node.c    (revision 19002)
+++ source/blender/blenkernel/intern/node.c    (working copy)
@@ -2873,6 +2873,7 @@
     nodeRegisterType(ntypelist, &sh_node_seprgb);
     nodeRegisterType(ntypelist, &sh_node_combrgb);
     nodeRegisterType(ntypelist, &sh_node_hue_sat);
+    nodeRegisterType(ntypelist, &sh_node_normal_map);
 }
 
 static void registerTextureNodes(ListBase *ntypelist)
Index: source/blender/nodes/SHD_node.h
===================================================================
--- source/blender/nodes/SHD_node.h    (revision 19002)
+++ source/blender/nodes/SHD_node.h    (working copy)
@@ -62,6 +62,7 @@
 extern bNodeType sh_node_seprgb;
 extern bNodeType sh_node_combrgb;
 extern bNodeType sh_node_hue_sat;
+extern bNodeType sh_node_normal_map;
 
 #endif
 
Index: source/blender/nodes/intern/SHD_nodes/SHD_normalMap.c 
=================================================================== 
--- source/blender/nodes/intern/SHD_nodes/SHD_normalMap.c    (revision 0) 
+++ source/blender/nodes/intern/SHD_nodes/SHD_normalMap.c    (revision 0) 
@@ -0,0 +1,134 @@ 
+/** 
+* 
+* ***** BEGIN GPL LICENSE BLOCK ***** 
+* 
+* This program is free software; you can redistribute it and/or 
+* modify it under the terms of the GNU General Public License 
+* as published by the Free Software Foundation; either version 2 
+* of the License, or (at your option) any later version.  
+* 
+* This program is distributed in the hope that it will be useful, 
+* but WITHOUT ANY WARRANTY; without even the implied warranty of 
+* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the 
+* GNU General Public License for more details. 
+*  
+* You should have received a copy of the GNU General Public License 
+* along with this program; if not, write to the Free Software Foundation, 
+* Inc., 59 Temple Place - Suite 330, Boston, MA  02111-1307, USA. 
+*  
+* The Original Code is Copyright (C) 2006 Blender Foundation. 
+* All rights reserved. 
+*  
+* The Original Code is: all of this file. 
+*  
+* Contributor(s): none yet. 
+*  
+* ***** END GPL LICENSE BLOCK ***** 
+ 
+*/ 
+ 
+#include "../SHD_util.h" 
+#include "../CMP_util.h" 
+ 
+/* **************** Normal Tools  ******************** */ 
+   
+static bNodeSocketType sh_node_normal_in[]= { 
+    {    SOCK_RGBA, 1, "Color",        0.0f, 0.0f, 0.0f, 1.0f, 0.0f, 1.0f}, 
+    {    -1, 0, ""    } 
+}; 
+static bNodeSocketType sh_node_normal_out[]= { 
+    {    SOCK_RGBA, 0, "Color",            0.0f, 0.0f, 0.0f, 1.0f, 0.0f, 1.0f}, 
+    {    -1, 0, ""    } 
+}; 
+ 
+static void do_normal_map(CompBuf *stackbuf, CompBuf *cbuf,float *vec) 
+{ 
+    int x, y, sx, sy; 
+    float *out= stackbuf->rect; 
+    float *xright,*xleft,*height; 
+    float *ytop,*ybot; 
+    float sum; 
+    int dxp,dyp,dxn,dyn; 
+ 
+     
+    sx=cbuf->x; 
+    sy=cbuf->y; 
+     
+        for (y=0; y < sy; y++) { 
+            for (x=0; x < sx; x++,out+=stackbuf->type) { 
+ 
+                height = compbuf_get_pixel(cbuf, vec, x-cbuf->xrad, y-cbuf->yrad, cbuf->xrad, cbuf->yrad); 
+ 
+                //find the tangent vectors in x and y directions 
+                //check corners and sides 
+                if (x == 0) dxn = 0; 
+                else dxn = -1; 
+                if (y == 0) dyn = 0; 
+                else dyn = -1; 
+                if (x == sx-1) dxp = 0; 
+                else dxp = 1; 
+                if (y == sy-1) dyp = 0; 
+                else dyp = 1; 
+ 
+                    xright = compbuf_get_pixel(cbuf, vec,x+dxp-cbuf->xrad,y-cbuf->yrad, cbuf->xrad, cbuf->yrad); 
+                    xleft = compbuf_get_pixel(cbuf, vec,x+dxn-cbuf->xrad,y-cbuf->yrad, cbuf->xrad, cbuf->yrad); 
+                    ytop = compbuf_get_pixel(cbuf, vec,x-cbuf->xrad,y+dyp-cbuf->yrad, cbuf->xrad, cbuf->yrad); 
+                    ybot = compbuf_get_pixel(cbuf, vec,x-cbuf->xrad,y+dyn-cbuf->yrad, cbuf->xrad, cbuf->yrad); 
+                 
+     
+                //calculate the difference in height 
+                out[0]=(xright[0]+xright[1]+xright[2])- (xleft[0]+xleft[1]+xleft[2]); 
+                out[1]=(ytop[0]+ytop[1]+ytop[2])- (ybot[0]+ybot[1]+ybot[2]); 
+                             
+                sum = sqrt(out[0]*out[0]+out[1]*out[1]+1); 
+             
+                //pack into RGB: remap normal X and Y from (-1,1) to (0,1); Z (1/sum) is already in (0,1)
+                out[0] = ((-out[0]/sum)+1)/2; 
+                out[1] = ((-out[1]/sum)+1)/2; 
+                out[2] = 1/sum; 
+                out[3] = 1; 
+ 
+            } 
+        } 
+         
+} 
+static void node_shader_exec_normal(void *data, bNode *node, bNodeStack **in, bNodeStack **out) 
+{ 
+    if(in[0]->data)  
+    {         
+        /* make output size of input image */ 
+        CompBuf *cbuf= in[0]->data; 
+        CompBuf *stackbuf; 
+                 
+        cbuf= typecheck_compbuf(cbuf, CB_RGBA); 
+ 
+        // allocs 
+        stackbuf= alloc_compbuf(cbuf->x, cbuf->y, CB_RGBA, 1); 
+     
+        do_normal_map(stackbuf,cbuf,in[0]->vec); 
+         
+        out[0]->data= stackbuf; 
+             
+        if(cbuf!=in[0]->data) 
+            free_compbuf(cbuf); 
+    } 
+} 
+ 
+bNodeType sh_node_normal_map= { 
+    /* *next,*prev */    NULL, NULL, 
+    /* type code   */    SH_NODE_NORMAL_MAP, 
+    /* name        */    "Normal Map", 
+    /* width+range */    140, 100, 320, 
+    /* class+opts  */    NODE_CLASS_CONVERTOR, NODE_OPTIONS, 
+    /* input sock  */    sh_node_normal_in, 
+    /* output sock */    sh_node_normal_out, 
+    /* storage     */    "", 
+    /* execfunc    */    node_shader_exec_normal,     
+    /* butfunc     */    NULL, 
+    /* initfunc    */    NULL, 
+    /* freestoragefunc */    NULL, 
+    /* copystoragefunc */    NULL, 
+    /* id          */    NULL 
+}; 
+ 
+

Maybe someone who actually knows what they’re doing can look it over.

Don’t know about including this in the build I maintain, since I think it’s going about things just wrong, but maybe I will if I can figure out how to do it properly.

Pildanovak was thinking a while back of converting greyscale bump maps into normal maps on the fly, so there’s probably a way. This is a totally different scenario, and I’m not positive, but I think Campbell broke the textures up into different parts so the entire image didn’t have to update constantly, which sped things up. I’m pretty sure the nodes aren’t linked up to be updated though. I remember I had to toggle things in the modifier stack to update :smiley:

I’ve been looking over the other shaders and have an idea how they work.

Just need to find the OpenGL function for ‘convert_image_to_normal_map()’, which I’m guessing probably exists somewhere, and this should be speedy enough for realtime use.

Someone with real skills could probably figure out how texture painting does it and add a button to convert a greyscale to normal map on the fly so as you paint you can see what’s happening. Easier said than done though, right?

I’m actually kind of amazed that Blender shows normal maps in realtime in shaded mode, since the video card in my laptop doesn’t seem to have that functionality… I’m lucky if texture painting works; AFAICT it depends on the weather or planetary alignment or something.

That’s pretty cool, so bumps with no graphics card?
I see now that that’s a shader node, I thought it was a tex node. :smiley:
That’s great for a non-GLSL option in the game’s start menu; this way if a person
with a bad computer gets your game it’ll still look good.

This is great news!

About realtime feedback with nodes: in the mentioned thread I posted an image (which is now gone) that showed a test I did, painting into a grayscale image, and how the nodes updated the GLSL 3D view in realtime.

Here is the image:
http://img209.imageshack.us/img209/2202/normalpaintscreenshot01ry1.th.png

In this image, the normal output of the image is being mapped to the color of the object, painted in realtime. Since it’s not tangent-space normal output it looks green and magenta. I believe that this new “Normal map node” will do what’s necessary to make it right, and maybe in realtime too.

Back then, I used a Graphicall build that had some additional features for GLSL, but maybe this functionality is included in 2.48a.

In this page of the thread (post #129) there are some comments about what I did in this test, and a link to a build in Graphicall that I think has all the GLSL functionality of the build I used:
http://blenderartists.org/forum/showthread.php?t=137830&page=7

Hope it helps… and thanks for this node, it’s a great step toward adding detail.

I don’t know if it’s just me not knowing how to properly use this thing or if it just isn’t working right, but it adds a whole lot of noise to the output.

Been playing around with a UV-mapped cube, and when I try a middle gray texture on it all I get is a bunch of static, when one would think it would just produce a flat cube.

Not sure what’s going on, maybe someone else could figure out how it’s supposed to work.

Hey eclectiel, at the time I think I didn’t realize that was a loophole for the updating;
I must have thought you were proposing a new vertex paint bump map, and not painting live :smiley:
That’s kind of an old version from last September, but it has a revision list,
so you could browse the older SVN revisions to see if you could get the code for that node,
and see if it still works in the latest SVN :smiley:
Plus eclectiel says he was using an older version, so somebody probably
trampled that node for the build that’s up now :smiley:
Good luck! No-graphics-card bump painting would be pretty sweet and worthwhile.

I’m confused; it’s a mystery to me how the code snippet that was posted could work at all. Are the images in the first post actual results from this code?

The material/texture nodes don’t work with CompBufs; they don’t get a full buffer with all pixels, but rather single colors for a given point on the surface. Here are two ways you could go about implementing a normal map node (or blur node) for materials:

  • Build it into the texture node. That means for image textures you do get access to the full buffer, and can sample surrounding pixels. For procedurals a similar trick could be done. This means it is limited to a single texture and not as flexible, but relatively simple (a rough sketch of this approach follows the list).

  • Do it as a general node. This is considerably harder as the node evaluation would need to be adapted. Basically it means you would have to run all the preceding nodes not only for the current shading point, but also some points nearby. The location of those nearby points would be distributed around the normal. Evaluating nodes multiple times to obtain those points in a normal map or blur node requires modifications to the control flow of shading node evaluation.
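In case it’s a useful reference for the first option, here is the per-sample math stripped down to plain C, with made-up names and a flat greyscale float array standing in for the image buffer; it’s the same finite-difference idea as the posted patch, just run for one sample at a time rather than over a whole CompBuf:

#include <math.h>

/* minimal sketch (not Blender API): given a greyscale height image as a flat
 * float array, look up the four neighbours of (x, y) and return a
 * tangent-space normal packed into RGB in [0,1] */
static float height_at(const float *img, int w, int h, int x, int y)
{
    /* clamp to the edges so border pixels reuse their own height */
    if (x < 0) x = 0; else if (x > w - 1) x = w - 1;
    if (y < 0) y = 0; else if (y > h - 1) y = h - 1;
    return img[y * w + x];
}

static void height_to_normal_pixel(const float *img, int w, int h,
                                   int x, int y, float rgb[3])
{
    /* central differences of the height field */
    float dx = height_at(img, w, h, x + 1, y) - height_at(img, w, h, x - 1, y);
    float dy = height_at(img, w, h, x, y + 1) - height_at(img, w, h, x, y - 1);
    float len = sqrtf(dx * dx + dy * dy + 1.0f);

    /* the normal is (-dx, -dy, 1) normalized; remap X/Y from (-1,1) to (0,1) */
    rgb[0] = (-dx / len + 1.0f) * 0.5f;
    rgb[1] = (-dy / len + 1.0f) * 0.5f;
    rgb[2] = 1.0f / len;
}

The edge clamping here has the same effect as the dxn/dyn/dxp/dyp checks in the posted patch.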

Yeah, it’s an old version, from before 2.48 was released. It had more functionality for GLSL, like soft shadows. But maybe 2.48 has everything that’s needed for this new Normal Map node Uncle Entity is working on to work in realtime. Soft shadows were not included in 2.48, but maybe the realtime part was not exclusive to that build.

Just to clarify… the nodes used in the example are actual Blender nodes. The one with the image is just a Texture node from the material nodes.

Yeah, the more I play with it the more I see how it isn’t working.

I’ll see what I can do to port it over to a texture node; that was the original plan, but it looked a lot harder, so I went for quick and easy.

It does kind of work in a crazy pseudo-artistic way though.

I think all the GLSL code (the texture painting branch, maybe) was merged into trunk after 2.48 was released, so the only way to get it is to run an SVN build. If I ever get this thing working I’ll add it to the SVN build I’ve been making that has the texture painting and all that. I don’t do Windows builds though.

I didn’t think to check the input for the texture node :frowning:
It’s working in the 17349 build I’ve been using, that’s pretty cool!
It really has a bump feel to it even without any lighting changes.
It’s not overwriting the second texture file though, I probably missed something.
http://www.box.net/shared/yb160ajnfv
I can tell you that if it has to go through texture nodes, it was a personal nightmare for me
trying to get the normal map converter hooked in there,
because some of the other CMP filter files worked, so it teased me a little. :smiley:
I guess for a while Frr was saying filter class tex nodes were impossible, but he has since gotten filter class working.
I’ve sworn off building :smiley:

I’ve hit a wall trying to get this working as a texture node, since texture nodes don’t get passed a full buffer the way compositing nodes do.

Texture nodes just segfault when trying the compbuf trick that (sort of) worked before.

A bit too complicated for me it seems.

The only other option I can see is to add a conversion function into imbuf (or somewhere) and call it from a modified texture image node, but that seems to defeat the purpose when you could just convert an image in the GIMP and be done with it.

Not that there really was a purpose to begin with other than a mild curiosity about how nodes work.

It’s pretty brutal. I wasn’t sure if TEX nodes would need a completely new buffer like a TEX buf;
I spent a couple of days trying to get CompBuf to work with tex nodes.
So does the Eclectiel setup work with the shader? You said it had some artifacts,
but maybe you need to adjust the bump scale? That might be easier,
but perhaps it’s converting to 8 bit with the shader.

What I think it was doing was using an empty CompBuf and averaging the surrounding NULL values with the one valid value supplied, resulting in a mostly black image. Even so, it isn’t very clear exactly what was going on.

The more I dig in the more I understand brecht’s amazement that it actually worked.

Now that I’ve figured out how the existing normal map node works, I think it would be relatively easy to add that code into a TEX_image node and have it spit out a normalized(?) pixel every time Blender tells it to do its thing.

Still trying to figure out how ImBuf stores its pixel data: 32-bit floats or an array of RGBA values…
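If I’m reading IMB_imbuf_types.h right (worth double-checking), it’s both, depending on the image: an optional byte buffer rect with one packed 32-bit RGBA value per pixel, and an optional float buffer rect_float with (usually) four floats per pixel. A pixel fetch would look roughly like this; imbuf_get_pixel is a made-up helper, not an existing Blender function, and the four-floats-per-pixel assumption should really be checked against ibuf->channels:

#include "IMB_imbuf_types.h"

/* hypothetical helper: fetch pixel (x, y) from an ImBuf as float RGBA,
 * preferring the float buffer and falling back to the byte buffer */
static void imbuf_get_pixel(struct ImBuf *ibuf, int x, int y, float col[4])
{
    int index = y * ibuf->x + x;

    if (ibuf->rect_float) {
        float *fp = ibuf->rect_float + 4 * index;   /* assumes 4 channels */
        col[0] = fp[0]; col[1] = fp[1]; col[2] = fp[2]; col[3] = fp[3];
    }
    else if (ibuf->rect) {
        unsigned char *cp = (unsigned char *)(ibuf->rect + index);  /* RGBA bytes */
        col[0] = cp[0] / 255.0f;
        col[1] = cp[1] / 255.0f;
        col[2] = cp[2] / 255.0f;
        col[3] = cp[3] / 255.0f;
    }
}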

I’m also wondering if it’s possible to convert an image to a normal map without sampling the surrounding pixels, because that would be super, super easy. All I would need is the correct algorithm, since I have a ‘working’ texture node that just doesn’t do the conversion part right.