New Pseudo Normal Mapping in BF Images

This is a bump map from before the new code.
http://www.aprilcolo.com/oh/plink/tri/Tuho%20bump%20world.jpg

And this is the new pseudo normal mapping.
http://www.aprilcolo.com/oh/plink/tri/BF%20world%20bump.jpg

Now I call it a pseudo normal map because it just uses greyscale maps, while the normal maps on the internet are the RGB variety.

And these are tests with the new stuff. Pseudo normal map:
http://www.aprilcolo.com/oh/plink/tri/Tuho.jpg
Displacement map
http://www.aprilcolo.com/oh/plink/tri/Tuhobug1.jpg
And a bug in the displacement map:
http://www.aprilcolo.com/oh/plink/tri/Tuhobug2.jpg

Could a Windows user try out a model OBJ file in the ATI normal map extractor?
http://www.ati.com/developer/indexsc.html

Or Nvidia’s:
http://developer.nvidia.com/object/ps_normalmapfilter.html
And post it here so others can test the RGB idea of a normal map in this new Blender.

Or even Orb
http://www.soclab.bth.se/practices/orb.html

Can’t Blender already do normal mapping, just disabled by default in the source? (It is only a two-line change; to get the old method you use noRGB in the material buttons.)

Yep! You are right, I guess. But nobody will give me an answer.
https://blenderartists.org/forum/viewtopic.php?t=19685

Can you tell me if it is true? But anyway, I would still love to test an RGB map so it feels authentic.

I’m not sure what I’m looking at… are the first two pictures just a cloud texture on the sphere? Or has some other method been added to Blender now? If it is the former, it is just bumpmapping; the old method faked this, and the current code does it more or less correctly.
About normal mapping, it really would be trivial to add this to Blender; the question, though, is whether you would see any improvement. It constantly seems to be confused with displacement mapping.
The ATI and NVidia tools just create normalmaps from images (which is just the same as calculating normals at rendertime for bumpmapping). Not meshes, which is far more complicated (a bit similar to uv unwrapping). In effect there is no difference with bumpmapping. It is just a method to encode normals into an image, so that you basically don’t have to calculate the normals yourself, which is important for realtime applications like games.
The ORB program does create normalmaps from meshes though.

Yes it is just a cloud texture.

If it is trivial then I do not see why Blender should not have it. Just telling the 3D community that Blender now accepts and creates RGB normal maps would get even more attention for Blender and its cross-compatibility. I for one am on OSX, and there are nil, zilch, nada, no normal mapping programs for OSX that are free or even low-priced. Really the only option for OSX is a plugin for the $1,600 packages Lightwave and Maya. That is all. And for Linux I have yet to see any either. And that ORB program is Windows-only.

As for the use: it would be very useful. As I have said before, I see a workflow in which the displacement mapping at render time would create a UV normal map of the mesh’s displaced data and bake it to the mesh’s texture.
The purpose of all of this would be for speed and detail.

The first thought with a normal map is usually just to reduce a mesh from around 3,000 faces to 500 faces while retaining the expensive detail.

Now the second use would be on the 3,000-face mesh itself: that mesh would gain so much fine detail it would look like it is around 20,000 faces or more.

So thus far, yes! An RGB normal map would be a good feature to have. Now, as for the difference between the RGB and the greyscale versions, I am not completely sure. I would just guess that the colour of the RGB gives a better measure of nature’s waveforms, like energy has colour in the amount of power it gives off. NOTE: I am most likely wrong about that waveform part. But what I do know is that RGB looks much neater and newer compared to the antique greyscale images, as math in its true graphical form is very neat to look at. And style is in a class of its own.

Anyway, please add the currently popular RGB normal maps. It would make Blender even more high class.

^v^

That sounds very powerful. I’m already playing with the displacement mapping, and it’s great, but normal mapping would help a lot.

I think normal mapping would be another very big step for adding high levels of detail; I agree with you. :smiley:

Just for a note: I am going to do all of this tomorrow.

In source/blender/blenkernel/intern/image.c,
change:
Code:

if (tex->nor) return 3;
else return 1;

to

Code:
return 1;

at line 711 and line 1485.

It is not much and I have done it before. But like I said before, I need an OBJ with a normal map to truly test it. So if a Windows user could test a file in ORB, that would help a lot.

Just to clear this up:
Bump Map: A greyscale image that affects the lighting of a polygon.
Normal Map: An RGB image that affects the lighting of a polygon.
Displacement Map: A greyscale image that affects the geometry of a polygon.

All three techniques are used to add more detail to objects.

Bump/Normal mapping only affects the lighting of a polygon - you can see through the illusion from certain angles.

Displacement mapping is much more CPU/memory intensive, since there are more polygons to render.
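The “only affects the lighting” point can be made concrete with a minimal Lambert diffuse term. This is an illustrative sketch, not code from the Blender renderer: bump and normal mapping merely swap the normal fed into this dot product, while the silhouette geometry stays untouched.

```c
/* Minimal Lambert diffuse term: the shade at a point is the dot product
   of the (possibly perturbed) surface normal and the light direction,
   clamped so back-facing surfaces get zero. */
float lambert(const float n[3], const float l[3])
{
    float d = n[0] * l[0] + n[1] * l[1] + n[2] * l[2];
    return d > 0.0f ? d : 0.0f;
}
```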

Internally, the Blender renderer uses normal maps. These can be generated directly from stucci textures/plugins or derived from the bump map.

Personally, I think a bump map is easier for an end user to create/paint etc. conceptually.

It is easy enough to create a normal map from high resolution geometry. See: http://reblended.com/www/alien-xmp/Tutorials/NormalMap/NormalMap.html
You use a blend texture to capture the normal. The same technique can be used to create a bump map.

Normal mapping really only became interesting because newer GFX cards can do it in real time.

Oh hey! The author of that code hack has appeared! Can you tell me if anything has changed in the code that I should know before I test it?

I think others are getting the impression that a normal map is just pretty and blue. That is wrong. The three colors red, green and blue are used for the three axes x, y and z. It is math, but like I said, math in its true graphical form is pretty. :smiley:

And I think strong colors will contain more detail.
http://amber.rc.arizona.edu/lw/normalmaps.html

This one has all three colors brightly displayed.
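The colour convention (one channel per axis) can be sketched both ways. A hedged example, with helper names of my own choosing: each component of the unit normal, in the range -1..1, is squeezed into one byte channel, which is why a straight-up normal (0, 0, 1) gives the familiar light blue.

```c
/* Encode a unit normal into an RGB byte triple: x->R, y->G, z->B.
   (0, 0, 1) becomes (128, 128, 255), the typical normal-map blue. */
void normal_to_rgb(const float n[3], unsigned char rgb[3])
{
    int i;
    for (i = 0; i < 3; i++)
        rgb[i] = (unsigned char)((n[i] * 0.5f + 0.5f) * 255.0f + 0.5f);
}

/* Decode back to a normal in the range -1..1. */
void rgb_to_normal(const unsigned char rgb[3], float n[3])
{
    int i;
    for (i = 0; i < 3; i++)
        n[i] = rgb[i] / 255.0f * 2.0f - 1.0f;
}
```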

is there…
any chance in the world…
that you can code the normal mapping to work with the game engine???

this will bump Blender’s game engine a few major steps forward!!!

thanks,
Koby.

p.s
i can beg if it comes to that (:

Yeah, the game engine would be a great target! As that is what normal mapping was first created for.

I DID THE HACK! It built lickety-split. But now I need a Windows user to create a normal map.

That code will just mean it only returns colors; it doesn’t do anything to get the normals from the image. Anyway, I just posted a small patch on the committers list for anyone who really wants to try this in Blender:

http://www.blender.org/pipermail/bf-committers/2004-January/005123.html

It only allows you to use normalmaps as bumpmaps for rendering; it doesn’t create them, for which you need an external program.
By the way, I didn’t see it on the page from the link above, but the ATI ‘NormalMapper’ program actually does create normalmaps from meshes.

I don’t really think it is all that useful in non-realtime rendering, as I said before (although RenderMan uses normalmaps, but not necessarily for bumpmapping), so I don’t really think this will be included in Blender; it is just for those who want to experiment with this themselves.

But now I need a Windows user to create a normal map.

Try here http://www.drone.org/tutorials/displacement_maps_air_objects/torsoObj.zip for the mesh, and here http://www.drone.org/tutorials/rayDisplace_renderman.html for the normal and displacement maps. Good luck :wink:

These are from a RenderMan tutorial on the subject, not from me.

Also, if you get Blender to handle normal maps the way Lightwave and other non-micropolygon renderers do, there are a lot of good uses for it, like LOD and just more control over bumpmapping in general.

Sweet! eeshlo, I will test it tomorrow. And I am glad I finally have an answer for that little bit of code.
Now who do I pester to get the extraction part created? Just being able to use the maps is great, but not being able to make the normal maps themselves pretty much defeats the purpose. But hey, maybe you are working on that in secret :stuck_out_tongue:

And Slow67, thank you for the link, I will test it tomorrow. I really need to get my box cleaned up so I can do test builds quicker. :smiley:

So, is anybody else up for the next half of the puzzle? ov0

There is only one use of normal maps which is very interesting: create a UV-bound color map and normal map from a high-poly mesh (say 15k vertices), then simplify the mesh to 1k vertices. The color and normal maps put back on the simplified mesh give it a look nearer to the original. Great for games and web content, as the mesh will render faster.
Great also for progressive meshes, where the number of vertices changes with distance from the camera in complex scenes.

For other uses, the normals are taken from the mesh and the normal map is not needed.

But, and that’s a big but, creating the simplified mesh is really non-trivial work, and it must be done from the maths. That means adding normal maps to Blender without first implementing progressive meshes is a no-go. You also need a good chart-and-unfold algorithm to make an atlas and do the normal UV mapping automatically. The current trend seems to be LSCM, and that is also non-trivial to do.

The latter is really missing from Blender, and so, although normal maps are hype, there are more vital functions missing from Blender to add first. We need better UVs, we need better NURBS (it’s coming), we need better procedurals, we need ray-tracing (oops, it’s done), we need 5-sided polys, we need more mesh tools, we need better booleans (badly), we may need true 3D geometry and CSG…

Now, in the game engine, it’s another story.

No way, as always I only do the easy stuff that I can get away with sounding like I know what I’m doing while I basically don’t have a clue myself… :wink:
As I said before, and as lukep explains much better than me above, creating normal maps from meshes is a lot harder. Definitely beyond my almost nonexistent skills.
This was easy enough to do, I thought I’d just create a patch for those who want to experiment with some of the basics, although I’m not sure you can even call it that.
I don’t think you will be very much impressed by it as it is still just bumpmapping as I said.

Drat …

But one other thing: I do not understand why others do not see it as a good tool for animation work. It would reduce the poly count for super details that are not possible at this current time on medium computers. But the normal map could also be placed on a high-poly mesh, so the map would just add even more detail.

As for the low-poly version of the mesh, it could simply use the decimator feature already in Blender. And an atlas or cube map could be used for the unwrapping.

It’s not that simple. You need to have the new faces take exactly the same UV layout as the high poly faces, or it won’t work.

Martin

Aww! Oh well. Maybe somebody in the future will try it. But for now I am out of ideas for this.

Creating a normal/bump map from a detail mesh to be applied to a low-poly mesh is easy; see the URL I posted before. (Creating a low-poly model and normal map from a high-detail model as lukep suggested is much harder.)

I’m still not really sure what makes normal maps better than bump maps.