Normal mapping texture baker?

Hey, I love the awesome texture baker script now bundled with Blender. On that note, I downloaded Blender 2.36 RC2 for OS X 10.3 and was playing with the incredible new normal mapping functionality (http://www.blender3d.org/cms/Normal_Maps.491.0.html).

Anyway, the only method documented so far for making normal maps is rendering them off with an ortho camera. This works fine if you're mapping to a flat or semi-flat object. Now say I want to do a more complex character model. Ideally I could create my normal map by creating my UVs and using the texture baker script. However, when I try to bake a normal map material I just get a blank render. Maybe it's because the procedural is set to "Map Input" via "Normal"? If anyone has any ideas, let me know.

Also, if the normal map could be baked after applying all possible displacement maps, that would be sweet!

bydesign

looks like I’ll have to play with that

but to make normal maps from a high poly mesh onto a low poly one, look at NVIDIA's Melody tool, or the program ORB
http://developer.nvidia.com/object/melody_home.html
http://www.soclab.bth.se/practices/orb.html

… or you could use ZBrush

currently there isn’t a way [in blender] to bake normal maps, well, except from procedural textures:
http://mywebpage.netscape.com/YinYangEvilSane/BlenderStuff/suzanne_baked_normalmap.png
http://mywebpage.netscape.com/YinYangEvilSane/BlenderStuff/suzanne_baked_normalmap2.png

this was done with my texture baking script, and a material very much like the one in the tutorial there

I believe I have the material [or a very similar one] and the script in:
http://www.geocities.com/z3r0_d/files/uvunfold_example2.zip [copy/paste the url]

oh, and as some people have noticed, at texture seams the normals aren’t correct, resulting in a visible seam when viewing the normal map. Solving this isn’t trivial [and as baking procedural textures to normal maps isn’t the intention of the script, I am spending my time expanding it elsewhere]

EDIT!!!
it appears the behavior for normal maps has changed since the last release [my previous materials are inverted now]

it also appears this change is correct, an image is no longer upside down if mapped to the normals [default settings otherwise]

I’d like confirmation on this… oh well
[guess I’ll have to update that .blend for 2.36 soon]

Try this link:
https://blenderartists.org/forum/viewtopic.php?t=26189&highlight=flippyneck
a solution, perhaps…

You can try using the script that jms links to. I did eventually abandon that script though. You’ll find a simpler tool that encodes the vertex/face normals in the vertex colours in this thread:

https://blenderartists.org/forum/viewtopic.php?t=26454&highlight=flippyneck
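The encoding idea behind a tool like that is simple; here's a minimal sketch of it (my own illustration, not the actual script from that thread), packing a unit normal into a colour and back:

```python
def normal_to_rgb(n):
    # Remap each component from [-1, 1] to [0, 1], the usual
    # normal map encoding (a straight-up normal becomes light blue).
    x, y, z = n
    return ((x + 1.0) * 0.5, (y + 1.0) * 0.5, (z + 1.0) * 0.5)

def rgb_to_normal(c):
    # Decoding just reverses the remap.
    r, g, b = c
    return (r * 2.0 - 1.0, g * 2.0 - 1.0, b * 2.0 - 1.0)
```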

:-? I don't understand this; Blender has had normal maps for ages.

normal maps are different from bump maps…

bump maps use derivatives (gradients) of the height values to obtain the normal, so for one normal you need at least four pixels, I think… that's the Nabla setting you see in Blender's procedurals. The larger it is, the blurrier the bump will be.

In normal mapping, each pixel encodes a normal direction using the RGB channels. No derivatives needed => incredibly crisp maps!
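To make the difference concrete, here is a rough sketch of the bump map side (my own, assuming a heightmap stored as a 2D list of floats; the function name is mine, not Blender's):

```python
import math

def height_to_normal(h, x, y, scale=1.0):
    # Central differences: the normal at (x, y) needs the four
    # neighbouring pixels, which is exactly the point above.
    dx = (h[y][x + 1] - h[y][x - 1]) * 0.5 * scale
    dy = (h[y + 1][x] - h[y - 1][x]) * 0.5 * scale
    # The normal tilts against the height gradient
    nx, ny, nz = -dx, -dy, 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)
```

A normal map skips all of that: the finished (nx, ny, nz) is read straight out of the pixel's RGB.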

But, google is a better friend than I am :wink:

Ciao
Dani

Could this be used to bake a texture for a low poly model from a high poly?

I mean could you:

Model a high poly mesh
Apply materials etc. to the high poly to make it all pretty
Model a low poly mesh around it
UV map the low poly
Bake the texture from the high poly mesh onto the low poly uv map

Think that could be done? It's basically normal mapping, but putting the texture colour on the UV map instead of the angle of the normals.

it can be done

it can't be done well in Blender; if you want to use your UV mapping I'd suggest you use ORB [I linked to it before, the second link]

otherwise [and for generally better results, but no diffuse and other map baking, iirc] use Melody [the first link]

Are you sure ORB can bake textures? I've used it to bake normal maps for low poly models from high poly ones, but didn't see anything in the documentation about baking the texture from the high poly to the low.

from what I gathered it could bake a texture on a high resolution model to a low resolution one, but I haven’t tried

so, I guess the workflow [as evil as it may sound] would be to bake procedural textures on the high poly model, then use ORB to bake normal maps, and take that texture and apply it to the low resolution model.

I’ll have to try it…

To bake the texture to the high poly does that mean you would need to uv unwrap it too?

yep

shhh, I'm working on a script to do that:
http://www.geocities.com/z3r0_d/files/autouv_z3r0_mesh4.py.txt

OOPS, I bumped a very old thread again :expressionless:

BTW macouno's BRayBaker will bake high poly to low poly

and it will apparently, from what I can tell, bake a tangent space normal map as well, using the material file provided in blender.org's 2.36 release notes…

…and it does a more seamless job than Melody or ORB! I speak from experience of using all three.

currently BRayBaker's high poly to low poly baking only works with 2.37a; in 2.4 something was changed and it does not map out correctly (although it can still be used to bake AO, procedural textures, and basically anything else… really an amazing script)

NeOmega, I can't understand how tangent space can work. Can you post some examples? To truly work, a tangent space normal map must be able to support armature deformation, for example. The material from 2.36 is dependent on the camera view; maybe I'm missing something, but to me it can't work correctly.

It can't work as far as being rendered or used in Blender; however, the maps can be created and then used in other engines.

…at least I hope I am right, because I have been saying this a lot recently… but the maps produced by BRayBaker using the material I mentioned (the one in the mini tutorial given in the 2.36 release for object space normal maps) look very much like tangent space normal maps. It's the way the script works: he has the camera aligned perpendicular to each individual face's normal, and then takes a render, and since Blender's shaders are camera dependent, this ends up creating a tangent space normal map.
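To illustrate what I mean (just a sketch of the idea as I understand it, not BRayBaker's actual code), pointing a camera straight down each face normal amounts to building a per-face basis like this:

```python
import math

def face_camera_basis(normal):
    # normal is assumed to be unit length
    nx, ny, nz = normal
    # Pick a reference vector that isn't parallel to the normal
    up = (0.0, 0.0, 1.0) if abs(nz) < 0.99 else (1.0, 0.0, 0.0)
    # Camera right = up x normal, normalised
    rx = up[1] * nz - up[2] * ny
    ry = up[2] * nx - up[0] * nz
    rz = up[0] * ny - up[1] * nx
    l = math.sqrt(rx * rx + ry * ry + rz * rz)
    rx, ry, rz = rx / l, ry / l, rz / l
    # Camera up = normal x right (already unit length)
    ux = ny * rz - nz * ry
    uy = nz * rx - nx * rz
    uz = nx * ry - ny * rx
    # The camera looks along -normal, so every face is rendered
    # face-on; camera-dependent shading then lands in tangent space.
    return (rx, ry, rz), (ux, uy, uz), (-nx, -ny, -nz)
```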

So what I am saying is the UV map image itself can be created with the script, but Blender would not render an image or animation using that map correctly… I don't think. It might, I don't know; honestly I never tested it, although I could, I suppose… maybe on this next model I am working on, which is an infantryman. Honestly I don't see why it would not work, but I am not thinking that part of Blender through well.

But here is a sample of something I rendered in Blender with BRayBaker:

http://img189.imageshack.us/img189/9149/osnm00185wl.th.jpg

Note, this looks VERY MUCH like a tangent space normal map. I'll run some tests tonight, doing an animation with a light and camera circling the actual mapped object, to see if Blender renders it correctly. I have no reason to believe, however, that TSNM capable engines would not render it correctly.

Yes, maybe you are right; if they are perpendicular to the camera, the other two directions should be tangent. It's interesting. The problem is that (as you have previously said) the generated image doesn't work if rendered in Blender, and this is the most important part, because you can use other applications to generate the map. But how do we render it?

You can use Blender to generate the map. But really, these are usually used for game engines anyway, and Blender's game engine… well… it has never had a professional title, we'll leave it at that. :stuck_out_tongue:

You cannot (at least I have not tested it yet, but in theory a tangent space normal map is just a bunch of object space maps mapped per face, and Blender can render those) have Blender use the TSNM and render the details contained within it correctly.
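For what it's worth, the relationship between the two spaces is just a change of basis. A small sketch (my own, assuming you already have an orthonormal tangent T, bitangent B and normal N for the face):

```python
def tangent_to_object(n_ts, t, b, n):
    # Express a tangent space normal in object space:
    # object_normal = x*T + y*B + z*N
    x, y, z = n_ts
    return (
        x * t[0] + y * b[0] + z * n[0],
        x * t[1] + y * b[1] + z * n[1],
        x * t[2] + y * b[2] + z * n[2],
    )
```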

Yes, but without good displacement mapping, normal maps could be useful even for the internal renderer, not only for the game engine.
The ideal would be to animate a relatively low resolution model (not for games; much more defined) and render it with a much better result (displacement or at least a normal map), to capture tiny details that are very heavy to model.

shhh, I’m working on a script to do that:
http://www.geocities.com/z3r0_d/files/autouv_z3r0_mesh4.py.txt

Z3r0_d, I've been using your amazing script and have a little thing to ask for.
You leave some space between face groups; would it be possible to leave some space at the borders too?

Thank you very much!

I am going to start a thread with a mini tutorial

but here are the results:

http://img189.imageshack.us/img189/2294/vsnmrealtime7mu.th.gif

http://img71.imageshack.us/img71/6610/vsnmtest5ou.gif

I am confident this will work with any model… so the answer is yes.

However, were I to rotate the model or anything and attempt to render an animation, it would not render correctly (I tested this as well).

But of course, you can still use Blender to render it in its original position, and get an idea of how your VSNM would look in an engine that supports it.