I am using the built-in Texture Baker. I have a cube with a single material on all faces. The material has a single (procedural white and grey cloud) texture. If I set the texture’s input mapping to Glob, Object, or Orco, the baker seems to work beautifully. If I set it to Nor, I get a flat grey image that looks like there is no texture at all.
I also tried adding a second texture before the first that affects the normals (with Warp on and a factor of 1.0), and I still get just a flat grey image. Normal camera renders look great (with or without the material set to Shadeless)! Is there any way to get the baker to actually USE vector normals? I’d really like to get this to work in order to shade a good silvery-looking object for my game (and BRayBaker just doesn’t work at ALL!).
Even if it did work as you expected, the in-game texture would be much better if you did that mapping yourself… as in find/write a GLSL shader to look up your reflection texture based on the surface normal.
The way the texture bake script works is to create a mesh with two vertex keys: the first is your mesh as you know it; the second is your mesh unfolded to match the UV coordinates. The texture bake script makes NO changes to the surface normals, so the unfolded version is entirely flat and the normals indicate this.
However, Glob or Orco textures will use their mapping from the original pose of the mesh and be unfolded with the rest of it. I don’t think Object mapping should work that way, but it’s something I’ve never actually tried.
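To make that concrete, here’s a rough sketch of the unfold idea in Python [not the script’s actual code; `verts` and `uvs` are just stand-ins for the mesh data]:

    # Sketch of the unfold step (not the actual bake script's code).
    # verts: list of (x, y, z) vertex positions; uvs: matching (u, v) coords.
    def unfold_to_uv(verts, uvs):
        orco = list(verts)                      # remember the original coords
        flat = [(u, v, 0.0) for (u, v) in uvs]  # unfolded copy: z = 0 everywhere
        # Glob/Orco lookups can still sample at the ORIGINAL coords in `orco`,
        # but every face of `flat` lies in the z = 0 plane, so every recomputed
        # normal is (0, 0, 1) -- which is why Nor bakes come out flat grey.
        return flat, orco

At least, that’s the idea as I understand it: the Glob/Orco samples survive because they’re looked up from the original coordinates, while the normals don’t, because they’re recomputed from the flat geometry.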
BRayBaker doesn’t unfold the mesh; instead it renders each face with a camera positioned perpendicular to the face, then merges all the pieces into one image. Any texture depending on normals will not match at the seams. BRayBaker is much better suited to diffuse textures [which are nearly always mapped using Orco or global coords].
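Per face, the idea is roughly this [again just my sketch of it in plain Python, not BRayBaker’s actual code]:

    # Sketch of BRayBaker-style camera placement (my reading of it, not its code).
    def face_camera(face_verts, distance=1.0):
        n = len(face_verts)
        # face centre
        cx = sum(v[0] for v in face_verts) / n
        cy = sum(v[1] for v in face_verts) / n
        cz = sum(v[2] for v in face_verts) / n
        # face normal from the first three vertices (cross product of two edges)
        ax, ay, az = (face_verts[1][i] - face_verts[0][i] for i in range(3))
        bx, by, bz = (face_verts[2][i] - face_verts[0][i] for i in range(3))
        nx, ny, nz = ay * bz - az * by, az * bx - ax * bz, ax * by - ay * bx
        mag = (nx * nx + ny * ny + nz * nz) ** 0.5
        nx, ny, nz = nx / mag, ny / mag, nz / mag
        # the camera sits `distance` out along the normal, aimed back at the
        # centre; a Nor-mapped texture sees a DIFFERENT normal from each such
        # camera, so neighbouring tiles can never agree along shared edges.
        return (cx + nx * distance, cy + ny * distance, cz + nz * distance)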
Oh, I know. Thing is, this is actually for a skin in SecondLife, where you cannot write any kind of rendering code. What I am doing is creating a hack: the object will be textured based on a static lighting setup I am creating in Blender. The BRayBaker would be fine for this, but I cannot get it to work in either version 3 or 3.4 (I think it screws up the order of the face vertices or something; I’ve turned off ray reflections and all specular highlights for my object and it still creates a jumbled hodge-podge of unmatched polys, and believe me, I have tried this with every stinking combination of settings imaginable). So I’ve decided to fake it by using the vertex normals to decide which directions “have lights” and which do not. Since I am just going for a silvery effect in a world with lots of lights and other objects, a noise function based on surface normals works fine for me. But I DO want the “lighting” to depend on the normal direction and not the face’s position; the latter doesn’t look very reflective at all.
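To give you an idea of the fake, it’s roughly this [a sketch only; `noise3` here is a toy stand-in for a real Perlin implementation]:

    import math

    def noise3(x, y, z):
        # toy hash-based value noise -- stand-in for a real Perlin implementation
        s = math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453
        return s - math.floor(s)            # pseudo-random value in [0, 1)

    def fake_silver(normal, scale=3.0):
        nx, ny, nz = normal                 # assumed unit length
        # sample the noise at the NORMAL direction, not the vertex position:
        # two vertices facing the same way get the same shade no matter where
        # they sit on the mesh, which is what makes it read as "reflective"
        return noise3(nx * scale, ny * scale, nz * scale)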
> The way the texture bake script works is to create a mesh with two vertex keys: the first is your mesh as you know it; the second is your mesh unfolded to match the UV coordinates. The texture bake script makes NO changes to the surface normals, so the unfolded version is entirely flat and the normals indicate this.
> However, Glob or Orco textures will use their mapping from the original pose of the mesh and be unfolded with the rest of it. I don’t think Object mapping should work that way, but it’s something I’ve never actually tried.
Right. I actually already have the full UV mapping. I was hoping that since the Texture Baker looks up the global and object coordinates from the original mesh, it would do the same for the vertex normals. I guess this isn’t the case though. Bummer!
> BRayBaker doesn’t unfold the mesh; instead it renders each face with a camera positioned perpendicular to the face, then merges all the pieces into one image. Any texture depending on normals will not match at the seams. BRayBaker is much better suited to diffuse textures [which are nearly always mapped using Orco or global coords].
Yeah. Most, but not my particular application.
I am actually considering writing my own ray baker. It would use a set of prioritized camera positions instead of repositioning for every single face, and would probably fall back to diffuse-only lighting when none of the camera positions has a direct, traceable line of sight to a face. Unfortunately I know nothing about Python or the Blender API, so it would probably be a stand-alone tool (and I will have to snap in my own procedural textures, though I think I have some Improved Perlin Noise code I wrote sitting around somewhere…). I’m just trying to be lazy and find something that already works. I’m a little appalled at how difficult that is proving to be.
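The per-face logic I have in mind is roughly this [just a sketch; `visible_from`, `reflect_shade`, and `diffuse_shade` are hypothetical helpers I’d still have to write]:

    # Sketch of the per-face logic. `visible_from` would be a ray test
    # (face centre -> camera, checking for blockers); the shade functions
    # are placeholders for real shading code.
    def shade_face(face, cameras, visible_from, reflect_shade, diffuse_shade):
        for cam in cameras:                      # highest priority first
            if visible_from(face, cam):
                return reflect_shade(face, cam)  # full shading from this view
        return diffuse_shade(face)               # nothing sees it: diffuse only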
Here’s some more info and some pictures (re-posted from a non-public forum):
I either need something that will render reflections from a single viewpoint (or from multiple prioritized viewpoints), or I need to be able to fake it by using the vector normals as the texture coordinates (which SHOULDN’T depend on camera position as far as I know; certainly two normal renders seem to show the same colors at the same mesh positions no matter where I put the camera). It seems that neither BRayBaker nor the built-in Texture Baker can do this.
I will show you. Here is a normal Blender render before baking (just the head is easier for me to demo ATM):
[Image: pre-bake render]
Note that this is NOT done using actual reflections or specular highlights: both are disabled/set to zero in the material! It is PURELY a procedural texture based on vertex normal coordinates. If it is done based on object/global positional coordinates instead, it doesn’t look nearly so “reflective” (this one looks even better with decent OSA, but I’m just demoing the setup and the problem here). Here is another pre-bake render to demonstrate that the highlights are on the same part of the mesh (at least it looks that way to me!) despite a significant difference in camera position (thus BRayBaker should THEORETICALLY work):
[Image: second pre-bake render from a different camera angle]
Here is the texture that results from a run of BRayBaker (3 and 3.4 both give the same result, and yes I am using Blender 2.42):
[Image: texture produced by BRayBaker]
Crazy, huh? I can reduce/eliminate the blue background issues with OSA (and touch up the rest later), but the issue of the faces never matching up never goes away, no matter how I fiddle with the material, the lighting, the render settings, the size of the tiles, etc. Here is a Blender render after the bake:
[Image: post-bake render]
I would post the bake from the built-in Texture Baker too, but TRUST ME it is totally uninteresting: literally the flat grey color of the material without textures.
Nor coordinates are based on the normals relative to the screen.
When unfolding the mesh to be flat, you’ll get flat normals [the included bake script],

or

when moving the camera for each face of the mesh, the normals will not match between faces, resulting in horrible seams [ray bake].
In other words, there’s NO WAY to bake textures mapped using Nor coordinates. The best you can hope for is a way to bake the normals into some other coordinates [like UV coordinates], and then bake the result of that.
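Something like this, roughly [a sketch only; `image` is a width x height grid of grey values, and `tex` is whatever normal-driven texture function you’re using]:

    # Sketch: sample the normal-driven texture with the ORIGINAL mesh normals,
    # then paint the result into the image at each vertex's UV position.
    def bake_by_normal(normals, uvs, tex, image, width, height):
        for normal, (u, v) in zip(normals, uvs):
            px = min(int(u * width), width - 1)    # UV in [0,1] -> pixel coords
            py = min(int(v * height), height - 1)
            image[py][px] = tex(normal)            # colour depends on normal only
        # (a real baker would rasterise whole faces, interpolating the normal
        # across each triangle, instead of splatting one sample per vertex)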
Ack! What a strange way to interpret normals. Okay, well thanks for the info. I guess I’ll be writing a reasonable ray-tracer then.
Or I suppose I could write an outside utility to do the objective normal mapping as you say. That way I can still use Blender’s procedural textures. Hmm…