Do I have to change or multiply something with the localCoords vector, or something like that?
I played around a bit and got different results, but I'd prefer to do this the correct way.
So I hope someone can help me.
A quick walkthrough, though I'm hazy on some of the details :D
You should have a normal map and read the RGB values from it: r = x, g = y, b = z directions.
Convert the normal map to tangent space (wasn't it some cross product or dot product of the surface normal and the normal map, or something similar?).
Calculate lighting using the tangent-space-converted normal map instead of the surface normal…
And then you can start looking into how to apply the normal map to reflections…
Oh, and remember that the normal map color (0.5, 0.5, 1.0) must equal the surface normal direction.
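The steps above can be sketched on the CPU side. This is a minimal Python sketch (the function names are mine), using the same decoding convention as the shader later in this thread, where x and y are remapped from [0, 1] to [-1, 1] and z is taken as-is:

```python
import math

def decode_normal(r, g, b):
    # Decode an RGB normal-map texel into a tangent-space vector.
    # x and y are remapped from [0, 1] to [-1, 1]; z is taken as-is,
    # so the color (0.5, 0.5, 1.0) decodes to the flat normal (0, 0, 1).
    x, y, z = 2.0 * r - 1.0, 2.0 * g - 1.0, b
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)

def lambert(normal, light_dir):
    # Diffuse term: clamped dot product of the unit normal and light direction.
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

n = decode_normal(0.5, 0.5, 1.0)
print(n)                            # (0.0, 0.0, 1.0)
print(lambert(n, (0.0, 0.0, 1.0)))  # light head-on -> 1.0
```

In the real shader the decoded vector is additionally multiplied by the TBN matrix to bring it from tangent space into view space before lighting.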
…I use a GLSL shader (via Python) and the code for the normal mapping is already there (link above), but now I'm looking for a way to add a variable that sets the intensity of the normal map…
I know that node already from the parallax shader and I use it in my PBR. However, I'm wondering about one thing: isn't there a more efficient way to get the binormal and tangent vectors than having separate materials for each of them?
Yes, I'm having trouble understanding what you mean.
I use the same code as in the link above, but my shader contains some other things, so I'll simply copy the code from the link (with a small fix, because spotCosCutoff doesn't work anymore) for a better overview:
import bge
cont = bge.logic.getCurrentController()
VertexShader = """
attribute vec4 tangent;
varying mat3 localSurface2View; // mapping from
// local surface coordinates to view coordinates
varying vec4 texCoords; // texture coordinates
varying vec4 position; // position in view coordinates
void main()
{
    // the signs and whether tangent is in localSurface2View[1]
    // or localSurface2View[0] depends on the tangent
    // attribute, texture coordinates, and the encoding
    // of the normal map
    // gl_NormalMatrix is precalculated inverse transpose of
    // the gl_ModelViewMatrix; using this preserves data
    // during non-uniform scaling of the mesh
    // localSurface2View[1] is multiplied by the cross sign of
    // the tangent, in tangent.w; this allows mirrored UVs
    // (tangent.w is 1 when normal, -1 when mirrored)
    localSurface2View[0] = normalize(gl_NormalMatrix * tangent.xyz);
    localSurface2View[2] = normalize(gl_NormalMatrix * gl_Normal);
    localSurface2View[1] = normalize(
        cross(localSurface2View[2], localSurface2View[0]) * tangent.w);

    texCoords = gl_MultiTexCoord0;
    position = gl_ModelViewMatrix * gl_Vertex;
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
"""
FragmentShader = """
varying mat3 localSurface2View; // mapping from
// local surface coordinates to view coordinates
varying vec4 texCoords; // texture coordinates
varying vec4 position; // position in view coordinates
uniform sampler2D normalMap;
void main()
{
    // in principle we have to normalize the columns of
    // "localSurface2View" again; however, the potential
    // problems are small since we use this matrix only
    // to compute "normalDirection", which we normalize anyway
    vec4 encodedNormal = texture2D(normalMap, vec2(texCoords));
    vec3 localCoords = normalize(vec3(2.0, 2.0, 1.0) * vec3(encodedNormal)
        - vec3(1.0, 1.0, 0.0)); // constants depend on encoding
    vec3 normalDirection = normalize(localSurface2View * localCoords);

    // Compute per-pixel Phong lighting with normalDirection
    vec3 viewDirection = -normalize(vec3(position));
    vec3 lightDirection;
    float attenuation;

    if (0.0 == gl_LightSource[0].position.w) // directional light?
    {
        attenuation = 1.0; // no attenuation
        lightDirection = normalize(vec3(gl_LightSource[0].position));
    }
    else // point light or spotlight (or other kind of light)
    {
        vec3 positionToLightSource =
            vec3(gl_LightSource[0].position - position);
        float distance = length(positionToLightSource);
        //attenuation = 1.0 / distance; // linear attenuation
        attenuation = 1.0 / (1.0 + 0.1 * distance
            + 0.01 * pow(distance, 2.0));
        lightDirection = normalize(positionToLightSource);

        if (gl_LightSource[0].spotCutoff <= 90.0) // spotlight?
        {
            float clampedCosine = max(0.0, dot(-lightDirection,
                gl_LightSource[0].spotDirection));
            // compare against cos(spotCutoff) directly, since
            // spotCosCutoff doesn't work anymore
            if (clampedCosine < cos(radians(gl_LightSource[0].spotCutoff)))
            // outside of spotlight cone?
            {
                attenuation = 0.0;
            }
            else
            {
                attenuation = attenuation * pow(clampedCosine,
                    gl_LightSource[0].spotExponent);
            }
        }
    }

    vec3 ambientLighting = vec3(gl_LightModel.ambient)
        * vec3(gl_FrontMaterial.emission);
    vec3 diffuseReflection = attenuation
        * vec3(gl_LightSource[0].diffuse)
        * vec3(gl_FrontMaterial.emission)
        * max(0.0, dot(normalDirection, lightDirection));

    vec3 specularReflection;
    if (dot(normalDirection, lightDirection) < 0.0)
    // light source on the wrong side?
    {
        specularReflection = vec3(0.0, 0.0, 0.0); // no specular reflection
    }
    else // light source on the right side
    {
        specularReflection = attenuation
            * vec3(gl_LightSource[0].specular)
            * vec3(gl_FrontMaterial.specular)
            * pow(max(0.0, dot(reflect(-lightDirection, normalDirection),
                viewDirection)), gl_FrontMaterial.shininess);
    }

    gl_FragColor = vec4(ambientLighting + diffuseReflection
        + specularReflection, 1.0);
}
"""
mesh = cont.owner.meshes[0]
for mat in mesh.materials:
    shader = mat.getShader()
    if shader is not None:
        if not shader.isValid():
            shader.setSource(VertexShader, FragmentShader, 1)
        shader.setAttrib(bge.logic.SHD_TANGENT)
        shader.setSampler('normalMap', 0)
Hi, I can't give you a scientific answer, but it produces nice effects when you multiply either localSurface2View[0] or localSurface2View[1] by a factor… http://www.pasteall.org/blend/39185
Now my question is how to make an intensity variable, like in the BGE material settings, to set the normal map intensity.
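For reference, one possible approach (an assumption on my part, not something confirmed in the linked tutorial): declare a `uniform float normalIntensity;` in the fragment shader, scale the decoded tangent-space x/y by it before normalizing, and set it from Python with `shader.setUniform1f('normalIntensity', value)`. The math that uniform would perform, in plain Python:

```python
import math

def apply_intensity(n, intensity):
    # Scale the tangent-space x/y components by the intensity factor,
    # then renormalize. intensity = 0.0 collapses to the flat normal
    # (0, 0, 1), i.e. the unperturbed surface; 1.0 leaves the map unchanged.
    x, y, z = n
    x *= intensity
    y *= intensity
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)

print(apply_intensity((0.6, 0.0, 0.8), 0.0))  # -> (0.0, 0.0, 1.0)
```

Values above 1.0 exaggerate the bumps the same way; whether that is physically sensible is another question.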
You do not.
(Well, you can always do whatever you want, but…)
The VALUES of the RGB colors are relative direction vectors.
If you change them, you change the Cartesian directions.
You ALSO change them if you reduce or enlarge a normal map:
shrink one from 1024x1024 to 512x512 and the heights WILL double;
the same goes for enlarging a 512x512 to 1024x1024, where the heights will halve.
If you increase or decrease the RED value, you CHANGE the X values.
If you increase or decrease the GREEN value, you CHANGE the Y values.
So if you have a test-hill DEM converted to a tangent normal map,
you can change the height by pushing values PAST the 255 tone,
but then it will NOT be an 8-bit-per-channel image anymore,
or you will end up with a plateau.
And if you change the gamma values, you will just change the middle height values.
Normal maps are almost always the END PRODUCT,
the very last thing.
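To make the point above concrete, a throwaway sketch (same decoding convention as the shader earlier in the thread):

```python
def decode(r, g, b):
    # x and y come from the red and green channels, remapped to [-1, 1];
    # z is the blue channel as-is
    return (2.0 * r - 1.0, 2.0 * g - 1.0, b)

flat = decode(0.5, 0.5, 1.0)    # (0.0, 0.0, 1.0): points straight up
tilted = decode(0.7, 0.5, 1.0)  # x > 0: raising red tilts the direction toward +X
```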
Okay, thanks for your explanation, it makes sense. But if normal maps are the end product, why does Blender have an intensity value in the texture settings? How is that different from what I want? Or did I misunderstand something?
Because, as a last step, Blender linearly interpolates the final normal map with the original vertex normal by an “intensity” factor, using a version of the “mix” function that does the same thing.
Which, let's say, ‘works’, but not really. Rather, the way to go should be to take the normal value after converting to tangent space and the N element of the TBN matrix, and put those two through a mix by intensity.
So once again, kids: take notes from the Blender source. It's good for you.
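The blend described above would look something like this, as a Python sketch of GLSL's `mix` followed by a renormalize (not Blender's actual source code):

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def mix(a, b, t):
    # GLSL-style componentwise linear interpolation
    return tuple((1.0 - t) * x + t * y for x, y in zip(a, b))

surface_normal = (0.0, 0.0, 1.0)            # the N column of the TBN matrix
mapped_normal = normalize((0.3, 0.0, 0.9))  # map normal after the tangent-space transform
blended = normalize(mix(surface_normal, mapped_normal, 1.0))
# intensity 1.0 reproduces the mapped normal; 0.0 falls back to N
```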