I’m trying to find the normal of a face within a cube (or any mesh) when in game engine. I need this so I can calculate the angle between two faces and from what I understand, I get that from the DOT product of the normal vectors of each face, right?
How do I do that in Game Engine? Thanks for your help!
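Yes, the dot product is the right tool: for two normalized vectors, n1 · n2 = cos(angle). A plain-Python sketch of that calculation (in the BGE you would normally use mathutils Vectors instead of tuples):

```python
import math

def angle_between(n1, n2):
    """Angle in radians between two 3D vectors, via the dot product."""
    dot = sum(a * b for a, b in zip(n1, n2))
    len1 = math.sqrt(sum(a * a for a in n1))
    len2 = math.sqrt(sum(a * a for a in n2))
    # Clamp to guard against floating-point drift just outside [-1, 1]
    cos_angle = max(-1.0, min(1.0, dot / (len1 * len2)))
    return math.acos(cos_angle)

# Perpendicular normals -> 90 degrees
print(math.degrees(angle_between((1, 0, 0), (0, 1, 0))))  # 90.0
```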
The normal vector is “simply” the cross product of two edges of a face (but normalized).
It is assumed the face is flat. With this assumption the face normal equals the vertex normal of any vertex.
The order of the vertices gives the direction of the normal.
So you can do this:
A = 1->2
B = 2->3
Vertex normal VN = (A × B) / |A × B|
Face normal FN = VN
The above assumption is always true for triangles. Quads can be distorted (non-planar).
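The steps above can be sketched in plain Python (tuples stand in for mathutils Vectors; the winding order of the three vertices decides which way the normal points):

```python
import math

def cross(a, b):
    """Cross product of two 3D vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def face_normal(v1, v2, v3):
    # Edge vectors A = v1->v2 and B = v2->v3
    a = tuple(q - p for p, q in zip(v1, v2))
    b = tuple(q - p for p, q in zip(v2, v3))
    n = cross(a, b)
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

# Triangle in the XY plane, counter-clockwise -> normal points up (+Z)
print(face_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # (0.0, 0.0, 1.0)
```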
Hi Monster, is it also true that the face normal is the average of the vertex normals of that face? Wouldn’t this be true for quads and tris (even n-gons)?
You can use mathutils to calculate the normal of the face from the normals of the vertices that make up the face.
Thanks guys for your input and sorry for not being more specific…
I’m working on cognitive robotics and I’m trying to simulate a robotic arm reaching for an object in the environment. I need to “somehow” obtain the difference of angles between the hand of my robotic arm and the object I’m trying to grasp. So it seems that I need to compute the normal of the face of the object that I’m trying to grasp (which could be the main axis of the object but not necessarily), the normal of the face of the robotic hand… and finally calculate the angle difference between those two normals. Once I have that information I can start testing my algorithms on reaching & grasping for multi-link bodies.
So far I’ve seen that bge.types.KX_VertexProxy has a getNormal() function, but it seems it works only for vertices, not faces, and only in local coordinates… and since my target object will be moving and rotating all the time in the environment, I’d need a different way of obtaining the position/orientation of each of the vertices of a face in world coordinates so I can calculate the normals from the methods that you guys are suggesting.
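For the world-coordinates part: in the BGE you can transform a local vertex position with the object’s worldTransform (world = R · local + t); for a normal, only the rotation part (worldOrientation) applies, since normals are directions. A plain-Python sketch of that transform, with a hand-written rotation matrix standing in for worldOrientation:

```python
def local_to_world(position, orientation, local_point):
    """world = R * local + t, with R a 3x3 row-major rotation matrix
    (the layout the BGE uses for worldOrientation)."""
    return tuple(
        sum(orientation[i][j] * local_point[j] for j in range(3)) + position[i]
        for i in range(3)
    )

# 90-degree rotation about Z, object located at (10, 0, 0):
rot_z = ((0.0, -1.0, 0.0),
         (1.0,  0.0, 0.0),
         (0.0,  0.0, 1.0))
print(local_to_world((10, 0, 0), rot_z, (1, 0, 0)))  # (10.0, 1.0, 0.0)
```

For a normal you would pass position = (0, 0, 0), i.e. rotate only.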
Thanks for your help
If you just need the difference in normals between the robotic hand and the object contact places, you can add Empties to the robot’s fingertips and cast rays along the fingertip normals.
If you want the normal of a quad or a triangle, I believe you can use mathutils.geometry.normal()
Thanks for the tip. I never noticed these functions and it works perfectly.
```python
from mathutils import Vector, geometry


def get_poly_data(object):

    def get_points(mesh, polygon, mat_index):
        points = []
        for poly_vert_index in range(polygon.getNumVertex()):
            mesh_vert_index = polygon.getVertexIndex(poly_vert_index)
            vertex = mesh.getVertex(mat_index, mesh_vert_index)
            points.append(vertex.XYZ)
        return points

    def get_center(points):
        return Vector([sum(col) / len(col) for col in zip(*points)])

    def get_normal(points):
        return geometry.normal(*points)

    mesh = object.meshes[0]
    poly_data = []
    for poly_index in range(mesh.numPolygons):
        polygon = mesh.getPolygon(poly_index)
        mat_index = polygon.getMaterialIndex()
        points = get_points(mesh, polygon, mat_index)
        poly_position = get_center(points)
        poly_normal = get_normal(points)
        poly_data.append([polygon, poly_position, poly_normal])
    return poly_data


def print_normals_of_colliding_polygons(cont):
    collision = cont.sensors[0]  # the collision sensor on this controller
    if not collision.positive:
        return
    own = cont.owner
    hit_object = collision.hitObject
    for object in (own, hit_object):
        if 'poly_data' not in object:
            object['poly_data'] = get_poly_data(object)
        for polygon, _, poly_normal in object['poly_data']:
            if polygon.collide:
                print(object.name, poly_normal)
```
If you use a flat plane this is true, as all vertex normals (the normalized cross product of the adjacent edges) will be the same. The average is then that same value too. This is very easy and efficient to calculate.
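The averaging approach can be sketched in plain Python (in the BGE you would read the per-vertex normals from KX_VertexProxy and feed them in here):

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def average_normal(vertex_normals):
    """Normalized average of the per-vertex normals of one face."""
    summed = tuple(sum(col) for col in zip(*vertex_normals))
    return normalize(summed)

# On a flat face all vertex normals agree, so the average is the same vector:
flat_quad = [(0, 0, 1)] * 4
print(average_normal(flat_quad))  # (0.0, 0.0, 1.0)
```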
On a non-flat face (distorted n-gon) the normals differ depending on where they are on the surface. Calculating an average is one method to compute the normal at a point on the surface. For example, shading interpolates the normals between the vertices (smooth shading).
This does not sound like you need the face orientation. This sounds more like you need the relative alignment to each other.
The orientation of the object will be important if you need to grasp the object with a special “grip”, e.g. the robot hand has to touch exactly the left and right sides of the object, but should not touch anything else (e.g. on flat objects, or handles on the object).
I suggest using the object orientation as a starting point. In a later step you can try to implement an algorithm that defines such an orientation. At the beginning you can simply take the orientation provided by the BGE (worldOrientation).
The relative alignment can be calculated from the orientation of the robot hand and the locations of the objects.
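A minimal sketch of that alignment calculation in plain Python: take one axis of the hand’s rotation matrix as its “forward” direction and measure the angle to the target. Which local axis counts as forward (+Y here) is an assumption; in the BGE the matrix would come from worldOrientation and the positions from worldPosition.

```python
import math

def alignment_angle(hand_orientation, hand_position, target_position):
    """Angle (radians) between the hand's local +Y axis (a column of its
    3x3 row-major rotation matrix) and the direction to the target."""
    forward = tuple(hand_orientation[i][1] for i in range(3))  # assumed axis
    to_target = tuple(t - h for h, t in zip(hand_position, target_position))
    dot = sum(a * b for a, b in zip(forward, to_target))
    dist = math.sqrt(sum(c * c for c in to_target))
    cos_angle = max(-1.0, min(1.0, dot / dist))  # forward is unit length
    return math.acos(cos_angle)

# Identity orientation: the hand's +Y axis is world +Y, target straight ahead.
identity = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
print(math.degrees(alignment_angle(identity, (0, 0, 0), (0, 5, 0))))  # 0.0
```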
I hope this helps somehow.
Thanks everybody for the discussion and suggestions.
I’m not sure how the rays work; if they detect objects in only one specific direction then it’s of no use, since I need to detect the angle between faces of two different objects (that do not collide) at all times. But if a ray is like a line connecting both faces at all times and in all directions, then this could be a pretty useful solution.
You are once more right Monster, for now and to start with something, the easiest solution is to use the BGE worldOrientation of both the hand and the object. I’ll do that, it’s very straightforward but in the end I’d like to work with positioning the gripper at specific places of a selected object.
Still, it’s interesting that there is no face normal directly accessible from the API… I’m probably wrong, but I think they are useful, and since there is already access to vertex normals it shouldn’t be difficult to provide access to face normals, right?
In any case I feel this thread is solved. Thanks for all the help guys
Be aware that vertex normals are not the real geometric normals. They are the normals used for shading, which means they can be an average.
You can use this to manipulate the shading of a mesh. But this is used really rarely as far as I know.