Can a polygon have one texture on one side and a different texture on the other side? If so, how exactly is this represented in the Face structures deep in the Blender data structures? I know that it would be in the FaceFlags, but beyond that, where are the uv coords stored? In addition, are the face normals stored anywhere, or are they always computed at run time? I ask because a Python script I'm writing would like to emit the face normals into the data file. Must I compute them manually given the vertex ordering of the face?
Only in the game engine (I wish I were wrong, though). Faces in the game engine are by default visible from one side only, so you can put a different texture on a second face pointing the other direction (opposite normal) without it interfering with your original. I don't know of a way single-sided faces can be done during rendering. People have suggested pressing the "No V.Normal Flip" button in the edit buttons, but that seems only to make things black (you may be able to keep another copy of your object with a Z offset and no v-normal flip, but I haven't tried).
Flags for faces are stored in the NMFace objects:
nmeshobject.faces[0].flags, that kind of thing
(where nmeshobject is an NMesh object and 0 is some face index).
I think there is also an nmeshobject.faces[0].mode and a .transp or something.
(The flag values are in Blender.NMesh.Const or thereabouts; they are or-ed together.)
The uv coordinates are stored in the faces too:
nmeshobject.faces[0].uv corresponds element-by-element to nmeshobject.faces[0].v.
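Put together, the access pattern looks something like the sketch below. It has to run inside Blender's own Python interpreter, and it assumes the old NMesh API; the exact attribute and constant names (.flag vs .flags, and whether the bit constants live in Blender.NMesh.Const) vary between Blender versions, so treat them as assumptions:

```python
# Sketch only: must run inside Blender's Python interpreter.
# Names are from the old NMesh API and may differ by version
# (e.g. .flag vs .flags, where exactly the constants live).
import Blender
from Blender import NMesh

mesh = NMesh.GetRaw("Plane")       # mesh data block, fetched by name
face = mesh.faces[0]               # some face index

print face.flag                    # or-ed flag bits
print face.mode, face.transp      # draw-mode and transparency bits

# the uv list parallels the vertex list: face.uv[i] pairs with face.v[i]
for i in range(len(face.v)):
    print face.v[i].co, face.uv[i]   # vertex position and its uv coords
    print face.v[i].no               # per-vertex normal, stored on the vertex
```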
Vertex normals are stored the same way, on the vertices themselves.
Face normals have to be calculated from the vertices. For a triangle it would be the normalized cross product of the vectors from v0 to v1 and from v0 to v2.
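That calculation works in plain Python with no Blender needed; here v0, v1, v2 stand for the coordinate triples of the face's first three vertices (what you would get from face.v[i].co), which is an assumption about how you feed it:

```python
import math

def sub(a, b):
    # component-wise a - b
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    # cross product a x b
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def face_normal(v0, v1, v2):
    # normal of the triangle (v0, v1, v2), counter-clockwise winding
    e1 = sub(v1, v0)            # vector from v0 to v1
    e2 = sub(v2, v0)            # vector from v0 to v2
    n = cross(e1, e2)
    length = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    if length == 0.0:
        return (0.0, 0.0, 0.0)  # degenerate face, no defined normal
    return (n[0] / length, n[1] / length, n[2] / length)
```

Note that the sign of the result depends on the winding order of the face's vertices: reversing the order flips the normal.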