As an optimization technique, Blender only renders the faces of a model that are facing the camera. For example, when you look at a cube, you can only ever see some of its sides; the sides facing away from the camera (which are concealed by other faces anyway) are not rendered. Blender uses the face normals to decide whether the camera should see a face. One way to make backfaces visible is to duplicate all the faces and invert the duplicates' normals, so that whichever side you look at a face from, there is always a face whose normal points toward you. The other way is to disable Backface Culling in the material options (I think this only works for GLSL shading, though I could be wrong). That one is prone to funny lighting calculations though, but I think that was fixed recently :D.
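If you want to try the duplicate-and-flip trick without doing it by hand, here's a minimal sketch using Blender's bmesh API (run it in Object Mode with the mesh selected; the variable names are just mine, not anything official):

```python
import bpy
import bmesh

# Duplicate every face of the active mesh and flip the duplicates'
# normals, so each face stays visible from both sides even with
# backface culling on.
obj = bpy.context.active_object
bm = bmesh.new()
bm.from_mesh(obj.data)

# bmesh.ops.duplicate returns the new geometry in result["geom"].
dup = bmesh.ops.duplicate(bm, geom=bm.faces[:])
new_faces = [g for g in dup["geom"] if isinstance(g, bmesh.types.BMFace)]

# Reverse the duplicates so their normals point the opposite way.
bmesh.ops.reverse_faces(bm, faces=new_faces)

bm.to_mesh(obj.data)
bm.free()
obj.data.update()
```

Since the flipped copy sits exactly on top of the original and culling hides whichever one faces away, you shouldn't get z-fighting between the two.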
Anyway, yes, I do think that the camera scaling matters. You can see it for yourself, actually. Load up the default Blender cube scene and change the transform orientation to Local (normally it's Global; the dropdown is near the Layers), then press S, press X twice, type -1, and confirm. You'll see that the cube's lighting has gone a bit funny. If you then enable Backface Culling in the viewport (in the Shading section of the panel that comes up when you press N), the cube will go dark, because you are looking through the faces to the inside of the cube due to a strange inverted Z-depth bug. This only happens with negative scaling though. I'll post a blend file in a sec actually, brb
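For anyone who'd rather script the demo than do the keypresses, here's a rough equivalent in bpy. Note the `show_backface_culling` property name is from the 2.7x-era API (where the BGE still lives); it may sit elsewhere in other versions:

```python
import bpy

# Reproduce the demo: give the default cube a negative local X scale,
# which mirrors the mesh and effectively inverts its normals.
cube = bpy.data.objects["Cube"]
cube.scale.x = -1.0

# Enable backface culling in every 3D View, same as the
# N-panel > Shading checkbox. The cube should go dark.
for area in bpy.context.screen.areas:
    if area.type == 'VIEW_3D':
        area.spaces.active.show_backface_culling = True
```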
BTW:
If anything in here is wrong, please correct me, BGE masters