I’m trying to use Blender to bake/render a skybox image and UVMap for use in an external game. I construct a scene, place a cube in the middle, give it an “Environment Map” texture, render the scene, and then save the environment map. This all works fine, similar to the tutorial here: https://www.youtube.com/watch?v=QLPYOn7RS8Y
However, when this is done, I don’t have a mesh and UVMap to load into my engine that match Blender’s output.
The demos I can find suggest just “cutting apart” the environment map generated by Blender, but I’m finding it hard to figure out the right UV coordinates. Apparently Blender changed the cube environment map’s UV layout around version 2.4x. Allegedly, in 2.6/2.7 it is supposed to be:
| -X | -Y | +X |
| -Z | +Z | +Y |
However, this doesn’t explain the orientation of each patch. To avoid a lot of trial and error, I’d like to just EXPORT the UVMap Blender is using for the cube environment map. For example, I could export the cube as a Wavefront OBJ with a UVMap plus the environment-map texture. However, I don’t see how to do this. Is there a way to tell Blender to save the UVMap it uses for environment-map generation?
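For what it’s worth, here is a minimal sketch of the per-face UV rectangles I *think* that 3×2 layout implies, assuming the UV origin is at the bottom left (the usual Blender/OpenGL convention) and assuming the face order in the table above is correct. The `FACE_GRID` mapping and `face_uv_rect` helper are my own names, not anything from Blender’s API, and this deliberately does not capture the per-patch orientation, which is exactly what I’m unsure about:

```python
# Hypothetical per-face UV rectangles for a 3x2 environment-map layout.
# Assumes UV origin at bottom-left; row 0 is the BOTTOM row of the image.
# Patch orientation (rotation/flip inside each tile) is NOT modelled here.

FACE_GRID = {
    "-X": (0, 1), "-Y": (1, 1), "+X": (2, 1),   # top row
    "-Z": (0, 0), "+Z": (1, 0), "+Y": (2, 0),   # bottom row
}

def face_uv_rect(face):
    """Return (u_min, v_min, u_max, v_max) for one face's tile."""
    col, row = FACE_GRID[face]
    w, h = 1.0 / 3.0, 1.0 / 2.0                 # tile size in UV space
    return (col * w, row * h, (col + 1) * w, (row + 1) * h)

for name in FACE_GRID:
    print(name, face_uv_rect(name))
```

Even if those rectangles are right, I still wouldn’t know which way each face is rotated or flipped inside its tile, which is why I’d rather export the real UVMap from Blender.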
I’m also open to an entirely different process that avoids the “Environment Map” texture feature altogether. For example, I tried supplying a UVMap for the cube and using Render → Bake instead, but the results were not at all what I expected.