Converting UV coordinates to a game engine's texture mapping schema

I’ve written an exporter for Blender that can export Blender meshes to Natural Selection 2’s level format.

Right now, the functionality is limited to geometry only. Textures are reset to default dev textures. I’d like to change that. :slight_smile:

Here’s the problem: the way Spark (NS2’s game engine) handles textures is different from the UV coordinates that Blender uses.

Here’s some quick background:

  • Natural Selection 2 (an amazing game!!!) is a hybrid RTS/FPS. Making custom maps is an extremely time-consuming process because 1) the tools are primitive, and 2) the game itself is extremely complicated, and balancing the layout of a level to be fair to both teams is EXTREMELY difficult. The number of revisions required to make a “good” level is ridiculously high.
  • Spark levels consist of geometry, prefabs (props), and entities (player models, etc.). Props are created in 3D apps (like Blender! But usually 3ds Max) and use UV coordinates like normal. They cannot be edited in the editor. All you can do is position, rotate, and scale them. Props are sort of the icing on the cake. The majority of the level is built with geometry, which IS edited in the map editor.
  • As mentioned above, the map editor is a little bit primitive (small, indie developer, first title, can’t blame them :slight_smile: ) and I’m hoping to leverage the power of Blender to help accelerate the normally tedious process of NS2 level creation.
  • I have already created an exporter, but it lacks texturing functionality – one of the most time-consuming aspects of NS2 mapping… the built-in texture tools are extremely spartan.

Converting Blender’s texture coordinates to Spark’s is proving difficult, and I’m hoping somebody here has some ideas.

See, Blender of course stores its texture coordinates as UVs, where each vertex of each face has a 2D coordinate that corresponds to where it should lie on the texture. Easy, intuitive!

Spark, on the other hand, being a game engine, is NOT like this, of course. Where Blender specifies materials per object, Spark defines materials per FACE.
Each face in Spark has:
- a material ID (an integer corresponding to a string that stores the material’s name)
- an angle (a float value – Spark textures are always mapped with respect to the face’s normal)
- an x and y offset (float values)
- x and y scales (float values)

Basically, Spark projects the texture onto the face, using the normal angle of the face, with the texture rotated by the angle, offset by the x and y offsets, and scaled by the x and y scales. The workflow I’m trying to create is that you would UV map as usual in Blender, and when exporting, the script would try to come up with the Spark texture coordinates that come as close as possible to what it looked like in Blender.
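To compare the two representations at all, the first step is probably to flatten each face into the 2D plane the texture gets projected onto. Here's a minimal sketch of that: the exact tangent basis Spark picks per normal is an assumption on my part (any consistent choice works for fitting, though the chosen basis will be absorbed into the angle you solve for later), and the function names are just mine:

```python
import math

def face_basis(normal):
    # Build an orthonormal tangent basis (u_axis, v_axis) perpendicular to the
    # face normal. NOTE: whatever basis Spark actually uses internally is an
    # assumption here; any consistent choice works for comparing UV layouts.
    nx, ny, nz = normal
    # Pick the world axis least aligned with the normal to avoid degeneracy.
    if abs(nx) < abs(ny) and abs(nx) < abs(nz):
        helper = (1.0, 0.0, 0.0)
    elif abs(ny) < abs(nz):
        helper = (0.0, 1.0, 0.0)
    else:
        helper = (0.0, 0.0, 1.0)
    # u_axis = normalize(helper x normal)
    ux = helper[1] * nz - helper[2] * ny
    uy = helper[2] * nx - helper[0] * nz
    uz = helper[0] * ny - helper[1] * nx
    ulen = math.sqrt(ux * ux + uy * uy + uz * uz)
    ux, uy, uz = ux / ulen, uy / ulen, uz / ulen
    # v_axis = normal x u_axis
    vx = ny * uz - nz * uy
    vy = nz * ux - nx * uz
    vz = nx * uy - ny * ux
    return (ux, uy, uz), (vx, vy, vz)

def project_to_plane(verts, normal):
    # Project world-space vertices onto the face's plane, giving 2D points
    # that play the role of Spark's pre-transform texture coordinates.
    u_axis, v_axis = face_basis(normal)
    return [(sum(a * b for a, b in zip(v, u_axis)),
             sum(a * b for a, b in zip(v, v_axis))) for v in verts]
```

Once every face is a set of 2D points, matching it against the Blender UVs becomes a purely 2D fitting problem.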

So here’s the problem I’ve been wrestling with: how do I translate Blender’s UV coordinates to this much more limited format? Let me put it another way: if I were moving from Spark to Blender, instead of the other way around, the imported UVs would ALWAYS be shaped just like their 3D counterparts (e.g. a square face would always have a square UV footprint, as opposed to, say, a trapezoid). Now… what if I have UV coordinates for a square face that look like a trapezoid? How do I convert THAT to Spark?

The biggest problem I’m having is figuring out how to approximate the “angle” value. Once I can figure that out, I think everything else won’t be as big a problem.
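One way to approximate the angle is a 2D least-squares rotation fit (the 2D case of the orthogonal Procrustes problem): given the face's in-plane 2D points (however you projected them) and the corresponding Blender UVs, the best-fit rotation angle falls out of two sums. A sketch, assuming the point lists are in corresponding order; it is exact when the UV layout really is a rotated/offset/uniformly-scaled copy of the face, and a best-effort approximation otherwise (e.g. a trapezoid UV island on a square face):

```python
import math

def estimate_angle(plane_pts, uvs):
    # Least-squares rotation angle (in radians) that best maps the face's
    # in-plane 2D coordinates onto its Blender UVs, with centroids removed
    # so the texture offset doesn't bias the fit.
    n = len(plane_pts)
    cx = sum(p[0] for p in plane_pts) / n
    cy = sum(p[1] for p in plane_pts) / n
    cu = sum(t[0] for t in uvs) / n
    cv = sum(t[1] for t in uvs) / n
    dot = cross = 0.0
    for (x, y), (u, v) in zip(plane_pts, uvs):
        x, y, u, v = x - cx, y - cy, u - cu, v - cv
        dot += x * u + y * v    # cosine component
        cross += x * v - y * u  # sine component
    return math.atan2(cross, dot)
```

Whether Spark wants that in degrees or radians (and with which sign convention) I don't know, so a conversion/negation step may be needed at the end.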

Materials are per-face in Blender too; UVs are stored separately in loops (“face-vertex data”).

Is the rotation of textures relative to the object, or the world? And is it in degrees, radians, …?

The texture is mapped with respect to the world’s origin, the “angle” of the texture merely rotates the texture around the face’s normal vector.
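Given that, once a best-fit angle is in hand, the x/y scales and offsets can be recovered per axis by another least-squares pass over the rotated in-plane points. A sketch, assuming the transform order is rotate, then scale, then offset (that ordering is my assumption about Spark's convention, not something I've verified):

```python
import math

def fit_scale_offset(plane_pts, uvs, angle):
    # Given a fitted rotation angle, recover per-axis scales and offsets by
    # least-squares on each axis independently. Exact when the UV mapping
    # really is rotate -> scale -> offset; approximate otherwise.
    c, s = math.cos(angle), math.sin(angle)
    rotated = [(c * x - s * y, s * x + c * y) for x, y in plane_pts]
    n = len(plane_pts)
    scales, offsets = [], []
    for axis in (0, 1):
        xs = [p[axis] for p in rotated]
        us = [t[axis] for t in uvs]
        mx = sum(xs) / n
        mu = sum(us) / n
        var = sum((x - mx) ** 2 for x in xs)
        cov = sum((x - mx) * (u - mu) for x, u in zip(xs, us))
        scale = cov / var if var else 1.0  # degenerate face: fall back to 1
        scales.append(scale)
        offsets.append(mu - scale * mx)
    return scales, offsets
```

If Spark's world-origin-relative mapping applies the offset before the scale instead, the recovered offsets just need to be divided by the corresponding scales.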