Real quick question here, actually two questions… I’ve done a search for each of these, but no dice.
First, has anyone here ever modeled in Blender and, complete with texturing and export, imported that model into Unreal Tournament for use as a static mesh and/or can anyone point to a good tutorial, or series of tutorials on how to do this? I’ve yet to find anything in my searching…
Second, are there any good tutorials on blending textures together on a single mesh via vertex blending? I’ve seen the tutorials on doing so with blending/alpha textures, but that’s not the approach I need to go with. Just cannot seem to find a good tutorial that explains the steps behind it…
I used to be able to do both, actually, but I long forgot anything I knew. It’s possible, though. Once I made some kind of… giant, underground cavern (static mesh) with tree roots coming out of the walls and stuff (static mesh) and monoliths with glowing runes carved on them (static mesh), and the floor was covered in dead grass and rocks (more meshes). It was pretty funky.
EDIT: The vertex colours blended the walls of the cavern from rock to more different rock, and stuff like that. So, yeah, it’s all possible. I might even have all my old tools and stuff on backup, somewheres.
I’ve made UV-mapped but untextured statics and thrown random textures on them from the UT texture packs and it’s worked OK; I’ve been waiting for render baking to try anything further. You mainly need an export script like USMExport for the mesh; you’ll have to import any images separately in the UEd browser thingie.
Wiki.beyondunreal.com has info on any of the importing stuff you need, and a (very, iirc) little info on working with blender.
Main thing is that I can’t seem to get an exporter that auto-scales in any useful fashion, so I end up having to scale up by some large number each time I export (the first thing I made was 6x6 in Blender and needed to be 512x512, which required a factor of 85.3333 . . . )
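For what it’s worth, that scale-up factor is just the target size in Unreal units divided by the modelled size in Blender units. A throwaway Python helper (the function name is mine, not from any exporter):

```python
def export_scale_factor(blender_size, target_uu):
    """Uniform scale factor that takes a Blender-unit dimension
    up to a target size in Unreal units (UUs)."""
    return target_uu / blender_size

# The 6x6 mesh that needed to be 512x512 UUs:
factor = export_scale_factor(6.0, 512.0)
print(round(factor, 4))  # 85.3333
```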
You also answered another concern I had (but didn’t ask about), which was the scale ratio from Blender to UT2k4. I would imagine there’s some Blender Unit -> UT2k4 Unit converter, or reference somewhere. Will have to look for that. If not, well… looks like it’s trial-and-error time :-).
So I’ll look into the vertex painting tut and then see if I can work out the rest from there.
If I understand correctly, the textures are almost incidental, since - as was noted - you can define/change the different skin textures inside the UT editor itself… so it’s really just a matter of having the UVs and the vertex blends set up in the exported static mesh. Would that be correct to assume?
Yeah, that’s pretty much how it goes. I think if you want multiple material slots in UED you have to assign sets of faces to different images in the UV editor, but you could just create a bunch of test patterns for that.
As far as scale goes, I’d suggest 1 blender unit for 64 UUs, since that’s something in the ballpark of a meter in Unreal and a nice basic unit. It also means 512 UUs is 8 blender units, and so on.
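That 1 BU = 64 UU convention makes the conversion trivial either way. A quick sketch (assuming the ratio suggested above, nothing official):

```python
# Assumed convention from the post above: 1 Blender unit == 64 Unreal units.
UU_PER_BU = 64.0

def bu_to_uu(bu):
    """Blender units -> Unreal units."""
    return bu * UU_PER_BU

def uu_to_bu(uu):
    """Unreal units -> Blender units."""
    return uu / UU_PER_BU

print(bu_to_uu(8))    # 512.0
print(uu_to_bu(512))  # 8.0
```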
The thing that I’m not entirely sure about, though, is how that works. I mean, do I just use the standard Vertex Paint function in Blender, set a different color (black/white, etc) for each different texture and then export it? I assume the vertex color information is saved, via the Unreal exporter, and the Unreal engine/editor will automatically know how to handle it?
Obviously that’s something that could only be answered by someone who’s successfully gone through the process, but figured I’d put it out there :). I’m familiar with UV-Mapping, but vertex painting/blending is pretty much completely new to me.
Okay cool… so it knows how to recognize each material - that’s cool.
So, for applying the actual vertex shading so the textures blend smoothly into each other and aren’t divided by a hard line, is it just a matter of painting the vertices a certain colour and having the mode at the appropriate setting?
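As I understand it, the smooth transition falls out of the maths: the painted vertex colour gets interpolated across each face and acts as a per-pixel blend weight between the two textures. A rough Python sketch of that idea (purely illustrative, not engine code, and the names are mine):

```python
def blend(tex_a, tex_b, weight):
    """Linear blend of two RGB colours. weight 0.0 = all tex_a,
    1.0 = all tex_b. In-engine, the weight comes from the painted
    vertex colour, interpolated smoothly across the face, which is
    why you get a gradual transition instead of a hard line."""
    return tuple(a * (1.0 - weight) + b * weight for a, b in zip(tex_a, tex_b))

rock = (0.5, 0.5, 0.5)
moss = (0.2, 0.6, 0.2)
print(blend(rock, moss, 0.0))  # pure rock, at a fully "rock"-painted vertex
print(blend(rock, moss, 0.5))  # halfway mix, mid-face between vertices
```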