Texture Baker/Compositor - mini Tutorial - new dog example!!

This was done with a script created by z3r0_d: he calls it the "other other texture baker", but I would call it a texture compositor.


to use this script you make a model. here i have made a snake. We will focus on texturing the head. I made a duplicate of the mesh. The mesh that is purple on the head has the uv coordinates you see in the uv window above. I unwrapped with LSCM and moved them around a bit to make them comfy; see other tutorials (Greybeard's is where i learned) for that info if you need to.

The snake on the right was textured with a different set of uv coordinates creating a good base to work from. See that layout in the window below.

here is a shot of what those uv’s look like and the original texture I am using to composite this new one. The texture lying here will be put on the set of uv coordinates in the above uv window.

then i use the script:

you need to set an ortho camera at 0,0,2 and, in 2.37, set the scale to 4. Then, while holding shift, select first the object on the right (the one textured how you want the composite to be) and then the one on the left (the one with the comfy new uv's). Now run the script.

You then set the material of the new object created with the script to shadeless, and render with the ortho camera to get your new "composited" texture. The original texture has been morphed to fit my ideal uv coordinates; the ones in the first image match this new texture. For the camera settings, you need to fit all of those coordinates inside the grey box in the uv editor.
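To make the trick concrete, here is a pure-Python sketch of the mesh the script generates (this is my own illustration with hypothetical names and data layout, not the script's actual code): each face of the flat bake mesh gets its vertex *positions* from the new unwrap but keeps its *texture coordinates* from the old unwrap, so a top-down shadeless render re-samples the old texture into the new layout.

```python
def build_bake_mesh(faces):
    """faces: a list of per-face dicts with 'new_uvs' and 'old_uvs',
    each a list of (u, v) pairs in the 0..1 uv square."""
    verts, tex_coords, polys = [], [], []
    for face in faces:
        poly = []
        for (nu, nv), old_uv in zip(face["new_uvs"], face["old_uvs"]):
            poly.append(len(verts))
            verts.append((nu, nv, 0.0))   # position taken from the new unwrap
            tex_coords.append(old_uv)     # texture lookup still uses the old unwrap
        polys.append(poly)
    return verts, tex_coords, polys

# One triangle: the new unwrap puts it in the lower-left quarter of the
# uv square, while the old unwrap stretched it across the whole texture.
face = {"new_uvs": [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5)],
        "old_uvs": [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]}
verts, tex_coords, polys = build_bake_mesh([face])
```

Because the bake mesh lies flat in the z=0 plane over the 0..1 uv square, the ortho camera at 0,0,2 with scale 4 (or lens 16 in older versions) just has to frame that square.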


You can now select the untextured object, apply the new texture with existing uv’s, and it will fit perfectly with no additional work.


You can combine this technique with other helpful UV scripts, bakers, etc. For example you can select this object and use the uv script - save uv face layout. If this is rendered at the same size as your composite texture then they will line up perfectly in your image editor. Now you can add details to your composite picture. I plan on repainting most of the scales on the head and adding more details like nostrils.

There are many uses for this script, this is just one of them. For example, model a head from a front and side reference pic. Texture the model from the front view. Create a composite texture. Texture the model with the side view. Create a composite texture. Blend the two composites together in your favorite image editor. Quick and dirty way to texture a head from photos. The other potential uses are limitless…

I fixed a few typos and general errors in the above post. I will try to continue doing this as questions arise and I find ways to make the description simpler.

I have also posted this project in WIP. I don’t think this qualifies as double posting because here I am just trying to explain the script (i couldn’t decide whether to put this in python and plugins or here). But if this angers any of you mods then please feel free to take action.



I really want to bring this animation idea to completion. I’ve been testing and goofing off with blender for too long not to have something I am finally proud of. Please support me in getting this finished! It will be an awesome 10-20 sec animation if my sketches can come to life…

can you explain a bit what this script ‘exactly’ does? somehow i don’t get it at all :expressionless:

in simple terms, it takes the uv coordinates of one mesh and creates a model laid out by those coordinates, textured using another set of coordinates.

the purple untextured head has the nice unwrapped coordinates that i eventually want to use for my final texture. the textured head is a rough approximation of how i want my final texture to be. the uv coordinates can be totally different, but it will pull that rough texturing onto your ideal set of coordinates. Look at the pictures; the roughly textured model is used as a base for creating your final map.

what i have seen is that people are usually starting with a blank slate going into photoshop or whatever, or at best the texture outline of their unwrapping. This will give you a base to go off of… you can use many different textures and it will combine those into one if you want. Then go into your image editor and fix the seams, add detail, composite a couple of different composite textures, etc…

hopefully that helps a little. if you know nothing or very little about uv texturing in general then this whole thing probably makes very little sense.

i know a fair deal about uv-texturing from using it for game models, but somehow i still don’t get what this tool does. does it generate a set of texture coordinates or vertices for a mesh? i only do the save-uv-layout and then GIMP it over, so from this perspective i do not get what exactly is going on there. i assume you kinda ‘render’ your final diffuse map, or not?

does this help?


lol i am also very confused.

does this script change the image file and apply counter-stretches/distortions so that when the texture is applied to the mesh no image distortions are visible?

otherwise i do not see the point here? all i do is unwrap a mesh and paint over the saved uv layout, or unwrap the mesh over a scanned image so the mesh will fit the image. i did the latter for my m-16 magazine.


okay, I’ll try to explain

it allows you to change the uv mapping of an object, but keep the current texture

so, you texture an object so that it looks the way you like

but the uvs you generated are ugly, you don’t want to paint on them [example: you used blueprints]

then you duplicate the object, uvmap the duplicate how you like, and use the script to warp the texture you had originally applied to the new uv coordinates
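The warp described here can be sketched in plain Python (a conceptual illustration with hypothetical helper names, not the script's own code): for any point in the new uv layout, barycentric weights inside its triangle give the matching point in the old uv layout, which is where the original texture gets sampled.

```python
def barycentric(p, a, b, c):
    """Barycentric weights of 2D point p inside triangle (a, b, c)."""
    det = (b[1] - c[1]) * (a[0] - c[0]) + (c[0] - b[0]) * (a[1] - c[1])
    w0 = ((b[1] - c[1]) * (p[0] - c[0]) + (c[0] - b[0]) * (p[1] - c[1])) / det
    w1 = ((c[1] - a[1]) * (p[0] - c[0]) + (a[0] - c[0]) * (p[1] - c[1])) / det
    return w0, w1, 1.0 - w0 - w1

def warp_uv(p, new_tri, old_tri):
    """Map a point in the new uv triangle to the matching point in the old one."""
    w0, w1, w2 = barycentric(p, *new_tri)
    u = w0 * old_tri[0][0] + w1 * old_tri[1][0] + w2 * old_tri[2][0]
    v = w0 * old_tri[0][1] + w1 * old_tri[1][1] + w2 * old_tri[2][1]
    return u, v

new_tri = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5)]  # comfy new unwrap
old_tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]  # original rough mapping
# the centre of the new triangle samples the centre of the old triangle
centre_uv = warp_uv((1/6, 1/6), new_tri, old_tri)
```

In the actual workflow you never compute this yourself: building the flat mesh and rendering it shadeless makes Blender's renderer do exactly this interpolation for every pixel.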

the same can be done with macuno’s script, his even lets you modify the meshes [you can subsurf one for example]

I initially mentioned the script here:
and a couple days ago on blender.org:

here is a pic which shows my use for this script:

  • so the rightmost car is uvmapped well
  • the one left of it has the blueprints applied
  • select both [make the blueprint one active], and running the script generates the mesh in the center
  • rendering that creates the image on the left, which fits the good uv coordinates and has my blueprint texture

umm, my comments about ortho and lens 16 no longer apply, the “true ortho” means I need to find new numbers… just using a lens 16 and a proper position ought to work okay, or an ortho camera with size 4

also, my “other texture baker” script doesn’t have the uv seam fixing my other script has. [for more specifics on what I mean by that, check the pics I posted on the blender.org thread I mentioned]

ok… wait… the script only manipulates the texture coordinates, right? but why is the texture image also manipulated? i mean, if i look at the snake skin example the texture covers the entire texture area. after applying the script the texture coordinates seem to me to be the same, but the image is like ‘cut out’. this gives me the impression of the script creating a texture. :expressionless: … i’m confused… i guess i must be overlooking something :expressionless:

i believe what it does is take the first UV texture and change it (the texture) to fit the UV coordinates of a duplicate object with different coordinates.

Modron: I think you nailed it in the fewest words possible.

z3r0 d: I was hoping you were around to chime in. This script should be an integral part of the uv mapping toolset. It would be nice if it were modified to automatically set the camera, and had a very simple gui with a general suggestion, like Modron said. A “render new texture” option with dimension fields and a file name field would sweeten the deal. Turning off subsurf, making the new object shadeless, and a render button would complete the tool.

as far as i know there is no other way to transpose a texture in this manner

Another Example!

I built this dog head from these reference pics of my dog. I then mapped some faces with the front portion and some with the side.


I then used this script to create a new texture, using the initial mapping and a custom unwrapping of a duplicate mesh.


I then took the image into photoshop to fix the seams and streaks. After that I applied it to the object with the matching uv’s of the texture.


does that help anyone?

SNNNNAAAPPP!!! Wowwowow! Looks like a great script! I use this type of tool quite a bit in 3dsmax. I can’t wait to try it out.

I get it! thanks for the script!

i see better now how it works. it is like an inverse LSCM: instead of unwrapping the uv coordinates to match the texture, it unwraps the texture to match the uv coordinates. so far LSCM has worked well for me.

i am glad that some “get it” now. i suck at explaining stuff sometimes, that’s why there are so many images.

Odjin: this is to be used with LSCM, not to replace it. It is just another tool in the toolbox, expanding Blender’s functionality.

I have seen this type of procedure in other 3d apps as well. This really works best when you have good source images. It would work well for terrain if you had pictures from two or more known locations. Then you could use both pictures as your source texture and be able to move anywhere between them in an animation.

I also now think I understand what the script does after seeing the dog head example.

But even so, I would dearly like to see a video tutorial showing exactly how you did this step by step. I’d like to try this script and technique out, but I am getting confused when you talk about image coordinates. That, and using two images (side/front) in the same final image.

I’m sure if I watch someone do it, that has already been there, the ideas will all fall into place.