Image Size in BUs

If I use a 1024x1024px image as a texture with “Object” coordinates, what will its size be in Blender Units? What about a 2048x2048px image texture? Does Blender consider the DPI of an image?

I need to import images with normalized dimensions that correspond to very specific real-world sizes. The images may have different pixel dimensions and different real-world dimensions, but I need each to appear in Blender at exactly its real-world size. For instance, I may have a 1024x1024px image that is 12.9x15.6mm and a 2048x3072px image that is 13.1x21.5mm.

Any tips? I tried to look at this myself in Blender, but putting an image onto a plane with Object coordinates doesn’t give me a way to read the exact dimensions of the texture, at least none that I know of.

Thank you.

Try the ‘Import Images as Planes’ addon using the DPI import option? https://wiki.blender.org/index.php/Extensions:2.6/Py/Scripts/Add_Mesh/Planes_from_Images
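
If it helps, the DPI option comes down to simple arithmetic; a minimal sketch in Python (`size_mm` is just an illustrative helper here, not part of the addon):

```python
# Convert pixel dimensions + DPI to physical size in millimeters.
def size_mm(pixels, dpi):
    return pixels / dpi * 25.4   # 25.4 mm per inch

# Example: a 1024 px wide scan at ~2016 DPI is ~12.9 mm wide.
print(size_mm(1024, 2016))  # -> 12.9016...
```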

> but putting an image onto a plane with Object coordinates doesn’t give me a way to read the exact dimensions of the texture, at least none that I know of.
What does this mean? What is important, the plane object’s dimensions or the image’s dimensions? If you create a plane with the correct dimensions, then why does the resolution of the image you apply to it matter?
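
For example, something like this (a minimal sketch using the Blender Python API; it assumes a convention of 1 BU = 1 mm and uses the 12.9x15.6mm size from your example):

```python
import bpy

# The default plane is 2x2 BU, so scale it to the target real-world size.
bpy.ops.mesh.primitive_plane_add()
plane = bpy.context.object
plane.scale = (12.9 / 2.0, 15.6 / 2.0, 1.0)  # 12.9 x 15.6 "mm" plane
```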

I didn’t know the addon had those options. That is a possibility. I’m generating the images from topography data that I scanned, so I will need to see what options I have for specifying the DPI. (EDIT: I have never used the addon, and I just realized that it would not work if it simply uses Generated or UV coordinates with a correctly sized plane.)

I need to be able to combine several image textures onto a single plane, so I need the scale of the images to be independent of the scale of the object. Moreover, I need the image textures positioned by empties so that I can move them around in real time and overlap them.

For this purpose, I will be able to simply scale the empties. With “Generated” coordinates the image is stretched to 1x1 BU, but I can’t position the textures using empties. However, Object coordinates do not simply scale the image to 1x1 BU.

Empties can be used with the Texture Coordinate node, but I would also like to use the Displace modifier, which only uses empties for the “Object” texture coordinates option. If I absolutely must, I could position the textures dynamically using the color of a diffuse shader, bake the image, and use that as a static displacement map. But that is not ideal, and it would require a lot of RAM to bake… It might not be a big issue, but I don’t know how large my textures will need to be to capture all the necessary topography detail.
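
For reference, hooking an empty into the Displace modifier would look something like this (a minimal sketch; “Terrain”, “MapEmpty”, and the texture/image names are made up for illustration):

```python
import bpy

obj = bpy.data.objects["Terrain"]       # the plane being displaced
empty = bpy.data.objects["MapEmpty"]    # the empty that positions the height map

tex = bpy.data.textures.new("HeightMap", type='IMAGE')
tex.image = bpy.data.images["scan_1024.png"]

mod = obj.modifiers.new(name="Displace", type='DISPLACE')
mod.texture = tex
mod.texture_coords = 'OBJECT'           # the only mode that takes an object
mod.texture_coords_object = empty       # move/scale the empty to move the map
```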

EDIT: Never mind. Only Object coordinates support positioning a texture with an empty; the other options just ignore the Object field in the node.
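
So on the shader side, the setup is just the Texture Coordinate node with its Object field pointed at an empty; a minimal sketch (again, the material, empty, and image names are made up):

```python
import bpy

mat = bpy.data.materials.new("MapMaterial")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

coord = nodes.new('ShaderNodeTexCoord')
coord.object = bpy.data.objects["MapEmpty"]   # empty that positions this texture

img = nodes.new('ShaderNodeTexImage')
img.image = bpy.data.images["scan_1024.png"]
links.new(coord.outputs['Object'], img.inputs['Vector'])
```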

Okay, I figured it out. I thought that Object texture coordinates preserved the proportions of the texture, and they somewhat do. Generated coordinates map the texture space from 0 to 1 along each axis of the bounding box, so the texture coordinates are distorted if the bounding box isn’t square. Object coordinates, on the other hand, map the texture space according to the object’s local coordinate system, which is only distorted by scaling the object, so they won’t stretch the textures to fit the mesh.

HOWEVER, every image is scaled to 1x1 BU before it is sampled through those coordinates. So all I need to do is scale the empties to the real-world dimensions of my images (relative to BU). EASY!
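
For anyone who finds this later, the whole fix comes down to a few lines (a sketch assuming 1 BU = 1 mm; the empty names are made up):

```python
import bpy

# Scale each controlling empty to its image's real-world size, so the
# 1x1 BU texture tile covers the correct area in scene units.
sizes_mm = {
    "Empty_1024": (12.9, 15.6),   # the 1024x1024 px scan
    "Empty_2048": (13.1, 21.5),   # the 2048x3072 px scan
}
for name, (w, h) in sizes_mm.items():
    bpy.data.objects[name].scale = (w, h, 1.0)
```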