I’m writing an addon to make 2D puppet creation for Unity easier, and I can’t figure out the difference between textures in the viewport and in the rendered view. When I use my script (Source: https://github.com/jceipek/Blender-Unity-Addons/blob/master/Addons/merge_to_unity_puppet.py; Usage: https://github.com/jceipek/Blender-Unity-Addons/wiki/Merge-to-Unity-Puppet) to combine image planes generated by the ‘Import Images as Planes’ addon, they look white in the viewport and are textured with broken UV coordinates when rendered.
What’s the difference between textures in the viewport and in the rendered view, and how does UV mapping work in the Python API? I’ve looked through the ‘Import Images as Planes’ script, and I can’t figure out why the textures it generates are mapped correctly and show up in the viewport even though MaterialTextureSlot.uv_layer is empty.
I also don’t understand why moving images in the UV editor changes how those images are displayed in the viewport but not necessarily in the render view.
Is there a high-level overview of how all of this fits together? I’ve found the API really hard to use because it seems to assume prior knowledge of how the pieces relate.