First, to be clear: I'm not talking about camera mapping (aka sticky UVs). I mean essentially the opposite. Instead of generating the UV map from the camera view, I want to unwrap the mesh normally and then project an image onto the geometry from the camera's point of view.
My objective is photogrammetry. I can get the geometry easily enough, but I'd like to project each photograph back onto the geometry, as a texture, from the position and angle it was originally taken from. Then I'd repeat that for every camera and blend the results into something usable. My main obstacle right now is the texture projection step.
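For context, the math I have in mind is just a pinhole projection: push each mesh vertex through the camera and use the resulting image coordinates as that vertex's UVs for that photo. Here's a rough sketch in plain Python/NumPy (not Blender code; the camera parameters and the sample point are hypothetical):

```python
# Sketch of the projection I mean: a pinhole camera maps each
# world-space vertex to image coordinates, which then serve as UVs
# for the photo taken by that camera. All numbers are hypothetical.
import numpy as np

def project_to_uv(vertex_world, cam_pos, cam_rot, focal, sensor_w, sensor_h):
    """Project a world-space vertex into the [0,1]x[0,1] UV space of a
    pinhole camera.

    cam_rot is a 3x3 world-to-camera rotation matrix; the camera looks
    down its local -Z axis (Blender's convention). focal, sensor_w and
    sensor_h are in the same units (e.g. millimetres).
    """
    # Transform the vertex into camera space.
    v_cam = cam_rot @ (vertex_world - cam_pos)
    # The camera looks along -Z, so depth in front of the lens is -v_cam[2].
    depth = -v_cam[2]
    if depth <= 0:
        return None  # point is behind the camera
    # Perspective divide onto the sensor plane.
    x = focal * v_cam[0] / depth
    y = focal * v_cam[1] / depth
    # Map sensor coordinates to [0,1] UVs, with (0.5, 0.5) at the centre.
    u = x / sensor_w + 0.5
    v = y / sensor_h + 0.5
    return (u, v)

# A point 2 units straight ahead of an axis-aligned camera at the
# origin should land at the image centre (0.5, 0.5):
uv = project_to_uv(np.array([0.0, 0.0, -2.0]),
                   np.array([0.0, 0.0, 0.0]),
                   np.eye(3),
                   focal=35.0, sensor_w=36.0, sensor_h=24.0)
```

So really I'm asking whether Blender can do this per-camera UV assignment (or something equivalent) for me, rather than me baking it by hand.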
Is this possible with Blender? What do you guys think?