While I’ve been working on my current video project, I’ve been trying to think of quick ways to recreate environments in 3D. I’ve not had time to research this yet, so in the meantime I wanted to ask if anyone has done this before, or knows if it is possible inside Blender.
Essentially, my thought was: if you had a raw scan of an environment, you could use a 360-degree panoramic image to camera-map all of your textures and quickly approximate the details. If you could use an HDRI panorama, even better.
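For what it's worth, the underlying mapping seems simple enough: each surface point, viewed from the scan position, corresponds to a longitude/latitude pair in the equirectangular panorama. Here's a minimal sketch of that math in plain Python (the function name, the projector parameter, and the axis conventions are my own assumptions, not anything Blender-specific):

```python
import math

def equirect_uv(point, projector=(0.0, 0.0, 0.0)):
    """Map a 3D point to (u, v) coordinates in an equirectangular
    panorama shot from `projector` (the hypothetical scan position).
    u wraps horizontally (longitude), v runs bottom-to-top (latitude)."""
    # Direction from the projector toward the surface point
    x = point[0] - projector[0]
    y = point[1] - projector[1]
    z = point[2] - projector[2]
    r = math.sqrt(x * x + y * y + z * z)
    if r == 0:
        raise ValueError("point coincides with projector position")
    # Longitude -> u in [0, 1), latitude -> v in [0, 1]
    u = 0.5 + math.atan2(y, x) / (2.0 * math.pi)
    v = 0.5 + math.asin(z / r) / math.pi
    return u, v

# A point straight ahead along +X lands in the center of the panorama:
print(equirect_uv((2.0, 0.0, 0.0)))  # (0.5, 0.5)
```

So in principle, a projection-mapping setup just needs to evaluate this per shading point instead of the usual flat camera frustum projection; whether Blender's material system can be coaxed into doing that is exactly what I'm asking.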
Tonight, I came across a link that shows how it is possible to do this in MARI: http://www.fxguide.com/fxguidetv/fxguidetv-165-scott-metzger-on-mari-and-hdr/
(The video is a little long, but keeps a good pace and doesn’t drag.)
Now, LiDAR scanners are still way too expensive to use, but you could do this at low cost with a Kinect; I know there are a number of programs out there for various applications. You could also use a photogrammetry package such as Agisoft, but photogrammetry is much more sensitive to reflective surfaces, which I believe is less of an issue with the Kinect.
So, to sum up: is it at all possible to set up camera-projected textures so that the projection comes from a single 360-degree panorama?