I was wondering if it would be possible to use photosphere images for environment mapping, and it works great. I just went to where people share their photospheres, clicked on options, downloaded one, and set it up to work in Cycles: just plug an Environment Texture node into the World's Background shader. Now all I need to do is get Photo Sphere running on my Android 4.2.2 device. Still figuring it out… shouldn't be long though. This is a big step forward for a fast VFX pipeline.
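For anyone curious what the Environment Texture node is actually doing: photospheres are equirectangular (lat-long) images, so every ray direction maps to a pixel. Here's a rough sketch of that lookup in plain Python. The axis conventions here are illustrative and may not match Blender's internal orientation exactly:

```python
import math

def dir_to_equirect_uv(x, y, z):
    """Map a unit direction vector to (u, v) coordinates in an
    equirectangular (lat-long) image, the projection photospheres
    use. Longitude becomes u, latitude becomes v, both in [0, 1].
    Axis conventions are illustrative, not necessarily Blender's."""
    u = 0.5 + math.atan2(y, x) / (2.0 * math.pi)  # longitude -> u
    v = 0.5 + math.asin(z) / math.pi              # latitude  -> v
    return u, v
```

So looking along +X lands in the middle of the image, and straight up (+Z) lands on the top edge.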
That seems like a cool idea. Sadly, the limited dynamic range means it's only good for textures, not HDRI lighting. I wonder how successful shooting a sphere at different exposures would be; can a phone cam even do actual exposure changes, or are they all electronic gain tricks? I also wonder if we could rectify the distortion to solve the camera with the BLAM script. Then we could insert geometry easily.
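If the phone really can vary exposure time, merging the brackets into HDR is conceptually simple. Here's a toy single-pixel sketch assuming a perfectly linear sensor response; a real merge (e.g. Debevec-style) first recovers the camera's response curve, which matters a lot since phone cameras are anything but linear:

```python
def merge_exposures(samples):
    """Recover relative scene radiance for one pixel from bracketed
    shots, assuming a linear sensor response. `samples` is a list of
    (pixel_value_in_0_to_1, exposure_time) pairs. Each shot estimates
    radiance as value/time; a hat-shaped weight trusts mid-range
    values and down-weights clipped shadows and highlights."""
    def weight(z):
        return min(z, 1.0 - z)  # zero at the clipped extremes

    num = sum(weight(z) * (z / t) for z, t in samples)
    den = sum(weight(z) for z, _ in samples)
    return num / den if den > 0 else 0.0
```

If all three brackets agree (no clipping), the merge just returns the common radiance estimate; the weighting only starts to matter once some shots blow out or crush to black.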
Actually, I was playing around trying to get Photo Sphere working on my Galaxy Tab 2 7.0, and there is an HDR setting for taking different exposures, but I think it's just for single photos, not for the photosphere. I'm still trying to get Photo Sphere working on my tab, hopefully soon. The workflow I envisioned: shoot the background plate with a nice camera, do a quick environment capture with the tab for reflections and lighting, then shoot the video and camera track it.