I’ve taken some HDR environment maps from http://www.debevec.org/Probes/ and would like to use them in my Blender renders. The page describes the mapping as follows:
Thus, if we consider the images to be normalized to have coordinates u=[-1,1], v=[-1,1], we have theta=atan2(v,u), phi=pi*sqrt(u*u+v*v). The unit vector pointing in the corresponding direction is obtained by rotating (0,0,-1) by phi degrees around the y (up) axis and then theta degrees around the -z (forward) axis. For a direction vector in the world (Dx, Dy, Dz), the corresponding (u,v) coordinate in the light probe image is (Dx*r, Dy*r) where r=(1/pi)*acos(Dz)/sqrt(Dx^2 + Dy^2).
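In case it helps anyone picture the mapping, here is a minimal Python sketch of the inverse formula quoted above (world direction to angular-map (u,v)). The function name and the handling of the degenerate Dx=Dy=0 case are my own assumptions, not from Debevec's page:

```python
import math

def direction_to_uv(dx, dy, dz):
    """Map a unit world direction (dx, dy, dz) to angular-map (u, v) in [-1, 1],
    using r = (1/pi) * acos(Dz) / sqrt(Dx^2 + Dy^2) and (u, v) = (Dx*r, Dy*r)."""
    d = math.sqrt(dx * dx + dy * dy)
    if d == 0.0:
        # Degenerate case: looking straight along the z axis; the formula is
        # 0/0 here, so return the image centre (an assumption on my part).
        return (0.0, 0.0)
    r = (1.0 / math.pi) * math.acos(dz) / d
    return (dx * r, dy * r)

# Example: a direction perpendicular to z lands halfway out, at radius 0.5.
u, v = direction_to_uv(1.0, 0.0, 0.0)
print(u, v)  # u = 0.5, v = 0.0
```

Note that phi = pi*sqrt(u*u+v*v) spans the full 0..pi range, so the whole sphere of directions is packed into the unit disc; that is why this "angular map" format looks different from an ordinary spherical (lat/long) environment map, and why standard sphere-mapping settings will not line it up correctly.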
Unfortunately I can’t work out how to do this in Blender. I’ve done the Noob to Pro skybox tutorial and understand the idea, so I tried adding a big sphere and putting the light probe texture on it. I then played around with the Map Input tab to try to get the light probe mapped correctly onto the sphere, but failed. What mapping settings do I need to use? Many thanks in advance.