I have been trying to figure out HDR with a moving camera for several months now… finally, thanks to a bunch of thinking (pacing in the kitchen while drying dishes), I have figured it out. There are probably some of you out there who are wondering about this too. I’m just going to explain it quickly and you can easily figure the rest out from there.

You’ll need light probe images from both sides of the probe, hence needing 360 degree coverage. Now, here’s what threw me: I thought my light probe would give me a 180 degree reflection, when it turned out to be about 270!

After you have the pics, go into Blender and add a sphere (I prefer a UV sphere), then cut it in half by separating it with “P”, so now you have two half spheres. You want to texture the inside of each one with one of the HDR images: just “unwrap from view”, then load your pic and match it up. I’m assuming that’s easy to figure out… oh, remember to recalculate normals inside (Ctrl+Shift+N, in Edit Mode), because you want it textured on the inside.
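If it helps, here’s roughly what that sphere setup could look like scripted through Blender’s Python API. This is only a sketch, assuming a 2.8-or-newer build (older versions spell some of these operators differently), and cutting the sphere at x = 0 is just an arbitrary choice:

```python
import bpy
import bmesh

# Add a UV sphere and switch to Edit Mode on it.
bpy.ops.mesh.primitive_uv_sphere_add(segments=32, ring_count=16, radius=1.0)
sphere = bpy.context.active_object
bpy.ops.object.mode_set(mode='EDIT')

# Select one half of the faces (everything on the +X side here),
# then separate them into their own object -- the script version of "P".
bm = bmesh.from_edit_mesh(sphere.data)
for face in bm.faces:
    face.select_set(face.calc_center_median().x >= 0.0)
bmesh.update_edit_mesh(sphere.data)
bpy.ops.mesh.separate(type='SELECTED')

# Recalculate normals pointing inward (Ctrl+Shift+N with "Inside"),
# since the HDR image goes on the inside of each half.
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.normals_make_consistent(inside=True)
bpy.ops.object.mode_set(mode='OBJECT')
# Repeat the inward normal recalculation on the other half (the separated object).
```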
Then go into textured mode, sit INSIDE the spheres, and match up the textures so it looks like one solid picture. Easier to understand than to explain. You will need to do a lot of skewing (Alt+S) to get the right HDR shape… here’s what it’ll come to look like. And presto! You can now import your tracking data right into the camera and have it move around in a “3d” scene! Hope that makes sense!
As you can see in the pictures, the spheres are skewed and then lined up.
(I don’t know if this qualifies as a tutorial or what…but I put it here)
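And for the tracking-data step mentioned above, here’s a rough idea of keyframing the camera from a tracker export via the Python API. The CSV file name and its column order (frame, x, y, z, then rotations in radians) are made-up placeholders, so adapt them to whatever your tracker actually writes out:

```python
import csv
import bpy

cam = bpy.context.scene.camera  # the camera the tracking data will drive

# Hypothetical tracker export: one row per frame.
with open("/path/to/camera_track.csv", newline="") as f:
    for row in csv.reader(f):
        frame = int(row[0])
        cam.location = (float(row[1]), float(row[2]), float(row[3]))
        cam.rotation_euler = (float(row[4]), float(row[5]), float(row[6]))
        # Keyframe position and rotation on that frame.
        cam.keyframe_insert(data_path="location", frame=frame)
        cam.keyframe_insert(data_path="rotation_euler", frame=frame)
```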
Looks good. Maybe you could bake those textures, join the objects, bake the texture again, and then Photoshop out the seams :D
Yeah, I’ve heard of baking textures… how exactly do I do that? (I’m not really that familiar with normal maps and the stuff that relates to it…)
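For what it’s worth, the join-and-bake suggestion could look roughly like this through the Python API. A sketch only, assuming Blender 2.8+ with Cycles; the object names, image size, and output path are placeholders, and the bake image still has to be assigned to a selected Image Texture node in the material before the bake has a target:

```python
import bpy

# Grab the two half-spheres (placeholder names) and join them into one object.
halves = [ob for ob in bpy.context.scene.objects if ob.name.startswith("Sphere")]
bpy.ops.object.select_all(action='DESELECT')
for ob in halves:
    ob.select_set(True)
bpy.context.view_layer.objects.active = halves[0]
bpy.ops.object.join()

# Image to bake into; assign it to an Image Texture node in the joined
# object's material and keep that node selected before running the bake.
bake_img = bpy.data.images.new("probe_bake", width=2048, height=2048)

# Bake only the diffuse colour (no lighting) with Cycles, then save it out
# so the seams can be painted over in an image editor.
bpy.context.scene.render.engine = 'CYCLES'
bpy.ops.object.bake(type='DIFFUSE', pass_filter={'COLOR'})
bake_img.save_render(filepath="/path/to/probe_bake.png")
```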