I’ve searched around and can’t find much to help.
I’m making HDR angle maps for Blender image-based lighting, so they can be either EXR or HDR.
I’m building world images in Terragen, which can only export EXR cube-faces via a cube-map batch utility.
I’m having problems getting them from one format to the other.
My searching has mostly turned up elaborate tutorials on using Blender to re-render a model using the individual images and then export the render. Not very useful for a production flow unless you’re good with scripting, which I’m not.
I’m used to using HDRShop, but it won’t read EXR files (not the free version, anyway). I’ve tried using Blender as an EXR-to-HDR converter, at which point I can use HDRShop to merge the faces into my final HDR anglemap. (Although I think I just found out that anglemaps and mirrored balls are not identical: the angmap format actually stretches the texture inward near the perimeter to improve the mapping quality near the seam.)
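For reference, here is the difference as I understand it (worth double-checking against HDRShop’s docs; the formulas are my own reading of the two formats, not taken from any tool’s source). Both map a direction’s angle theta from the view axis (0 = centre of the image, pi = the seam directly behind) to a radius in the circular image:

```python
import math

def angmap_radius(theta):
    """Angular map: radius is linear in angle, so the back hemisphere
    keeps reasonable texel density right up to the seam."""
    return theta / math.pi

def mirrorball_radius(theta):
    """Mirrored ball: the reflection doubles the surface-normal angle,
    so r = sin(theta / 2) and directions near the seam pile up at the rim."""
    return math.sin(theta / 2.0)

# Near the seam (theta -> pi) the mirrored ball crams everything at r ~ 1,
# while the angular map pulls the same directions inward:
theta = 0.95 * math.pi
print(angmap_radius(theta))      # ~0.95
print(mirrorball_radius(theta))  # ~0.997
```

So the “stretching inward” is just the linear radius-to-angle mapping: the angmap spends far more pixels on the back hemisphere than a true mirror-ball photo does.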
But that is a royal pain, and Blender would be happy with an EXR angmap directly.
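In case anyone feels like scripting the cube-face-to-angmap conversion directly, here is the core math I’d start from: one function mapping an angmap pixel to a world direction, and one mapping a direction to a cube face and UV. This is only a sketch; the face names and axis orientations are an assumption on my part, since every tool (Terragen’s batch exporter included) uses its own convention.

```python
import math

def angmap_pixel_to_dir(u, v):
    """Angular map pixel (u, v each in [-1, 1]) to a direction, -z forward.
    The linear radius-to-angle mapping is what defines the angmap format."""
    r = math.hypot(u, v)
    if r < 1e-9:
        return 0.0, 0.0, -1.0
    theta = r * math.pi               # radius maps linearly to view angle
    s = math.sin(theta) / r
    return u * s, v * s, -math.cos(theta)

def dir_to_cubeface(x, y, z):
    """Return (face, u, v) with u, v in [0, 1] for direction (x, y, z).
    Faces are named '+x', '-x', '+y', '-y', '+z', '-z' by dominant axis;
    the in-face orientations below are a guessed convention."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face, major, uc, vc = ('+x' if x > 0 else '-x'), ax, (-z if x > 0 else z), y
    elif ay >= az:
        face, major, uc, vc = ('+y' if y > 0 else '-y'), ay, x, (-z if y > 0 else z)
    else:
        face, major, uc, vc = ('+z' if z > 0 else '-z'), az, (x if z > 0 else -x), y
    return face, 0.5 * (uc / major + 1.0), 0.5 * (vc / major + 1.0)
```

The rest is a loop over the output pixels, sampling each cube face image at the returned UV (OpenEXR’s Python bindings or any EXR-capable image library would do the reading and writing).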
My question then is: where do (HDR) angmaps come from? Accurate ones, not Photoshop-filtered ones. There must be a lightprobe-making utility somewhere that handles EXR (free, or fairly cheap). I can’t believe that HDRShop is the only utility on the planet that does this sort of thing.
(If it weren’t so expensive, I’d buy V2 of HDRShop: the panoramic transform tools alone, plus the overall HDR focus, have made it a workhorse for me.)