But I’m stuck on only one point at 6m10s into the video.
He sets up an AngMap using an HDR image to create a reflection in the ball. However, when I do it using his settings, the AngMap comes out looking like a Spin Zoom Blur in Photoshop, and it renders this way too (see files attached, although the .blend file won’t upload for some reason).
My questions are:
1.) What is an HDR image? Is it some kind of panoramic image or something?
2.) Can any extension be used for a world map?
3.) Why does the AngMap look like a spin zoom? When I set the Mapping to “View” or “Global” - contrary to the AngMap setting the tutorial calls for - it comes out looking right, but when I set it to AngMap (as instructed by the video) it looks like a spin zoom. Does anyone know why this might be?
4.) What is the difference between this world texture and an Environment Map? Or am I essentially doing the same thing?
Thank you for any help here!!! This is a very cool effect.
In image processing, computer graphics, and photography, high dynamic range imaging (HDRI or just HDR) is a set of techniques that allow a greater dynamic range between the lightest and darkest areas of an image than current standard digital imaging techniques or photographic methods. This wide dynamic range allows HDR images to more accurately represent the range of intensity levels found in real scenes, ranging from direct sunlight to faint starlight, and is often captured by way of a plurality of differently exposed pictures of the same subject matter.
(From wikipedia)
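To make the “dynamic range” part of that concrete: an ordinary 8-bit photo clamps everything into 256 levels, while an HDR image stores floating-point radiance, usually merged from several bracketed exposures. Here is a minimal, illustrative Python/NumPy sketch of that merge - the exposure times and the random stand-in “photos” are made up, and a real pipeline would use a proper method such as Debevec’s:

```python
import numpy as np

# Hypothetical exposure times (seconds) for three bracketed shots of the
# same scene; in practice the image arrays would be loaded from files.
exposure_times = [1 / 400.0, 1 / 25.0, 1 / 2.0]

def merge_exposures(ldr_images, times):
    """Very simplified HDR merge: average each pixel's estimated radiance
    (pixel value / exposure time), weighting mid-range pixels most heavily
    because they are neither under- nor over-exposed."""
    acc = np.zeros(ldr_images[0].shape, dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for img, t in zip(ldr_images, times):
        img = img.astype(np.float64) / 255.0      # LDR values in 0..1
        weight = 1.0 - np.abs(img - 0.5) * 2.0    # hat-shaped weight
        acc += weight * (img / t)                 # radiance estimate
        weight_sum += weight
    return acc / np.maximum(weight_sum, 1e-6)     # float radiance map

# Fake 8-bit "photos" standing in for real bracketed exposures.
ldr_stack = [np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
             for _ in exposure_times]
hdr = merge_exposures(ldr_stack, exposure_times)
print(hdr.max())   # can be far greater than 1.0 - that is the "high" range
```

The point is just that the merged values are not limited to 0–255, which is what lets a single HDR file carry everything from direct sunlight to faint starlight.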
Question 2: Yes, I believe so
As for your other two questions, I don’t know. Maybe check out this tutorial: Andrew Price uses HDRI in a way similar to how you are using it, which is different from how JW used it.
The problem is that the term “HDR image” has been misapplied to images that are not high dynamic range, but are simply taken in a manner that allows them to be properly projected onto the World “sky-dome” without major distortion. They should more properly be called “spherical projection” images. If they are also high dynamic range, then the HDR label would apply, with reservations (see below).
The reason regular flat images look weird as AngMaps is that they were not produced using spherical projection, so they are smeared all over the virtual “dome” of the sky – a flat image spread out over a 3D hemisphere isn’t going to look right!
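For the curious, here is roughly what an angular map lookup does, written out as a small Python sketch. This follows the light-probe (“angular map”) convention Paul Debevec describes, which as far as I know is what Blender’s AngMap option expects: the centre of the image is straight ahead, and the angle away from that direction becomes the radius in the image, so the entire sphere of directions fits into one disc.

```python
import math

def angmap_uv(dx, dy, dz):
    """Map a unit direction vector to angular-map image coordinates.

    Convention: the image centre is the forward direction (-Z here), and
    the angle away from forward becomes the radius, so the full sphere of
    directions fits in one disc.  Returns (u, v) in [0, 1] x [0, 1]."""
    # Angle between this direction and the forward axis.
    phi = math.acos(max(-1.0, min(1.0, -dz)))
    # Radius in the disc is proportional to that angle.
    denom = math.sqrt(dx * dx + dy * dy) or 1e-9
    r = (phi / math.pi) / denom
    return 0.5 + 0.5 * dx * r, 0.5 + 0.5 * dy * r

print(angmap_uv(0.0, 0.0, -1.0))   # straight ahead -> centre of the image
print(angmap_uv(1.0, 0.0, 0.0))    # 90 degrees to the right -> (0.75, 0.5)
```

Feed an ordinary flat photo through a lookup like that and its pixels get wound radially around the view axis, which is essentially the “spin zoom” streaking described above. “View” and “Global” don’t do that radial wrap, which would explain why a flat photo happens to look acceptable with those mappings.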
Here’s a useful little factoid, though. Unless you want the AngMap to be the visibly-rendered sky of your final image, it doesn’t need to be a spherical projection, and it can still be used for Environmental Lighting of your scene (what is too often called HDR lighting when it is not). That smeared-out weirdness can still be sampled by Blender to provide the Env. Lighting values. I do it all the time. But you have to replace the sky with something that does render looking good – could be a billboard-style backdrop (flat) with a shadeless sky texture, or some other method. Getting the World Sky texture out of the rendering can be done a number of ways, but it will still act as a source of Env. Lighting.
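For anyone who prefers to set this up from a script rather than the World buttons, the sketch below is roughly what it looks like with the Blender Internal renderer’s Python API (circa 2.6x; the property names are from memory, so treat them as assumptions and check against the tooltips in your own build, and the image path is just a placeholder). It loads an image as a World texture, sets the mapping to AngMap, and turns on Environment Lighting sampled from the sky texture; keeping that sky out of the final render is left to whichever method you prefer (billboard backdrop, compositing, etc.).

```python
import bpy

world = bpy.context.scene.world

# Load the probe image and wire it into a World texture slot.
img = bpy.data.images.load("/path/to/probe.hdr")   # placeholder path
tex = bpy.data.textures.new("env_probe", type='IMAGE')
tex.image = img

slot = world.texture_slots.add()
slot.texture = tex
slot.texture_coords = 'ANGMAP'   # the mapping discussed in this thread
slot.use_map_horizon = True      # let the texture drive the sky colour

# Environment Lighting, sampled from the sky texture rather than a flat colour.
world.light_settings.use_environment_light = True
world.light_settings.environment_color = 'SKY_TEXTURE'
world.light_settings.environment_energy = 1.0
```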
Another factoid: Blender cannot, afaik, use the full dynamic range of a true HDR image; it’s not set up to read that wide a gamut, nor to render one. I’m more than willing to stand corrected on this, but I could find no info and no controls that would allow full exploitation of a true HDR image.
Correction: Radiance HDR is one of the output format options, but I’m not entirely sure how that is accomplished in Blender, or whether a full HDR value range is rendered. Looks like something to test out.
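One quick way to run that test: render out to Radiance HDR, then load the file outside Blender and see whether any pixel values survive above 1.0. A minimal sketch using OpenCV, which can read .hdr files as 32-bit float (the file name is just a placeholder):

```python
import cv2
import numpy as np

# Load the Radiance HDR render without any clamping or 8-bit conversion.
img = cv2.imread("render.hdr", cv2.IMREAD_UNCHANGED)

print("dtype:", img.dtype)               # float32 if it really is HDR data
print("max value:", float(img.max()))    # > 1.0 means super-white values survived
print("pixels above 1.0:", int(np.count_nonzero(img > 1.0)))
```

If the maximum never goes above 1.0 no matter how bright the scene is, the render was effectively clamped to low dynamic range on the way out.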