No need to convert it to B/W manually: you can plug almost any color socket (yellow, grey, purple) into any other (with the exception of shaders [green]) and it'll be "converted" automagically. A nice example of this is manipulating UVs like you would colours.
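As a sketch of what that implicit conversion does: plugging a color into a scalar socket is just a luminance dot product, and a vector read as a color maps X/Y/Z straight onto R/G/B. The weights below assume Rec.709 luma, which matches Blender's RGB to BW node; this is plain Python standing in for the node graph, not Blender API code.

```python
# Implicit color -> value conversion, as done when a yellow (color) socket
# feeds a grey (scalar) input. Weights assume Rec.709 luma coefficients.
def rgb_to_bw(r, g, b):
    """Collapse a linear RGB color to one grayscale value."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Vector (purple) read as a color (yellow): components pass through unchanged,
# which is exactly why you can manipulate UVs like colours.
def vector_to_color(vec):
    return tuple(vec)  # (x, y, z) -> (r, g, b)

print(rgb_to_bw(1.0, 1.0, 1.0))       # pure white collapses to full value
print(vector_to_color((0.5, 0.5, 0.0)))
```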
Also, in terms of lighting with JPGs, it's perfectly fine, since you can just add some contrast. The problem, of course, is that you lose any detail in the >1 range, so it usually looks pretty bad. Especially in weak reflections, like on ceramics, you'll see a faint reflection of one big shape rather than the nice shapes of whatever is outside the window.
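To make the weak-reflection point concrete: a ceramic-like surface only reflects a few percent of incoming light, so the gap between a real sun radiance and a JPG's clipped 1.0 is what carries the shape detail. A toy sketch with made-up radiance numbers (the 4% reflectivity is an illustrative assumption, roughly a dielectric's facing reflectance):

```python
# Why clipped highlights kill weak reflections: multiply scene radiance by a
# low reflectivity and see what survives.
def reflect(radiance, reflectivity=0.04):
    return radiance * reflectivity

hdr_window = [500.0, 80.0, 1.2, 0.4]            # true linear radiances
ldr_window = [min(v, 1.0) for v in hdr_window]  # what an 8-bit JPG can store

print([reflect(v) for v in hdr_window])  # bright shapes stay distinguishable
print([reflect(v) for v in ldr_window])  # everything collapses to ~0.04
```

In the HDR case the reflected sun is still vastly brighter than its surroundings; in the LDR case the sun, the sky, and the bright wall all reflect at the same intensity, which is the "one big faint shape" effect.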
Well, I didn't know that. I thought people were calling them HDRs just because it sounded cool; I didn't think they were real HDR. Typically when I see a photo labelled HDR it looks grungy and hyper-real. I'll have to look further into this, but that guy Brandon from the video I posted is crazy successful at compositing VFX and he uses a damn iPhone to make his HDRIs. They look beautiful too. I'm now not sure I understand everything an environment texture can do. I'll check out your link.
Ok, but that’s practically physical global illumination.
The other obvious reason to go with a straight IBL setup without additional lamps is a one-off rig where you're just trying to match the lighting of footage shot on set and don't plan to reuse the rig later. In that case, just painting in an appropriate brightness for the light sources is certainly good enough.
There is a big difference between what is sometimes called the "HDR look", which is what you're talking about, and high dynamic range as it is used in the context of image-based lighting. In image-based lighting, HDR is really about accurately measuring light: creating an image where the value of each pixel is proportional to the amount of light received at that point in the real world. Unlike your low dynamic range JPG, which can only record a tiny fraction of the dynamic range in a real-world scene, an HDR image can potentially record the entire dynamic range. Plug that photographic record of real-world lighting conditions into a physically-based renderer and voilà: instant physically plausible lighting.
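The "pixel value proportional to light" idea can be sketched in a few lines: in a linear image, pixel ≈ radiance × exposure time, so each bracketed frame that is neither clipped nor crushed gives an estimate of the true radiance. This is a toy version with invented numbers; real merging tools also calibrate the camera's response curve.

```python
# Recover scene radiance from a bracketed exposure series (simplified:
# assumes linear pixel values, ignores the camera response curve and noise).
def recover_radiance(brackets):
    """brackets: list of (linear_pixel_value, exposure_time_seconds)."""
    usable = [(v, t) for v, t in brackets if 0.01 < v < 0.99]  # drop clipped/black
    if not usable:
        # Everything clipped: best guess from the shortest exposure.
        v, t = min(brackets, key=lambda p: p[1])
        return v / t
    return sum(v / t for v, t in usable) / len(usable)

# The same scene point shot at three shutter speeds; the 1/30s frame is
# nearly blown out and gets discarded, the others agree on the radiance.
print(recover_radiance([(0.99, 1/30), (0.5, 1/60), (0.25, 1/120)]))
```

The point is that the merged value has physical meaning (relative radiance), which is exactly what a physically-based renderer wants from an environment texture.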
Yeah, I didn't realize HDRs for the purpose of lighting a scene were different from HDR photos as a style. So what format do you save HDRs in? I know how to shoot them, but not really how they're compiled for Blender. Do you guys use that HDR filter in Ps? What does the process look like?
The tutorial at the start of this thread talks about formats a little bit, and for more details hdrlabs.com is a great resource. The HDRI Handbook 2.0, written by the main contributor to HDRLabs, goes into even more detail and is pretty much the bible for this sort of thing. As for Photoshop, as much as I like the HDRI Handbook, it would be about 200 pages shorter if the first sentence of the introduction were: "Don't use Photoshop." The book is absolutely full of hacks to kind of, sort of get Photoshop to do what you want. Autodesk seems to have given up on developing Photoshop to support serious VFX work long ago, and it is quite poorly suited to working with HDR images.
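On the format question: the two you'll run into most are Radiance RGBE (.hdr) and OpenEXR (.exr), both of which store float pixel values and so keep the >1 range a JPG throws away. Both are easy to recognize by their magic bytes; a small sniffer as an illustration:

```python
# Identify common HDR image formats by their file signatures.
# Radiance files start with "#?RADIANCE" (older ones use "#?RGBE");
# OpenEXR files start with the 4-byte magic 0x76 0x2F 0x31 0x01.
def sniff_hdr_format(header: bytes) -> str:
    if header.startswith(b"#?RADIANCE") or header.startswith(b"#?RGBE"):
        return "Radiance RGBE (.hdr)"
    if header.startswith(b"\x76\x2f\x31\x01"):
        return "OpenEXR (.exr)"
    return "not a recognized HDR format"

print(sniff_hdr_format(b"#?RADIANCE\n"))  # what a merged .hdr begins with
```

Blender's Environment Texture node reads both formats directly, so whichever your merging tool emits will work.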
Finally got some time to catch up on the discussion.
First of all, thanks for making the sIBL loader for Blender. I think it's tools like these that will make Blender a more attractive option for people like me who don't come from a CG background but are looking at Blender as our professional needs evolve to integrate 3D. I've only briefly played with the sIBL standard because so far render times haven't been a problem (I'm only interested in still images at the moment), so I've simply used the highest-resolution HDRI for everything instead of the different-resolution images the standard provides. But your point about using a lamp as a sun makes a lot of sense. I'll most likely end up doing something like that, also because I'll need a shadow pass, as I point out at the end of the article. Do you know if it's possible to create sIBL sets on a Mac? I suppose I could always write the file by hand, since it's text-based, but I was hoping for a GUI tool.
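For anyone considering the write-it-by-hand route: the .ibl file is indeed just an INI-style text file pointing at the set's images plus sun coordinates. A rough sketch of what one looks like (section and key names reproduced from memory of the sIBL convention, so verify against the .ibl file inside any downloaded sIBL set before relying on them):

```
[Header]
Name = "MyStudio"
Author = "Me"

[Background]
BGfile = "MyStudio_bg.jpg"
BGmap = 1

[Enviroment]
; low-res HDR used for the actual lighting
EVfile = "MyStudio_env.hdr"
EVmap = 1

[Reflection]
REFfile = "MyStudio_ref.hdr"
REFmap = 1

[Sun]
SUNcolor = 255,250,240
SUNmulti = 5.0
; normalized UV position of the sun in the panorama
SUNu = 0.62
SUNv = 0.29
```

The three image entries are what lets a loader use a cheap low-res HDR for lighting while keeping a high-res JPG as the visible background.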
As I'm getting more into Blender, your blog posts are always extremely informative and frequently show up in my Google results. Thanks!
I talk about this and show an example of the difference between using a low dynamic range image and a real high dynamic range one here. It kind of works with low dynamic range images, but as Greg mentioned, there's no image information in the blown highlights, so reflections in darker materials don't show any more detail. That said, it depends on the image; sometimes it seems adequate to just use a low dynamic range image and add contrast. Also, you can't beat the convenience of an iPhone app for shooting a spherical panorama.
I think you meant ‘Adobe’ here
The only thing I do is adjust the exposure (using the exposure function) in Photoshop, because otherwise they come out really dark by default; I underexpose more than I overexpose in order to capture the sun. It's not really a problem that they're dark, as all the information is still there, it's just annoying to browse a bunch of dark photos. 'Exposure' simply moves the histogram to the right, so no information should be lost by doing so.
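For the skeptical: on a float/32-bit image, an exposure adjustment really is just a multiply by 2^stops, and since nothing gets clipped at 1.0 it's fully reversible. A sketch in plain Python (assuming the adjustment is a pure linear multiply, which is what exposure means in linear light; no claim about Photoshop's exact internals):

```python
# Exposure on float pixel data: multiply by 2^stops. Powers of two scale
# binary floats exactly, so brightening and darkening round-trips losslessly.
def apply_exposure(pixels, stops):
    factor = 2.0 ** stops
    return [v * factor for v in pixels]

dark = [0.02, 0.1, 3.5]              # underexposed capture, sun still at 3.5
brighter = apply_exposure(dark, 2)   # +2 stops, nicer to browse
restored = apply_exposure(brighter, -2)
print(restored == dark)              # True: the histogram moved, nothing was cut
```

The same operation on an 8-bit JPG would clip everything pushed past 255, which is exactly why the dark-but-float workflow is safe and the JPG one isn't.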