Light Probe Images in Blender 2.33?

In the Blender 2.33 documentation on the page concerning Ambient Occlusion it is written that:

<< “Sky Texture” this performs a full render of sky color to be used as AO diffuse light.
Also note the new “AngMap” option in the texture channel for sky.
This allows usage of so-called Spherical Light Probe images. >>

Questions:

  1. Are the AO parameters active for the “Blender Internal” renderer, or only for the “Yafray” external renderer?
  2. Blender should accept a spherical probe image OK. For the Yafray external renderer it works fine, but with Blender Internal it doesn’t work. Does that mean that Blender Internal doesn’t read HDR images at all, or can it read HDRI in a specific file format, like TIFF, PFM or something else?

Does someone know exactly in which circumstances we are allowed to use the <<so-called Spherical Light Probe>> images in Blender 2.33?

Regards Fmurr

To use an AngMap in the Blender internal renderer, you must use a JPG or TGA image file. [email protected] has a thread in the Blender General forum (do a search) that has these AngMap files. You can also make your own from HDR files with HDRShop. It is a free application; do a Google search for it and you can download it.

With that, you must make the WORLD settings as follows:

Preview: REAL
Texture & Input: AngMap

Also, for the Blender internal renderer, you can use the diffuse energy from these AngMaps by selecting SKY TEXTURE in the AO settings.

To use HDRI images with Yafray, load the .hdr file into the texture instead of the JPG, and then use Yafray as the renderer. You will need to use the Skydome setting to get true HDRI lighting, though. For Pathlight, you will need to edit the XML file, AFAIK.

That I think answers your questions.



The problem is that I have read Paul Debevec’s light probe web pages,
and according to him and other material on the net about light probes, HDRI, IBL and lighting in general:
TGA and JPG files are not “High Dynamic Range Images”, and even less light probe images (which, according to P. Debevec, are 360° panoramic HDR images).
Thus, we cannot say that we can use light probes with Blender Internal.

Regards, Fmurr

You are correct in that statement. But if you use the SKY TEXTURE setting as I stated, the lighting is taken from the AngMap source: not as a light probe, but as an energy value, if I understand correctly.



Thanks for your kind responses.
I’m doing tests with that trick and will see how it goes.

Regards Fmurr

Lots of confusion again I see.

No, Pathlight will also use the background light; no editing needed. I would think Pathlight would be the first choice if you want true environment light interacting with the objects; with the skydome method you basically just get colored shadows.

In the end, HDRI is nothing more than an image format; there is no weird, complex math behind it or anything. There is no difference between using a normal image and an HDRI in how lighting/shadowing is calculated, neither in Yafray nor in Blender; the only difference is in the dynamic range.
Blender doesn’t read HDRI files, but the method is the same nevertheless. ‘Angular Map’ or AngMap is nothing more than another name for a light probe. It just describes how the image is mapped, a name like UV or sphere mapping: a full 360° (some only 180°) picture mapped onto a sphere.
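To illustrate the dynamic-range point, here is a minimal sketch in plain Python (the pixel values are made up for illustration): an 8-bit LDR format like JPG or TGA clamps radiance to [0, 1], so a bright light source loses its real intensity, which is exactly what HDR formats preserve.

```python
# Hypothetical radiance values for two pixels of an environment map:
sky_pixel = 0.8       # ordinary blue sky
sun_pixel = 5000.0    # direct sunlight, far above "white"

def to_ldr(value):
    """Simulate saving to an 8-bit format: clamp to [0, 1]."""
    return min(max(value, 0.0), 1.0)

# After an LDR round trip, the sun is only 1.25x brighter than the
# sky instead of 6250x. That lost ratio is the "dynamic range" that
# HDR files keep, and it is what makes HDR lighting look different.
print(to_ldr(sky_pixel))  # 0.8
print(to_ldr(sun_pixel))  # 1.0
```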

All that ambient occlusion or the skydome method does, when used with a background texture, is the following: for every point, look around to see if anything is casting a shadow; if not, use the background color as seen from that direction instead. That is all.
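The gather loop described above can be sketched in a few lines. This is a toy Monte Carlo version, not Blender’s or Yafray’s actual code; the occlusion test and background functions are stand-ins for a real scene.

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def ambient_term(is_occluded, background_color, n_samples=64):
    """Toy AO/skydome gather for one shading point: shoot random
    hemisphere directions; where nothing blocks the ray, take the
    background color seen in that direction; then average."""
    total = 0.0
    for _ in range(n_samples):
        # Pick a random direction over the hemisphere (uniform in
        # angles -- good enough for a sketch, not a proper pdf).
        theta = random.uniform(0.0, 3.14159 / 2)
        phi = random.uniform(0.0, 2 * 3.14159)
        if not is_occluded(theta, phi):
            total += background_color(theta, phi)
    return total / n_samples

# Example: an occluder blocks half the hemisphere, white sky.
shade = ambient_term(lambda t, p: p > 3.14159,  # blocked on one side
                     lambda t, p: 1.0)          # uniform white sky
# 'shade' comes out near 0.5: about half the samples see the sky.
```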

BTW, you can use regular images with Yafray too.

Also, I don’t consider myself an expert on this stuff despite having ‘played’ with it for several years, so I’m sure my explanations could be better… Sorry if it all sounds even more confusing now… :-?

AH! Great then. It just seems that the pathlight option doesn’t work as well, IMO. It is a lot faster though.


You mean when using it with cache? In that case you probably mean that it tends to look blotchy? If so, this is because the cache only samples every now and then (the white dots you see in the ‘fake pass’ render), while the skydome (hemilight) or non-cache full GI method actually samples everywhere. The higher dynamic range exaggerates this problem even more; a regular low-dynamic-range image tends to look smoother.
The problem is that Yafray doesn’t know anything about the image; it just samples it randomly, so if the picture contains small ‘light sources’, there is very little chance of hitting them. There are techniques to deal with this more intelligently, none of which are implemented in Yafray yet.
A simple method often suggested is to use a blurred image for the lighting instead. But then shadows are also less defined.
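To put a rough number on the small-light-source problem: if the bright spot covers a fraction p of the map and the renderer picks directions uniformly, the chance of hitting it at least once in N samples is 1 - (1 - p)^N. A quick check with hypothetical numbers:

```python
# Chance of a uniform sampler ever hitting a tiny bright region
# of an environment map (numbers are hypothetical).
p = 0.0001          # the light source covers 0.01% of the image
n_samples = 100     # samples per shading point

p_hit = 1.0 - (1.0 - p) ** n_samples
print(round(p_hit, 4))  # 0.01

# Only about 1 shading point in 100 ever "sees" the light, so
# neighbouring points disagree wildly -> the noisy/blotchy look.
# Blurring the map spreads the light's energy over a larger area,
# raising p and smoothing the result, at the cost of softer shadows.
```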