Yeah, I know it’s very simple. But hey, it’s my first HDRI render
you should put it on a plane (so you don’t see the funky background image) and also edit the materials so they are more interesting
good choice of the uffizi probe as it has a high resolution
8)
Yep, the uffizi probe gives great results.
And I agree about the plane. See what a difference it makes:
Also, you can press the TGA Alpha button, so you will get no BG image displayed at all. You will just get the ball.
Good work.
BgDM
Thanks for the tips
This is yet again one of those images that are really, really simple, but look amazingly great
I do not want to sound impolite, and I am really interested to know the answer.
What is amazing about these renders and HDRI in general?
I can see irregularity in the lighting of the object (a sphere, in this case). It is obviously produced by some “real” data, taken from the “environment” around the object. But, since the environment is a randomly picked image, does it matter to the viewer that the irregularities are produced from an image that has no real meaning?
Lighting could be made irregular by applying some texture to the spots in the scene, instead.
Again, this is a question, plus a description of my opinion up to the point of asking. I hope there is an explanation, which I will be happy to hear and which will change my opinion - I am prepared :
Emil - congratulations on your success in using HDRI. Looking forward to your next renders with it!
Regards,
Anton42
Well, HDRI is an amazing way of lighting a scene.
It is indeed produced by real data. The image is used to calculate the lighting in the scene (don’t ask me specifically how, I’m not that technical ). The images are in what is called an .hdr format, which stores a much wider range of light intensity than a standard jpg or tga file can hold. This is how HDRI works and produces such realistic renders.
As far as the HDRI image having no meaning, you do not have to show the image reflecting in the objects in the scene. You can use standard materials with textures and still use the HDRI file to light the scene and produce a great image.
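Very roughly, the idea is something like this (a plain numpy sketch of image-based lighting in general, just for illustration - this is not Blender’s actual code): the environment image is treated as light arriving from every direction, and the lighting at a surface point is an average of the image over the hemisphere above that point.

```python
import numpy as np

def lookup(env, direction):
    """Return the environment colour for a world-space direction,
    assuming 'env' is a latitude-longitude HDR image (H x W x 3 floats)."""
    x, y, z = direction
    u = np.arctan2(x, -z) / (2.0 * np.pi) + 0.5      # longitude -> [0, 1]
    v = np.arccos(np.clip(y, -1.0, 1.0)) / np.pi     # latitude  -> [0, 1]
    h, w, _ = env.shape
    return env[int(v * (h - 1)), int(u * (w - 1))]

def diffuse_light(env, normal, samples=1024, seed=0):
    """Monte-Carlo estimate of the diffuse lighting on a surface with
    the given normal: average the environment over the hemisphere,
    weighted by the cosine of the incoming angle."""
    rng = np.random.default_rng(seed)
    total = np.zeros(3)
    for _ in range(samples):
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)              # uniform random direction on the sphere
        cos_theta = float(np.dot(d, normal))
        if cos_theta > 0.0:                 # keep only directions above the surface
            total += lookup(env, d) * cos_theta
    # the pdf of a uniform direction on the full sphere is 1/(4*pi)
    return total * (4.0 * np.pi / samples)

# Toy environment: dim everywhere except one bright "window" patch,
# so the lit side of an object would clearly face that patch.
env = np.full((64, 128, 3), 0.05, dtype=np.float32)
env[10:20, 30:40] = 40.0
print(diffuse_light(env, normal=np.array([0.0, 1.0, 0.0])))
```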
Yes you can get irregularities with textures on your spot lights. However, you still do not get the range of lighting that the HDRI image generates.
Hope that explains some stuff for you.
BgDM
BgDM, thanks for the explanation.
But (how do I say this with all respect) I already knew most of what HDRI is, since I was carefully reading the links you (and some other people) have provided in various sections of the forums here.
My question is more related, probably, to your saying that:
> Yes you can get irregularities with textures on your spot lights.
> However, you still do not get the range of lighting that the HDRI
> image generates.
How does the “range of lighting” affect the picture, other than producing irregular light?
I think (and this is only an opinion, so you can and should disagree if I am wrong) HDRI allows setting up lighting in a believable way much faster than manually placing and tweaking lights. And, for a “real” project (whatever that is), someone might actually create their own .hdr image, to also produce realistic lighting…
So, if I am creating a global illumination sphere, and say that a building should be blocking it from one side, I could reduce the intensity of the spots on that side, or model a building on that side to actually block the light (but I would still need to place some spots on the building to simulate the light it reflects), or reproduce the whole environment in an hdr image (could I do most of it in PS, for example, by hand?), and then use that image to produce the needed results.
Is this correct?
Anton42
It produces shadows.
A non-HDRI image has a range of 0-255, coming from its byte encoding, which allows a pixel to be at most 255 times brighter than another.
This can lead only to nice cloudy light, like the one you get with a uniform spot skydome in Blender or with a YafRay Hemilight: no distinct point where the light is coming from.
HDRI dynamic range is, ehm, high (High Dynamic Range Image - HDRI), hence, if your probe is well done, you can have realistic situations with a sun millions of times brighter than any other pixel of the sky, yet those other pixels still shedding an appreciable light.
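To put some made-up numbers on it (just a plain Python illustration, nothing Blender-specific):

```python
import numpy as np

# Made-up values, just to illustrate the point about dynamic range.
sky_patch = 1.0        # an average bit of blue sky
sun_disc  = 1.0e6      # the sun: about a million times brighter

# In an 8-bit image both get squeezed into 0..255, so the brightest
# pixel can be at most ~255x the dimmest non-black pixel; the sun
# simply saturates:
ldr = np.clip(np.array([sky_patch, sun_disc]), 0, 255).astype(np.uint8)
print(ldr[1] / ldr[0])     # 255.0 -- the real ratio is gone, light looks "cloudy"

# A float .hdr image keeps the real ratio, so when it is used for
# lighting there is one dominant direction and you get a distinct shadow:
hdr = np.array([sky_patch, sun_disc], dtype=np.float32)
print(hdr[1] / hdr[0])     # 1000000.0
```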
> I think (and this is only an opinion, so you can and should disagree
> if I am wrong) HDRI allows setting up lighting in a believable way
> much faster than manually placing and tweaking lights. And, for a
> “real” project (whatever that is), someone might actually create
> their own .hdr image, to also produce realistic lighting…
Technically it is correct. Anyway, it is tiresome at best.
Stefano
Thanks, Stefano, that’s more information that helps me understand.
So, could I paint the probe in Photoshop with a broad brush (since precision is not key here) and then tweak the intensities in HDRShop, to create my “sunny sky with clouds” probe, and light the scene with it (instead of using a GI sphere)?
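Something like this is what I have in mind for the data itself - just a numpy sketch of the kind of values such a probe would hold (all names and numbers are made up); the actual painting and saving would still happen in Photoshop/HDRShop or any other HDR-capable tool:

```python
import numpy as np

# A hand-made "sunny sky with clouds" probe, as data: a small
# latitude-longitude float image with a dim sky and one very bright
# sun disc.  All values here are arbitrary placeholders.
height, width = 256, 512
probe = np.zeros((height, width, 3), dtype=np.float32)

# Dim bluish sky everywhere.
probe[:, :] = [0.3, 0.4, 0.8]

# A few brighter cloud bands, painted with a "broad brush".
probe[100:130, :] = [1.5, 1.5, 1.6]

# A small, extremely bright sun disc -- the part an 8-bit image
# could never store.
sun_v, sun_u, radius = 60, 380, 6
yy, xx = np.ogrid[:height, :width]
sun_mask = (yy - sun_v) ** 2 + (xx - sun_u) ** 2 <= radius ** 2
probe[sun_mask] = [50000.0, 48000.0, 45000.0]

# This ratio is the "range of lighting" an 8-bit image cannot hold.
print(probe.max() / probe[probe > 0].min())
```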
Would it be faster than using a GI sphere, modelling- or rendering-wise?
The reason I am asking all this is that I believe every tool has its use, and I am trying to understand the use for the tool of HDRI-based lighting.
Regards,
Anton42