I’m new to the Blender world and maybe my question has an easy answer, but I’ve encountered some strange behavior of the internal rendering engine in my tests.
I have a scene with a UVSphere, some lights, and an ICOSphere simulating G.I. I also have a material, of course, which I’ve applied to the UVSphere.
The material is a simple simulation of tree bark; it includes several textures, two of which are mapped to Disp to displace the UVSphere and render a sort of crack pattern in the surface.
And now the problem!
1 - When I render using only the spots as light sources and Radiosity disabled for the scene, I get this result:
2 - When I render using the spots plus the G.I. ICOSphere as light sources and Radiosity enabled, this is what comes up on my screen:
It seems as if the Radiosity function changes the strength of my displacement-mapped textures on its own…
How do you explain that? The material is the same, the textures are the same, and so are the camera, the lights, and the whole scene.
What is responsible for these differences? How can I get the same result with and without Radiosity?