Yafray v. Blender Internal: Different Images

When I render a simple scene with Blender and Yafray (without changing any settings between the two), it produces two different results. Why is this? (The one with the background is from Blender.)

Thanks,
V

Short answer: Yafray doesn’t support everything.

Mystery

Indeed, quite a quandary.
The internal Blender render engine is just not good enough, and the external ones are not compatible.
I find the interface and usability of Blender quite acceptable, but when it comes time to get my image out of it, I end up in a fight (one that I generally lose before giving up).

O_o The couple of times I’ve used Yafray I didn’t get anything like “your camera / scene is suddenly in a completely different place” . . . .

That’s really got me going WTH. Maybe post the .blend file so someone who actually knows more about Yafray can take a look at your camera setup?

Yafray is a raytracer and Blender’s internal engine is a scanline renderer. Yafray can’t simply pre-process a gradient background and then composite it behind the image afterwards (which is what Blender does), because that background is not real; it is a cheat.

In Yafray you must reproduce that background like any other element in the scene, with physical properties. In Blender, a background is just an image placed behind the render. In Yafray, the background is part of a physical world, able to be intersected by rays and subject to the usual math.

That is why some things just don’t work in Yafray. For instance, Yafray can’t render discrete elements that carry no mesh data, like particles.
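
To make that concrete, here’s a minimal sketch in plain Python (none of this is actual Yafray or Blender code; the scene class and function names are made up for illustration). A scanline pipeline can paste a 2D gradient behind the finished pixels, while a raytracer has to answer “what does this ray see?” on every miss, including for reflected and refracted rays:

```python
# Toy illustration of the two background models; all names hypothetical.

def gradient(t):
    """Vertical white-to-blue gradient, t in [0, 1]."""
    return (1.0 - t, 1.0 - t, 1.0)

def composited_background(width, height):
    """Scanline-style: a plain 2D image pasted behind the render,
    indexed by pixel row. No rays are involved."""
    return [[gradient(y / (height - 1))] * width for y in range(height)]

class EmptyScene:
    """Stand-in scene with no geometry, so every ray misses."""
    def intersect(self, origin, direction):
        return None

def trace(origin, direction, scene):
    """Raytracer-style: the background is whatever a ray sees when it
    hits nothing, so reflected/refracted rays reach it too."""
    hit = scene.intersect(origin, direction)
    if hit is None:
        # Map the ray's vertical component (-1..1) onto the gradient.
        return gradient(0.5 * (direction[1] + 1.0))
    return hit  # shading of real geometry would happen here

print(trace((0, 0, 0), (0.0, 1.0, 0.0), EmptyScene()))   # looking up
print(trace((0, 0, 0), (0.0, -1.0, 0.0), EmptyScene()))  # looking down
```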

Yep, Alvaro is right about the background: only the “real” + “blend” background types fit Yafray’s concept (which I actually implemented in the redesigned version already).
It does not (and will not attempt anytime soon to) do any kind of z-buffered scene compositing, like “paper” backgrounds, 2D particles, etc.
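
For what that looks like in practice, here’s a toy sketch of a “blend”-type background in raytracer terms (the function and its parameters are hypothetical, not Yafray’s actual API): the colour is a pure function of ray direction, so any ray can query it, whereas a z-buffered “paper” background exists only in pixel space and has no answer for an arbitrary ray:

```python
def blend_background(direction,
                     horizon=(0.9, 0.9, 1.0),
                     zenith=(0.2, 0.4, 0.9)):
    """Hypothetical 'blend' background: linearly interpolate from the
    horizon colour to the zenith colour using the ray's elevation
    (the y component of a normalized direction, in -1..1)."""
    t = max(0.0, direction[1])  # 0 at or below the horizon, 1 straight up
    return tuple(h + t * (z - h) for h, z in zip(horizon, zenith))

print(blend_background((0.0, 0.0, -1.0)))  # looking at the horizon
print(blend_background((0.0, 1.0, 0.0)))   # looking straight up
```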

Why it’s rendering upside down I don’t know, but I think there’s still a problem with dupliframe’d objects…