Just a thought I had about raytracing (for raytraced reflections, really) inside of Blender.
Even with the possibility of exporting to Yafray, it’s not really practical for animations due to the long render times, so it would be great to get raytraced reflections inside Blender…
I was wondering whether it would be possible to do this by writing a texture plugin. When the scanline renderer hits a pixel that uses the texture plugin, the plugin would need to know the co-ordinates of the point and the normal direction, then call the normal scanline rendering code to get the reflected pixel value…
Is this something that might be feasible - or fast enough to be worthwhile?
Pixar’s PRMan renderer can use BMRT to do raytrace solutions - could something similar be used in Blender? I believe PRMan is a scanline-type renderer, but I don’t know whether it’s so different from Blender’s rendering engine that something similar wouldn’t be possible. The main problem I see is that PRMan and BMRT both use the RenderMan standard, and I don’t know of any raytracer that uses the same format as Blender.
Then again, even Pixar stopped using raytracing in its animations. With a little work, environment maps can simulate reflection and refraction that look as good as raytracing in most situations, so maybe it’s not worth the time to come up with the code for this?
I was actually wondering whether you could call Blender’s own rendering code from a texture plugin… Then you wouldn’t have to worry about external software etc… I know almost nothing about serious programming, though!