light baking to vertices

Hey renderdemon - never thought of adding that in.
It's not supported at the moment, but adding support wouldn't be that tricky.

I have yet to see worldspace normal maps put to use, and was hoping they could be overlaid and renormalized in the GIMP with the lower-poly mesh to create tangent-space normal maps - maybe this could be automated later.

Are you sure the current normal map exporter is useful to you? If so, I'll add in animation support.

Cambo, if you can add animation support it could be VERY useful (I need a sequence of UV-mapped images with baked object-space normals, possibly with extruded UV seams).
Let me explain better.
I'm working on subsurface scattering, and I think I have a trick to do it reasonably well (and fast).
But my method only works for animation of static objects (rotation, scaling and translation), not deformed meshes (armatures and morphing).
This is because I use my custom node for object-space normal mapping, which does the scattering stuff.
I use it (and not Blender's normal mapping) because Blender's normal maps are wrong: the normals are not transformed to camera space like my node does. If we could use tangent-space normal maps I wouldn't need this feature, but:
A) I have a problem with my method (I have tried to add support for tangent space too); something is not correct and I don't know what.
B) The recent Blender addition for supporting tangent-space normal maps doesn't work well (it gives problems at UV seams, because the tangents are interpolated continuously over UV space, while only tangents on the same UV island should be interpolated together).
C) Last, and worst, even when correctly computed and rendered, tangent-space normal maps are enormously difficult to use for this stuff; integrating them is hard because they are deltas.
Now, if I can pass the object-space normals via a texture, I can integrate them simply by using the texture filter!
For multiple scattering it works beautifully, take a look here:
http://blenderartists.org/forum/showthread.php?t=78400
The other option would be a Python script which, based on a radius, averages the normals and, using a shadeless material with vertex colors, passes this information to my custom normal mapping node. But I think that would be extremely slow (well, maybe using your octree it could be faster), and a preliminary attempt made Blender crash; I don't know if updating vertex colors frame by frame works at all.
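Roughly the kind of script I mean is below - only a sketch, assuming the 2.4x Mesh API, with an illustrative RADIUS value and a brute-force neighbour search that would be far too slow on a dense mesh (an octree lookup would replace the inner loop in practice):

# Sketch: average vertex normals within a radius and pack them into vertex colors.
# Blender 2.4x Python API assumed; RADIUS is only an example value.
import Blender
from Blender import Object, Mathutils

RADIUS = 0.1  # averaging radius in object space

ob = Object.GetSelected()[0]
me = ob.getData(mesh=1)
me.vertexColors = True  # make sure a vertex color layer exists

# Brute-force O(n^2) averaging of normals within RADIUS of each vertex.
avg = []
for v in me.verts:
    n = Mathutils.Vector(0.0, 0.0, 0.0)
    for w in me.verts:
        if (w.co - v.co).length <= RADIUS:
            n += w.no
    n.normalize()  # v is always within RADIUS of itself, so n is never zero
    avg.append(n)

# Store the averaged normal in the face color layer, remapped from [-1,1] to [0,255].
for f in me.faces:
    for i, v in enumerate(f.verts):
        n = avg[v.index]
        col = f.col[i]
        col.r = int((n.x * 0.5 + 0.5) * 255)
        col.g = int((n.y * 0.5 + 0.5) * 255)
        col.b = int((n.z * 0.5 + 0.5) * 255)

me.update()
Blender.Redraw()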

Cambo,

Here’s a quick test - I set it up with exaggerated Yafray settings (so you have an idea of what I’m talking about), but I can’t seem to get anything similar with Blender Internal. Maybe BI doesn’t bounce light beyond the first tracing?

http://home.earthlink.net/~severnclay/images/lightingtest.blend

Sorry this is such an esoteric problem…

RS

The area you're rendering happens to be in shadow.
Try using an area light, or additive AO - an area light worked for me in your test.
Just make sure it's about size 4+ with 4+ samples.
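If it helps, the same lamp can be set up from a script - a sketch only, using the 2.4x Lamp/Object API as I remember it, so double-check the attribute names against the API docs:

# Sketch: add an area lamp with the size/samples suggested above.
# Blender 2.4x API assumed; attribute names should be checked against the Lamp docs.
import Blender
from Blender import Lamp, Object, Scene

lamp = Lamp.New('Area', 'FillArea')
lamp.areaSizeX = 4.0   # "size 4+"
lamp.raySamplesX = 4   # "4+ samples"
lamp.energy = 1.0

ob = Object.New('Lamp', 'FillAreaOb')
ob.link(lamp)
ob.setLocation(0.0, 0.0, 5.0)  # position it above the scene; adjust to taste

scn = Scene.GetCurrent()
scn.link(ob)
Blender.Redraw()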

Cambo,

If you look at the setup (I told you this was an esoteric problem), the horizontal plane should be getting reflected light from the white cube next to it. The area light would work, but wouldn’t give any indication of the amount of light actually bouncing off the cube. I guess Yafray is going to be best at more optically-correct lighting in any case - I just need to pare down the problem size so that rendering time doesn’t kill me.

RS

This is pretty much down to how lighting works in Blender: Blender's lighting doesn't do photon bouncing. You can do reflections with noise to simulate it, or glass-like reflections…
You seem to be trying to use radiosity, but for that to work you'll need an emitter (a mesh with a material emit value > 0), and the plane you're rendering needs to be subdivided.
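For reference, those two requirements in script form - just a sketch against the 2.4x API, with 'Cube' and 'Plane' as placeholder object names:

# Sketch: radiosity prerequisites - an emitting material on the cube, a subdivided plane.
# Blender 2.4x API assumed; 'Cube' and 'Plane' are placeholder object names.
import Blender
from Blender import Object, Material

# 1) The bouncing mesh needs a material with Emit > 0.
cube_me = Object.Get('Cube').getData(mesh=1)
mat = Material.New('Bouncer')
mat.emit = 0.5                 # any value above 0 makes it an emitter
cube_me.materials = [mat]

# 2) The receiving plane needs enough vertices to store the bounced light on.
plane_me = Object.Get('Plane').getData(mesh=1)
for i in range(3):             # a few rounds of subdivision
    for e in plane_me.edges:
        e.sel = 1              # subdivide() works on selected edges
    plane_me.subdivide()
plane_me.update()
Blender.Redraw()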

Cambo,

How do I access the render window buffer? I’ve tried Image.Get("Render Result"), but that doesn’t seem to reliably find the image that’s just been rendered (sometimes it doesn’t find anything). Is there a better way? I’d send code, but I can’t get the problem to reliably happen one way or the other.

Thanks,
RS

It's crufty at the moment: you have to save the render result, then load the image back into Blender.

See BPyRender again for an example.

The Render Result isn't strictly an image, so accessing its pixels may have to be done via the render context, if at all.
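Something like this is the pattern BPyRender uses - a sketch, assuming RenderData.saveRenderedImage() is available in your build, with a throwaway path as an example:

# Sketch: render, save the result to disk, then load it back as a normal Image block.
# Blender 2.4x API assumed; TMP_PATH is only an example location.
import Blender
from Blender import Scene, Image
from Blender.Scene import Render

TMP_PATH = '/tmp/render_result.png'   # pick somewhere writable on your system

scn = Scene.GetCurrent()
context = scn.getRenderingContext()
context.imageType = Render.PNG        # save as PNG
context.render()                      # render the current frame
context.saveRenderedImage(TMP_PATH)   # write the render buffer to disk

img = Image.Load(TMP_PATH)            # now it behaves like any other Image
print img.size, img.depth
print img.getPixelF(0, 0)             # [r, g, b, a] floats for one pixel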