I am trying to render depth maps and surface Normal maps of different 3D shapes using Cycles. What I care about most is how much of the detail of the 3D shapes is captured in the depth maps and surface Normals.
To obtain surface Normals, I first run
bpy.context.scene.render.layers["RenderLayer"].use_pass_normal = True
or simply enable Normal in the RenderLayer settings, and then render the Normal pass.
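For reference, the relevant passes can also be enabled together from a script. This is a minimal sketch assuming the Blender 2.7x API (where render layers live under `scene.render.layers`) and the default layer name "RenderLayer" used in the question:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Enable the Z (depth) and Normal passes on the render layer.
# Assumes the default layer name "RenderLayer", as in the question.
layer = scene.render.layers["RenderLayer"]
layer.use_pass_z = True
layer.use_pass_normal = True
```

Both passes then show up as sockets on the Render Layers node in the compositor, where they can be routed to a Viewer or File Output node.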
I noticed that Blender’s internal renderer captures the details quite well, as shown below (though not for depth maps). By details I mean that transparent materials such as glass do not affect the Normals, which is what I want.
However, a problem with the internal renderer is that the surface Normals depend on the camera angle, which is not what I want. Even if there were a way to get camera-view-invariant surface Normal maps out of Blender’s internal renderer, it would still not suit my needs for technical reasons. Here are the results:
Using Blender’s internal renderer:
However, when I switch to Cycles I get the rendering below; here the surface Normals are at least invariant to the camera angle.
Although the Cycles materials also have properties such as transparency, the renderings capture none of the details I am interested in. The same is true when rendering depth maps.
Depth map rendering using Cycles:
As you can see, none of the details I am interested in are captured in either the depth map or the surface Normal map. Does anyone know how I can create node trees that capture detailed depth maps and surface Normal maps reflecting the details of 3D shapes with respect to their materials?
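One thing that may be making the Cycles depth map look flat: the raw Z pass stores actual camera-space distances in scene units, so viewed directly, most of the range maps to near-white and the detail is washed out. Normalizing the values to [0, 1] (for example with a Normalize or Map Value node in the compositor, or in a script after saving the pass) brings the detail back. A minimal sketch of that normalization in plain Python, with made-up pixel values for illustration:

```python
def normalize_depth(depths, background=1e10):
    """Rescale raw Z-pass distances to [0, 1], ignoring background pixels.

    Cycles writes a very large value for rays that hit nothing; treat any
    depth >= `background` as background and map it to 1.0 (farthest).
    """
    fg = [d for d in depths if d < background]
    if not fg:
        return [1.0] * len(depths)
    near, far = min(fg), max(fg)
    span = far - near or 1.0
    return [min((d - near) / span, 1.0) for d in depths]

# Hypothetical depth values in Blender units:
print(normalize_depth([2.0, 2.5, 3.0, 1e10]))  # → [0.0, 0.5, 1.0, 1.0]
```

The same rescaling is what the compositor's Normalize node does for you, so if the depth render looks uniformly white it is worth checking whether the pass is being normalized before display.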
I also share the .blend file I used to get these results:
I also upload the obj file used here, along with its corresponding .mtl file. Note that if you import the obj file while Blender Render is active, the material will only work with Blender’s internal renderer, so you may want to switch to Cycles first and then import the obj file.
Download the obj file