Apparently if you use a sub-d level of 4, you can get ZBrush maps to work. If so, I will test my forearm out this way. By the way, to export an object into ZBrush you have to apply the subsurf modifier so it converts the sub-d surface to polygons. I would use a sub-d level of 1; I don't know if you can use 0, which would be even better. You may want to forgo sub-d's during the export entirely. This works rather well for exporting OBJs into Maya.
If you're exporting to Maya you really don't need to subdivide the mesh; it may be best just to leave off the sub-d's. Yes, for animation it may be useless, but can't Blender use normal maps?
Normal maps are good, but they were really made for game design, to increase the level of realism while maintaining the speed of the 3D engine.
For photorealistic rendering I don't think they are always the tool you want to reach for, specifically once you have access (and I hope we soon have better displacement) to micro-displacement.
For now normal maps should be the way to go. They are cheap and look good.
Micro-displacement isn't always a good choice, because (probably) the implementation will make displacement slower than it is now; Blender isn't a REYES renderer.
I'm testing Blender and ZBrush, trying to use ZBrush as a modelling tool. For subsurfed (level 3 or 4) meshes the results for low-frequency detail are good, but you need to use either two 8-bit displacement maps or one 32-bit displacement map (TIFF converted to HDR). For high-frequency detail a bump map gives good results, but with normal maps the results could be much better.
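As an aside on why a single 8-bit map struggles with low-frequency detail: 8 bits give only 256 height levels, so a slow ramp bands into visible steps, while a 32-bit float map keeps the gradient smooth. A quick NumPy sketch (not Blender or ZBrush code, just illustrating the precision gap):

```python
import numpy as np

# A smooth displacement ramp, as a renderer would ideally sample it.
ramp = np.linspace(0.0, 1.0, 4096, dtype=np.float32)

# Quantize to 8 bits (256 levels) and back, as an 8-bit map would store it.
ramp_8bit = np.round(ramp * 255.0) / 255.0

levels_8 = len(np.unique(ramp_8bit))   # distinct heights after 8-bit storage
levels_32 = len(np.unique(ramp))       # distinct heights in 32-bit float

print(levels_8)   # 256  -> the ramp collapses into visible steps
print(levels_32)  # 4096 -> the ramp survives intact
```

This is why two 8-bit maps (or one true 32-bit map) are needed before the low-frequency displacement stops stepping.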
I really need to get ZBrush. I got the demo, but it won’t let me import stuff. But it looks like such a powerful tool, and it seems to work well with Blender.
Micropolygons offer a lot of great advantages: for example, you no longer have to worry about tessellation and always get a perfectly smooth surface. Another aspect is sub-pixel displacement. It would be awesome to see this in the internal renderer because of its tight integration, although I don't know how likely that is.
Normal maps also work well for adding high-frequency detail to a mid-range poly model, so you don't need as high a level of subdivision as you would to displace the whole thing. Unfortunately, Blender doesn't support tangent-space normal maps except on flat surfaces.
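The tangent-space maths itself is simple; the missing piece in Blender is plumbing it into the renderer. A minimal sketch (plain Python with hypothetical inputs, not Blender code) of decoding a tangent-space normal-map texel and rotating it into object space with the TBN basis:

```python
import numpy as np

def decode_normal(rgb):
    """Map an 8-bit normal-map texel from [0, 255] to a unit vector in [-1, 1]."""
    n = np.asarray(rgb, dtype=np.float64) / 255.0 * 2.0 - 1.0
    return n / np.linalg.norm(n)

def tangent_to_object(n_ts, tangent, bitangent, normal):
    """Rotate a tangent-space normal into object space via the TBN matrix."""
    tbn = np.column_stack([tangent, bitangent, normal])  # columns = T, B, N
    return tbn @ n_ts

# The "neutral" texel (128, 128, 255) should map back onto the surface normal.
t = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
n = np.array([0.0, 0.0, 1.0])
flat = tangent_to_object(decode_normal((128, 128, 255)), t, b, n)
print(flat)  # very close to [0, 0, 1]
```

On a curved surface T, B and N vary per vertex, which is exactly why the map can wrap around the model instead of only working on flat faces.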
I'm not sure how well even normal maps can compare to micro-displacement. True displacement not only gives stone and other irregular surfaces a much better result, it is also resolution-independent when procedural; furthermore, true displacement can be seen as a modelling tool, because it creates a true 3D mesh. You will see the difference especially at edges.
What about texture-level parallax displacement with occlusion mapping? Then you don't even have to add any geometry… I like to keep my poly count low if at all possible.
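A rough sketch of how parallax occlusion mapping works (plain Python on a 1-D height field; real implementations run this per pixel in a fragment shader, and the function and height map here are made up for illustration): march along the view ray through the height map until the ray dips below the surface, then sample the texture at the shifted coordinate instead of adding geometry.

```python
def parallax_offset(heights, u, view_dx, view_dz, depth_scale=0.1, steps=32):
    """Linear-search parallax occlusion mapping on a 1-D height field.

    heights    : function u -> height in [0, 1] (the height map)
    u          : texture coordinate where the view ray enters the surface
    view_dx/dz : view direction in tangent space (dz < 0, looking into surface)
    Returns the shifted texture coordinate to sample colour/normals at.
    """
    layer = depth_scale / steps                 # depth advanced per step
    du = view_dx / abs(view_dz) * layer         # texture shift per depth layer
    depth = 0.0
    while depth < depth_scale:
        surface = (1.0 - heights(u)) * depth_scale  # surface depth below the top
        if depth >= surface:                    # ray has gone under the surface
            return u
        u += du
        depth += layer
    return u                                    # ray exited through the bottom

# Toy height map: a single bump centred at u = 0.5.
bump = lambda u: max(0.0, 1.0 - abs(u - 0.5) * 8.0)
print(round(parallax_offset(bump, 0.45, view_dx=1.0, view_dz=-1.0), 3))  # -> 0.475
```

The grazing ray entering at 0.45 "hits" the bump slightly further along, which is what makes the flat polygon appear to have real relief without any extra vertices.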
I have been wanting decent render-time displacement for a while, but it doesn't seem to be in the pipeline for Blender at the moment. What seems like a more likely option is that Blender will get a decent exporter to Aqsis, which does support per-pixel displacement at render time. I haven't used Aqsis yet, but if the interface in Blender could support RenderMan-compliant renderers (Aqsis is a RenderMan renderer) the way it supports YafRay (and, with a patch, POV-Ray), that would be sweet.
3Delight seems to be a RenderMan-compliant renderer as well. I think if one RenderMan-compliant renderer is supported, they probably all would be, since the export is a standard. I reckon what should happen is that the main drop-down list gets a "RenderMan-compliant" option, and then a tab (like the one for YafRay) holds further options, such as which renderer to use (maybe a drop-down list of some that plug straight in).
So 3Delight supports "per-pixel" displacement at render time?