ZBrush and Blender displacement tests!!!

http://www.zbrushcentral.com/zbc/showthread.php?p=271055#post271055

Apparently if you use a Sub-D level of 4, you can get ZBrush maps to work. If so, I will test my forearm out this way. Btw, to export an object into ZBrush you have to apply the Sub-D modifier so it converts the subdivision surface to polygons. I would use a Sub-D level of 1; I dunno if you can use 0, which would be even better. You may want to forgo using Sub-Ds during the export. This works rather well for exporting OBJs into Maya.

eh, Blender lacks what it needs, REAL micro-displacements … otherwise it’s just too slow to tinker with in animation

It’s useless to say that you need a Sub-D level of this or that, as it totally depends on the mesh.

If you’re exporting to Maya you really don’t need to subdivide the mesh. It may be best just to leave off the Sub-Ds. Yes, for animation it may be useless, but can’t Blender use normal maps?

normal maps are good, but they are rather made for game design, to increase the level of realism while maintaining the speed of the 3d engine.

for photorealistic rendering i think it is not a tool you always want to go with,
specifically when you have access (and i hope soon we will have better displacements) to micro-displacements.

ah well depends on what you work on anyway.

Yes, I’m aware of that. But it is uncertain when, if ever, Blender will get micropoly displacement.

For now, normal maps should be the way to go. They are cheap and look good.
Microdisplacement isn’t always a good choice, because the implementation will (probably) make displacement slower than it is now; Blender isn’t a REYES renderer.
I’m testing Blender and ZBrush, trying to use ZBrush as a modelling tool, and for subsurfed (level 3 or 4) meshes the results for low-frequency details are good (but you need to use either two 8-bit displacement maps or one 32-bit displacement map (TIF converted to HDR)). For high-frequency details a bump map gives good results, but with normal maps the results could be much better.
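The bit-depth point above can be illustrated with a small, self-contained Python sketch (nothing here is ZBrush- or Blender-specific; the value range is hypothetical): an 8-bit map can only store 256 height levels, so a smooth low-frequency gradient quantizes into visible steps, which is why the post needs either two stacked 8-bit maps or one 32-bit float map.

```python
# Illustrate the quantization error of an 8-bit displacement map on a
# smooth low-frequency ramp. A 32-bit float map stores each sample
# essentially exactly, so it has no comparable stepping.

def quantize_8bit(h, h_min=0.0, h_max=1.0):
    """Round a height value to the nearest of 256 representable levels."""
    level = round((h - h_min) / (h_max - h_min) * 255)
    return h_min + level / 255 * (h_max - h_min)

# Sample a smooth 0..1 ramp at 1000 points.
samples = [i / 999 for i in range(1000)]

# Worst-case 8-bit error is half a quantization step, i.e. 1/510.
max_err_8bit = max(abs(h - quantize_8bit(h)) for h in samples)
print(f"max 8-bit quantization error: {max_err_8bit:.6f}")
print(f"distinct 8-bit levels used:   {len({quantize_8bit(h) for h in samples})}")
```

The 256-level ceiling is the stepping you see on broad, gentle forms; high-frequency detail hides it, which is why bump or normal maps remain fine for that range.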

Blender only does normal maps on flat surfaces… useless for round surfaces

It would be nice for Blender to have micropoly displacement, but I think the render recode will make it easier to add features like that.

So perhaps it’s a feature that could stem from the 2.42 render recode, sometime after 2.42.

I really need to get ZBrush. I got the demo, but it won’t let me import stuff. But it looks like such a powerful tool, and it seems to work well with Blender.

Micropolygons offer a lot of great advantages, for example you don’t have to worry anymore about tessellation and always have a perfectly smooth surface. Another aspect is the sub-pixel displacement. It would be awesome to see this in the internal renderer because of its tight integration although I don’t know how likely that would be. :wink:
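The “never worry about tessellation” property comes from REYES-style dicing: each patch is split until its micropolygons project to no more than the shading rate in pixels. A rough Python sketch of that decision, using a crude pinhole-camera estimate (all names and parameters here are invented for illustration):

```python
import math

def dice_rate(patch_width_world, distance, focal_px, shading_rate=1.0):
    """How many micropolygons to dice a patch into along one axis so
    that each one covers at most `shading_rate` pixels on screen.
    Hypothetical pinhole-camera model, for illustration only."""
    # Projected size of the patch in pixels.
    projected_px = patch_width_world * focal_px / distance
    # One micropolygon per `shading_rate` pixels, rounded up.
    return max(1, math.ceil(projected_px / shading_rate))

# A 1-unit patch 10 units away with a 1000 px focal length projects to
# ~100 px, so it dices to ~100 micropolygons per side:
print(dice_rate(1.0, 10.0, 1000.0))    # 100
# The same patch 500 units away needs almost no subdivision:
print(dice_rate(1.0, 500.0, 1000.0))   # 2
```

Because the dice rate adapts to screen size, distant geometry stays cheap while close-ups are always sub-pixel smooth; that adaptivity is exactly what a fixed subsurf level cannot give you.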

Normal maps also work well for adding high-frequency details to a mid-range poly model, so you don’t need as high a level of subdivision as displacing the whole thing would require. Unfortunately, Blender doesn’t support tangent-space normal maps except on flat surfaces.
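For reference, decoding a tangent-space normal map is only a remap plus a basis change; the hard part on curved surfaces (and what was missing here) is building the per-pixel tangent basis. A hedged sketch, with the basis vectors treated as given:

```python
def decode_normal(r, g, b):
    """Map an 8-bit RGB texel to a unit tangent-space normal in [-1, 1]."""
    n = [c / 255 * 2 - 1 for c in (r, g, b)]
    length = sum(x * x for x in n) ** 0.5
    return [x / length for x in n]

def to_world(n_ts, tangent, bitangent, normal):
    """Rotate a tangent-space normal into world space via the TBN basis.
    The three basis vectors must come from the mesh; they are assumed here."""
    return [
        n_ts[0] * tangent[i] + n_ts[1] * bitangent[i] + n_ts[2] * normal[i]
        for i in range(3)
    ]

# The classic "flat" normal-map colour (128, 128, 255) decodes to almost
# exactly (0, 0, 1): no perturbation of the surface normal.
flat = decode_normal(128, 128, 255)
print([round(x, 3) for x in flat])
```

On a flat surface the TBN basis is constant, which is why that case worked; on a curved surface it must be rebuilt per vertex or per pixel from the UV layout.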

Matt :slight_smile:

sorry womball for using your post, but it had a lot of views and I need help. I don’t even know how to create a thread; can someone help me?

Dreblen: there’s a “new thread” button in the upper left part of the forum page.

thanks yfkar, I really needed help

i am not sure how much even normal maps can compare to micro-displacements. the reason i say that is that true displacement not only gives stone surfaces and other irregular surfaces a much better result (when procedural, it is also resolution independent), but furthermore true displacement can be seen as a modeling tool, because it creates a true 3d mesh. specifically at edges you will see the differences.

but well, if you don’t need 3d edges it is fine.
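The “true 3d mesh” point can be shown in a few lines: displacement moves vertices along their normals, so silhouettes and edges actually change shape, whereas a normal map only alters shading. A minimal sketch with made-up vertices and a toy height function:

```python
def displace(vertices, normals, height):
    """Move each vertex along its unit normal by a sampled height.
    Unlike a normal map, this changes the geometry itself, so the
    silhouette of the mesh changes too."""
    return [
        tuple(v[i] + height(v) * n[i] for i in range(3))
        for v, n in zip(vertices, normals)
    ]

# Two vertices on a unit sphere with radial normals, and a constant
# hypothetical height of 0.1:
verts = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
norms = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
bumpy = displace(verts, norms, lambda v: 0.1)

# Each vertex moved 0.1 units outward: the silhouette radius is now 1.1.
print(bumpy)   # [(1.1, 0.0, 0.0), (0.0, 1.1, 0.0)]
```

A normal map applied to the same sphere would leave `verts` untouched, which is exactly why its edges stay smooth no matter how rough the map is.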

claas

What about texture-level parallax displacement with occlusion mapping? Then you don’t even have to add any geometry… I like to keep my poly count low if at all possible.
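Parallax occlusion mapping is normally done in a fragment shader, but the core idea is just a ray march through a heightfield along the view direction until the ray dips below the surface. A hedged Python sketch of that loop (the heightfield, scale, and step count are all invented here):

```python
def parallax_occlusion(heightmap, u, v, view_dir, scale=0.05, steps=16):
    """March along the view direction through a heightfield and return
    the shifted texture coordinate where the ray first goes below the
    surface. `heightmap` is a function (u, v) -> height in [0, 1]."""
    # Per-step UV offset from the view direction (tangent-space z is up).
    du = -view_dir[0] / view_dir[2] * scale / steps
    dv = -view_dir[1] / view_dir[2] * scale / steps
    ray_h, dh = 1.0, 1.0 / steps
    for _ in range(steps):
        if heightmap(u, v) >= ray_h:       # ray went below the surface
            return u, v
        u, v, ray_h = u + du, v + dv, ray_h - dh
    return u, v

# Flat heightfield at 0.5, viewed at 45 degrees: the ray crosses halfway
# down, so the lookup shifts by about half the parallax scale.
u, v = parallax_occlusion(lambda u, v: 0.5, 0.5, 0.5, (0.0, 0.7, 0.7))
print(round(u, 3), round(v, 3))   # 0.5 0.475
```

Since only the texture coordinate moves, the geometry and poly count stay untouched, though, like normal maps, this cannot change the silhouette.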

… hasn’t this discussion come up before?

I have been wanting decent render-time displacement for a while, but it doesn’t seem to be in the pipeline for Blender at the moment. What seems like a more likely option is that Blender will get a decent exporter to Aqsis, which does support per-pixel displacement renders. I haven’t used Aqsis yet, but if the interface in Blender could support RenderMan-compliant renderers (Aqsis is a RenderMan renderer) like it supports YafRay (and, with a patch, POV-Ray), that would be sweet.

MicWit

i would rather prefer 3Delight to be included, because it offers a more complete set of render options than the other free REYES engines.

as far as i know, Ton and the render coders have plans for a REYES structure, which i think will include better displacements.

time will show

claas

3Delight seems to be a RenderMan-compliant renderer as well. I think if one RenderMan-compliant renderer is supported, they would probably all be, as the export is a standard. I reckon what should happen is that in the main drop-down list you can choose a “RenderMan compliant” option, and then in a tab (like for YafRay) there will be further options, like which renderer you would like to use (maybe a drop-down list of some that plug straight in).

So 3Delight has support for “per pixel” displacement at render time?

MicWit

Every RenderMan-compatible renderer should have microdisplacement. That is how REYES works.