bad displacements

Hey all,

I did a search but didn’t come up with anything… I’m testing displacements on a fairly high-poly mesh (cube -> subdivide multi (10) -> subsurf level 4), and the test image I did shows a serious rendering error. I wanted to get advice and see if I’ve done something wrong before submitting this to the bug tracker.
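In case anyone wants to reproduce the setup, here's roughly the equivalent as a Python sketch. Note it uses operator and property names from a much later bpy API than the build I'm testing, and it swaps the material Disp channel for a Displace modifier, so treat it as an illustration rather than my exact steps:

```python
import bpy

# Rough equivalent of the test setup, written against a much later bpy API
# than the build discussed here, and using a Displace modifier instead of
# the old material Disp channel. Illustrative only.
bpy.ops.mesh.primitive_cube_add()
cube = bpy.context.active_object

# "Subdivide Multi" with 10 cuts.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.subdivide(number_cuts=10)
bpy.ops.object.mode_set(mode='OBJECT')

# Subsurf at level 4 for both viewport and render.
subsurf = cube.modifiers.new(name="Subsurf", type='SUBSURF')
subsurf.levels = 4
subsurf.render_levels = 4

# The grayscale spiral image drives the displacement.
tex = bpy.data.textures.new("DispTex", type='IMAGE')
tex.image = bpy.data.images.load("/tmp/spiral.png")  # placeholder path

disp = cube.modifiers.new(name="Displace", type='DISPLACE')
disp.texture = tex
disp.strength = 0.5  # arbitrary test strength
```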

Here’s the image:

http://ministryofdoom.org/cloud/blender/20060928/bad-displacement.png

As you can see the squares come out fine (the corner artifacting is because I did them lazily and they don’t line up), but the spiral, which was painted with a simple stroke, makes the polys go nuts. The image is only affecting Disp. I’ve tried both cube-mapped and UV-mapped coordinates, but there was no difference.

Anyone else had this issue?

Still testing… a related issue is that displacement seems to depend on the camera’s angle to the surface normal? Look at this video:

http://ministryofdoom.org/cloud/blender/20060928/bad-displacments2.mov

Watch the top. See how the displacement changes as the camera moves up close to the end of the clip?

Try turning the Nor slider down very low, maybe starting at about 0.01 and working up from there. This setting affects Disp textures even though the Nor button isn’t on. It’s differentiating the sharp edges of the spiral and giving extreme results.
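To see why, here’s my rough understanding of how the two sliders combine per vertex (paraphrased from the manual’s displacement page; the variable names are mine, not anything from Blender’s code):

```python
# Rough per-vertex sketch of how the two sliders combine (my reading of the
# manual's displacement page; variable names are mine, not Blender internals).
def displace_vertex(position, vertex_normal, tex_intensity, tex_normal,
                    disp, nor):
    # Disp: push along the vertex normal by the texture's intensity value.
    offset = [disp * tex_intensity * n for n in vertex_normal]
    # Nor: additionally push along the texture's own normal (its gradient).
    # A hard black/white edge like the spiral's gives a huge gradient, so a
    # large Nor value blows the surface apart even with the Nor button off.
    offset = [o + nor * t for o, t in zip(offset, tex_normal)]
    return [p + o for p, o in zip(position, offset)]
```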

… Well, that was the quickest fix in history. Now everything seems to be rendering just fine. Is this intentional, letting the Nor value affect displacements? I can’t for the life of me understand why there are two values for two types of displacement being used at once.

Thanks for the help! ^o^/

I can’t for the life of me understand why there are two values

Nor did I, but then I found this in the manual. It’s all clear now.

I don’t see why displacement mapping can’t just automatically tessellate the displaced parts of a model to get the needed faces… That way the whole model doesn’t have to be subsurfed.

Lightwave 9 actually solved this in a really ingenious fashion – they basically allow sub-object subsurfacing, so that the areas which need extra polygons get them (set up a vertex group to mark them) while the other areas stay lower-res.

@CD38: Should have searched the manual too… I always forget that part. ^_^; Then again, even after that explanation displacements still seem a bit odd. But at least I have them working now, thanks to you.

Well, I guess you’d still need micropoly displacement to get that to work properly…and sadly neither Blender’s internal renderer nor Yafray supports that yet. Therefore you have to subdivide the mesh manually and still get bad results with complex textures like that spiral (yes, that thing is complex).

I’d personally advise using real modelling here…that spiral is easy to make and needs fewer verts than using displacement, I guess…

But yeah, since you only wanted to test it…it’s more like a technical limitation than a bug.

Micropoly isn’t needed, really. You could divide up sections of subpatches to be more dense, so long as you take into account the angles along the section edges where a higher-poly section butts against a lower-poly one. Done carefully, you could keep the verts of the higher-poly section either coplanar with, or at least lying along, the edges of the lower-poly section, which would maintain the illusion of solidity.
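As a toy illustration of the crack problem (an entirely made-up helper, not anything from LW or Blender): the dense section’s border verts have to land exactly on the coarse section’s edge, or gaps open up where the two densities meet. Something like:

```python
# Toy illustration of the crack problem (made-up helper, not any LW or
# Blender API): snap the dense section's border verts onto the coarse
# section's edge so the two densities stay watertight.
def snap_to_edge(border_verts, edge_start, edge_end):
    ex = edge_end[0] - edge_start[0]
    ey = edge_end[1] - edge_start[1]
    ez = edge_end[2] - edge_start[2]
    length_sq = ex * ex + ey * ey + ez * ez
    snapped = []
    for v in border_verts:
        # Parameter of the closest point on the coarse edge, clamped to it.
        t = ((v[0] - edge_start[0]) * ex +
             (v[1] - edge_start[1]) * ey +
             (v[2] - edge_start[2]) * ez) / length_sq
        t = max(0.0, min(1.0, t))
        snapped.append((edge_start[0] + t * ex,
                        edge_start[1] + t * ey,
                        edge_start[2] + t * ez))
    return snapped
```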

I’d personally advise using real modelling here…that spiral is easy to make and needs fewer verts than using displacement, I guess… But yeah, since you only wanted to test it…it’s more like a technical limitation than a bug.

There are a number of reasons I don’t want to model what I’m about to make. For starters, displacements are nice because they allow you to have an extremely low-poly cage for animating, and I’m going to be doing some limited motion with this. Also, all changes happen in 2D instead of 3D, meaning I don’t have to constantly rebuild a shape as the client changes their mind. I’m going to be working with a fairly high level of detail in grayscale, detail that would be killer to carve in 3D. ^_^;

kattkieru, I see where you’re coming from (and I also have to say that I’m a REAL vertex pusher…all that mapping stuff isn’t for me :wink: ). Indeed, micropoly displacement would be nice to have, but I can only repeat myself and doubt that it will show up in Blender anytime soon.

And the method you’re speaking of…is it LW specific? I guess that would be an even better approach to displacement mapping, actually. But to be honest, I didn’t know about it before. But you learn something new every day :slight_smile:

I think we established above that displacement works just fine so long as you A) UV Map the image to the object; B) blur the map sufficiently to allow grading; and C) for the love of god, turn off the Nor amount in the texture buttons.
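For (B), any paint program’s Gaussian blur does the trick, but if you’d rather script it, a quick Pillow/PIL sketch like this works (filenames are just placeholders):

```python
# Pre-blur the displacement map so the renderer gets smooth gradients
# instead of hard black/white edges (filenames are placeholders).
from PIL import Image, ImageFilter

img = Image.open("spiral.png").convert("L")           # grayscale height map
img = img.filter(ImageFilter.GaussianBlur(radius=4))  # soften the edges
img.save("spiral_blurred.png")
```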

There is a lot of wasted subdivision going on, but I can live with that.

Something that would be really nice, though, is if the Displace Modifier in CVS used UV texture coordinates.
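For what it’s worth, here’s a sketch of how I’d picture that looking from the Python side, using property names that later Blender releases ended up exposing (so definitely not something you can run against the CVS build being discussed):

```python
import bpy

# Sketch of what a UV-driven Displace modifier could look like from Python,
# using property names from later Blender releases, not the CVS build here.
obj = bpy.context.active_object
disp = obj.modifiers.new(name="Displace", type='DISPLACE')
disp.texture = bpy.data.textures["DispTex"]  # assumes this texture exists
disp.texture_coords = 'UV'                   # the missing piece back then
disp.uv_layer = "UVMap"                      # default UV layer name
disp.strength = 0.5
```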

And the method you’re speaking of…is it LW specific? I guess that would be an even better approach to displacement mapping, actually. But to be honest, I didn’t know about it before. But you learn something new every day :slight_smile:

Actually, I know that Maya has had adaptive subdivision surfaces for a while (Lightwave just got them in version 9), but in Maya you can play with them more directly, adding detail to areas that need it and removing detail from areas that don’t. In that respect it’s a lot more controllable, I think. It wouldn’t surprise me in the least if XSI had a similar feature.