Maybe someone can help me with some strange behaviour I am experiencing with the new GLSL bump mapping in the newer builds of Blender 2.5x.
Judging by the images and videos available on the web, I guess something is wrong with my setup. See the attached image: on the left is the GLSL view and on the right the rendered output. With this it's practically impossible to do serious bump map painting, especially for small details.
I tried various tweaks to the settings, the image resolution (the one used in the example is a 2K texture) and the driver settings.
I am running Kubuntu 64-bit Linux with an NVIDIA GTX 460 using the NVIDIA driver 260.19.36.
Looking at the comparative images I see only a difference in relative shadow and highlight values between the GLSL and the rendered versions. The shape and apparent depth look very close, if not identical. There is a hatch pattern of some sort in the GLSL image that seems to flatten the contrast of the highlights & shadows. Could this be just a limitation of GLSL viewing? I can see how it would make painting bump maps more difficult.
I have now cropped the image to just the GLSL area, where the effect is easier to see. The previous one got scaled down a bit.
You are right that the overall display is correct, and the farther I zoom out the better it gets, but as you start to go in (and not even very far) I get these strange artefacts. It's as if the bumps are being built from little squares. These building blocks have their own edges, which totally mess up the bump effect once you get closer.
This is completely off the cuff, but it looks as if it may be a matter of how your graphics card renders the “edges” of the bumps, particularly if the demos/tuts you’ve seen look very different than this. Have you tried adjusting any of your hardware options, like hardware AA and other things that can be switched on and off in the graphics card?
Yep, I guess all of them (AA settings, filtering, OpenGL quality settings, etc.). My guess is also that it might be card/driver related. I hope there is someone who is actually using this feature under Linux who can tell me what I might be doing wrong.
They say misery loves company (but I doubt it) – I can't use GLSL at all because of the ancient graphics card (graphics crud?) I'm stuck with. But from all I've read, 2.5x really requires some very current graphics tech to function properly/fully; it's much less forgiving of hardware age than 2.4x and earlier.
OK, below is a link to a minimal blend file with one texture packed, which shows the problem on my machine. I would be interested in screenshots from other users, especially Linux 64-bit NVIDIA users.
I don't get the same results as you with your test file; here everything seems to be OK. I didn't test it in Linux, though.
GeForce GT 430
Windows 7 x64
In my experience the high-end NVIDIA cards perform extremely well quality-wise with this technique, so your issue is unexpected.
Would it be possible for you to try running the same configuration under Windows?
I should mention that by high-end I really mean D3D11-compliant NVIDIA cards in general.
Also, don't worry too much about the Linux part. I know many people have used the new bump mapping in Blender under Linux with no issues. So hopefully this will get sorted out too.
Of my three functioning computers, only one has the OpenGL muscle needed for GLSL, yet it's the weakest in terms of other specs, RAM in particular. Go figure. Anyway, under WinXP Media Center v.2002 SP2, with an on-board ATI Radeon Xpress 200 Series (256MB GRAM), the rendering & the GLSL display versions are essentially identical.
Oddly enough, the machine with the ATI Radeon X700 cannot display GLSL at all.
I’m using a very cheap flatscreen monitor on the GLSL-capable machine, so that’s apparently not a factor, either.
If anyone is interested, the method being used in Blender is in fact "bump mapping" in the strictest sense, meaning it's not using normal mapping. It actually perturbs the normals from the height map directly; there is no underlying normal map.
There's also a technical paper, mm_sfgrad_bump.pdf, that you can hit your programmer over the head with, or read yourself if you're interested in the technical side.
As a personal note, I also wanted to make it clear that although the method is able to perturb the normals of a base normal map, as David Radford shows in the tutorial, this is not a requirement. You can also use just a height map. Dunno if this was clear or not?
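To illustrate the general idea (this is not Blender's actual GLSL implementation, which works from screen-space gradients as described in the paper), here is a minimal sketch in Python/numpy of deriving a perturbed normal directly from a height map, with no intermediate normal map. The function name, the strength parameter and the synthetic height field are just for illustration.

import numpy as np

def perturbed_normals(height, strength=1.0):
    # Finite-difference gradients of the height field (rows = y, columns = x).
    dh_dy, dh_dx = np.gradient(height)
    # Perturb a flat (0, 0, 1) tangent-space normal by the scaled height gradient.
    n = np.dstack((-strength * dh_dx,
                   -strength * dh_dy,
                   np.ones_like(height)))
    # Renormalize per pixel.
    return n / np.sqrt((n ** 2).sum(axis=2, keepdims=True))

# Example: a small synthetic square bump.
h = np.zeros((64, 64))
h[24:40, 24:40] = 1.0
normals = perturbed_normals(h, strength=2.0)

If you already have a base normal map, you would perturb that normal instead of the flat (0, 0, 1) vector, which is the optional combination mentioned above.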
@Michael W: Thanks for testing this for me. This again hints at some driver issue, maybe. The different methods are interesting as well. The 'ATI 5870 3 taps' one looks almost like the one I am having, but that's an ATI …
@mmikkelsen: Thanks for the additional info. I too would guess a GTX 460 should perform quite OK in this area, so the observed behaviour is quite unexpected to me. And as it seems from the other replies, it's just on my system.
I have Windows 7 running on my computer as well. I will give it a try and post the results. I will also check and test some different NVIDIA drivers (older, beta, etc.) just to see if this makes any difference.
BTW, the card performs admirably when used with Mari, which uses some very heavy GLSL rendering as well.