I think if you want to improve shadow quality, you should turn on High Bitdepth, set the shadow method to VSM, and increase the indirect reflection quality.
Ok, I never tried it as a color (although I use a separate one without mortar to create random colors after a noise node, added to the coordinates for an image texture lookup). I guess I have to ignore the bumping and use it to apply a darker shade instead then. The Mortar Smooth output seems to be a linear gradient, which I feed through a smootherstep function for very nice bevels. I'm not sure how to achieve that effect as a color, since its purpose (catching better highlights) is completely lost.
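For reference, here's a minimal Python sketch of the smootherstep curve mentioned above (Perlin's 6t^5 - 15t^4 + 10t^3 variant; the function and parameter names are illustrative, not Blender node names):

```python
# Smootherstep: a sigmoid with zero first AND second derivatives at
# both edges, which is why it gives nicer bevels than plain smoothstep
# when fed a linear gradient like Mortar Smooth.
def smootherstep(edge0, edge1, x):
    # Clamp and normalize x to [0, 1] between the two edges.
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    # Perlin's 6t^5 - 15t^4 + 10t^3, written in Horner form.
    return t * t * t * (t * (t * 6.0 - 15.0) + 10.0)
```

Feeding the 0-1 mortar gradient through this flattens the curve near both ends, which is what rounds the bevel off instead of leaving a hard linear ramp.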
Simplified version working in closeup:
And completely broken when zoomed out:
(Previous attempts used 0-1 as the input smootherstep points; I was just messing around here.)
And there's no way to report bugs yet. How do I remind myself when the regular bug tracker opens?
Part of the issue here could be that AA in realtime (in general, not just Blender) is a bit trickier to pull off at interactive framerates than the same thing in offline rendering.
From reading the various techniques used in games, the best technique we have now is one from Nvidia that automatically downsizes footage rendered at a higher resolution (ie. old-fashioned supersampling applied to realtime).
You can try baking the bump to a 16-bit high-resolution map.
This type of artefact is called shader aliasing, and the only way to combat this is to supersample, either spatially or temporally. It happens because with normal (MSAA) antialiasing only geometry edges are multisampled. The interiors of faces are still evaluated only once. With a texture you can filter it away with texture filtering, but with a shader this is problematic because the shader would just return the same value each time.
You’d think Eevee’s “temporal” antialiasing would solve this, but apparently nope.
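As a toy illustration of the shader aliasing described above (not Eevee code; all names here are made up), here's a high-frequency procedural pattern evaluated once per pixel versus averaged over several sub-pixel samples. Supersampling averages the pattern inside each pixel, which is what texture filtering does for image textures but what a single shader evaluation cannot do:

```python
import math

def stripes(x, freq=40.0):
    # A procedural pattern much finer than the pixel grid,
    # standing in for a procedural brick/bump shader.
    return 0.5 + 0.5 * math.sin(x * freq)

def render_row(width, samples_per_pixel=1):
    # Evaluate the pattern across one scanline. With 1 sample per
    # pixel the result aliases; with more, each pixel approaches the
    # true average of the pattern over its footprint.
    row = []
    for px in range(width):
        total = 0.0
        for s in range(samples_per_pixel):
            # Stratified sub-pixel positions within pixel px.
            x = (px + (s + 0.5) / samples_per_pixel) / width
            total += stripes(x)
        row.append(total / samples_per_pixel)
    return row
```

The single-sample row jumps around arbitrarily because each pixel lands on a random phase of the pattern; the supersampled row converges toward the filtered result, which is exactly the spatial-supersampling fix mentioned above.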
How does Eevee get around this problem for the color channel though? (see my test example above)
For bump mapping Eevee uses derivatives provided by the hardware. GPUs shade in blocks of 2x2 pixels which means derivatives can be computed “for free” by reading neighboring pixels within that block, and I can imagine it causing this type of artifact.
The solution may be to manually compute the derivatives at the cost of performance. I'm not sure it would even be considered a bug at this stage; there are dozens of render quality issues like this that could be improved.
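A rough sketch of the difference, using a 1D scanline of height values (purely illustrative, not actual GPU code): hardware dFdx yields one forward difference shared across each 2x2 quad, which is why the bump looks half-resolution, while manual derivatives cost extra samples but resolve per pixel:

```python
def quad_dfdx(values):
    # Mimics hardware dFdx along one axis: one forward difference per
    # 2-pixel quad half, reused by both pixels in that quad.
    out = []
    for i in range(0, len(values) - 1, 2):
        d = values[i + 1] - values[i]   # one difference per quad...
        out += [d, d]                   # ...shared by both pixels
    return out

def manual_dfdx(values):
    # "Manual" per-pixel central differences: needs extra samples,
    # but every pixel gets its own derivative (full resolution).
    return [
        (values[min(i + 1, len(values) - 1)] - values[max(i - 1, 0)]) / 2.0
        for i in range(len(values))
    ]
```

In the quad version, pixel pairs share a derivative, so the bump normal can only change every other pixel; the manual version changes every pixel, at the cost of the additional texture/shader evaluations mentioned above.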
For image textures you also ideally need cubic interpolation for smooth bump results, which Eevee does not currently support. But there is linear interpolation, which is better than closest, which is what you effectively get with a procedural brick texture.
Closest interpolation does sound like a valid explanation.
Ace mentioned supersampling and downscaling as an option. I haven't tried it yet with the full image, but I'll see if I can handle it memory-wise. Is this something that could in the future be done internally, only on "problematic channels" (like bump)? Maybe as a setting for viewport and rendered mode?
@gritche: I'll try an 8-bit BW map with cubic interpolation to see if this is enough. For a random non-tiled appearance I prefer procedural over image (or procedural to look up an image: random woodgrain parts per tile, but not the tile itself), and each time I tweak the seed I'd need to render out a separate (possibly massive) BW bump texture, which is not ideal for me.
Edit: Oh, right - cubic not supported yet
I also just realized today that Eevee indeed renders all bump map textures at 50% of the actual screen resolution, and there’s still no way of correcting that.
This will be fixed eventually, right?
I am making extensive use of bump maps on every video game asset that I produce at my day job. I won’t be able to work with Blender anymore if the rendering in Eevee remains as broken as it currently is.
Hi, keep in mind Blender 2.8 is in Alpha status:
Not feature complete, a hell of a lot of bugs, crashing, and .blends are not backward or forward compatible.
Wait for the Beta -> feature complete, still a hell of a lot of bugs, crashing.
Or better, a Release Candidate for serious work, but not for production.
I am keeping that in mind. Still worried though. What Brecht said further up in this thread sounded really alarming.
Yes, using the bump node in Eevee results in this aliased/pixelated look.
With normal maps it works really fine.
Shot in the dark here, but have you tried changing the settings in your graphics card's configuration for Blender? Maybe that might work.
Yep, and that’s the problem.
Not that I know of. And in any case, it should not be necessary. The graphics should not be broken by default.
Any updates on this problem? I’m still rendering all my shots at 2X resolution and then downscaling them to get rid of this half-resolution normal map appearance.
Bake the bump map; normal maps don't have this kind of issue.
Or you can use GIMP or xNormal to convert the bump map into a normal map.
I think downsampling is the only way to go unfortunately.
Baking would in some cases need a ridiculously high texture size, unless you're prepared to bake out each frame of an animation for Window texture coordinates.
How about implementing your own bump node for now? It's basically just checking whether the neighboring pixels are brighter or darker, then averaging the result for the X and Y axes. I might try to do it if I have time; it seems like an interesting challenge.
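Something like this idea, sketched in Python for a height map stored as a 2D list (the function name and `strength` parameter are made up for illustration; this is essentially what a bump-to-normal bake does too):

```python
import math

def height_to_normal(height, x, y, strength=1.0):
    # height: 2D list (rows of values in [0, 1]); edges are clamped.
    h, w = len(height), len(height[0])

    def sample(i, j):
        # Clamp sample coordinates to the map edges.
        return height[max(0, min(h - 1, j))][max(0, min(w - 1, i))]

    # Central differences from the neighboring pixels.
    dx = (sample(x + 1, y) - sample(x - 1, y)) * strength
    dy = (sample(x, y + 1) - sample(x, y - 1)) * strength

    # Tangent-space normal: points "up" (+z), tilted by the gradient.
    length = math.sqrt(dx * dx + dy * dy + 1.0)
    return (-dx / length, -dy / length, 1.0 / length)
```

Run this once per texel and remap from [-1, 1] to [0, 1] per channel and you have an ordinary normal map, sidestepping the derivative problem entirely.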
It’s exactly like Brecht says, that’s what bump-mapping with the built-in GPU derivatives looks like which are computed from 2x2 blocks. That’s why it looks half-resolution. Computing the derivatives manually would require extra texture samples. That’s why games generally just do normal maps. I think for Eevee computing the derivatives manually would be fine.
By the way, Blender Internal used to support baking derivatives maps. They never really caught on, but they combined all the benefits of bump mapping (correct blending, no need for tangents) while requiring only a single texture sample at full quality.
Oh man, does Eevee not support them? I hope they do return.