Antialiased depth / world position in compositor?

Hi everyone,

I’m currently making a post-process fog effect in the compositor, using the depth and the pixel world position (denser fog lower down, thinning out with height). The effect works nicely, but because neither depth nor position are antialiased, there are very visible aliased pixels, especially around out-of-focus objects. I’ve played a bit with the mist pass, which is indeed antialiased, but:
1/ I get wrong results around some foliage (here the ground should be as black as the house, but for some reason it’s not when there is instanced foliage on it).

2/ I still get the aliasing when I use the world position on top of it.
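
For reference, the node math I’m doing is roughly equivalent to this (a minimal Python sketch rather than the actual nodes; the density/falloff constants are just placeholders):

```python
import numpy as np

def height_fog(depth, world_pos, density=0.15, height_falloff=0.3, base_height=0.0):
    """Rough equivalent of the compositor node math (illustrative only).

    depth     -- per-pixel camera depth (Z pass)
    world_pos -- per-pixel world position, shape (..., 3) (Position pass)
    """
    # Fog thins out with height: exponential falloff on the world Z.
    height = world_pos[..., 2] - base_height
    height_factor = np.exp(-np.maximum(height, 0.0) * height_falloff)

    # Classic exponential distance fog from the depth pass.
    distance_factor = 1.0 - np.exp(-depth * density)

    # 0 = no fog, 1 = fully fogged; this mask is then used to mix in the fog color.
    return np.clip(distance_factor * height_factor, 0.0, 1.0)
```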

Is there anything I’m missing for such a simple effect? Is it just not possible in Blender?

Thank you!


You may want to post some magnified areas to show the details… there’s not much to see here…
Maybe also your setup?? And if you think the non-antialiasing is the culprit… then why not try some blurring? ( ← just a wild guess because I don’t know your setup… :person_shrugging: )

I don’t think Blender has a proper tool for masking using position data, at least I’ve never seen one. In that case I would use a simple Box Mask multiplied by the mist pass. That foliage is indeed very strange, something in your compositing script must be breaking.

Make sure your scene has enough depth:

Hi,

Thank you for the reply, that’s what I feared… I’m not a fan of the box approach because I’m making an animation with camera movement, and I just don’t want to animate that box by hand… Also, its screen-space nature is limiting: I can’t add 3D noise or anything to the height. Shame…

For the mist issue though, I’m not sure where it comes from. What do you mean by compositing script? This is just the mist pass straight out of the renderer, and as you can see it looks correct when there is no grass:

But when I add grass, it adds this weird mist below, as if the ground wasn’t rendered, but it is (and it looks correct when I use the depth pass, save for the aliasing issue)

Thanks anyway :slight_smile:

The position output is always aliased!
The reason is that there’s a fight between near and far objects, and a pixel “needs to choose” which position it’s supposed to show. Interpolating all positions that may fall into a pixel wouldn’t make any sense.
What I’ve done in the past to fix this was to render a bigger image (2x or 4x the original size) and use that in the compositor… Though I don’t do that anymore; whenever I want to add an effect like haze/fog, I create a new view layer with some custom materials and compose that.
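
To illustrate why an averaged position/depth can’t work, while rendering bigger and scaling down does (a toy sketch, not Blender code; the density value is arbitrary):

```python
import numpy as np

# Toy pixel half-covered by a near object (z = 1) and a far one (z = 100).
sample_depths = np.array([1.0, 1.0, 100.0, 100.0])

def fog(depth, density=0.15):
    return 1.0 - np.exp(-depth * density)

# Wrong: average the depth/position first, then compute fog. The result belongs
# to neither surface (this is what an "antialiased" position pass would give).
fog_from_averaged_depth = fog(sample_depths.mean())

# Right: compute fog per sample, then average (what rendering at 2x/4x and
# scaling down in the compositor effectively does).
fog_averaged = fog(sample_depths).mean()

print(fog_from_averaged_depth, fog_averaged)  # noticeably different values
```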

Is the grass full geometry, or does it use alpha masks?


Thanks for the reply :slight_smile:

I assumed so, but I wondered whether Blender could generate an antialiased version specifically for compositing. I guess not :frowning:
Rendering at a bigger size is not really an option, as my frames are already quite time-consuming for the animation (unless I could create just a 2x bigger depth and position map, without rendering the whole frame at twice the size?)
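
(If something like that were possible, I imagine it would look roughly like this; a completely untested sketch, and the Position pass property name in particular is a guess that may differ between Blender versions:)

```python
import bpy

# Rough idea only (untested): a linked copy of the scene, rendered at 2x with
# minimal samples, just to get higher-resolution depth/position passes.
bpy.ops.scene.new(type='LINK_COPY')
data_scene = bpy.context.scene
data_scene.name = "DataPasses_2x"
data_scene.render.resolution_percentage = 200
data_scene.cycles.samples = 1                # data passes don't need many samples

vl = data_scene.view_layers[0]
vl.use_pass_z = True                         # depth pass
# The Position pass toggle exists in recent Blender versions; the exact
# property name may differ by version, so treat this as a guess:
if hasattr(vl, "use_pass_position"):
    vl.use_pass_position = True
```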

I guess the additional view layer with a custom material could work in principle, but alpha materials would be a problem, right? I would get the depth/height of the full geometry, unless I go and adapt every single material by hand…

As for the mist pass issue, the grass is indeed using an alpha mask, are they not compatible? It seems to work okay on the trees, for example.

Doesn’t the mist pass have Start and Depth parameters in the World Properties for limiting its range? They can also be made visible in the camera settings: Viewport Display → Show: Mist.
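
For reference, these are the corresponding settings (the same thing from Python; the values are just placeholders):

```python
import bpy

scene = bpy.context.scene

# Mist range (World Properties > Mist Pass).
scene.world.mist_settings.start = 5.0      # distance where mist begins
scene.world.mist_settings.depth = 40.0     # distance over which it ramps to 1.0
scene.world.mist_settings.falloff = 'QUADRATIC'

# Enable the pass and visualise the range on the camera in the viewport.
scene.view_layers[0].use_pass_mist = True
scene.camera.data.show_mist = True         # Camera > Viewport Display > Show: Mist
```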


Yes of course, but as you can see in my last message, the mist is correctly configured, and it works as expected when there is no foliage. The issue is quite clearly a bug that appears with the alpha-blended foliage.

There’s a threshold when using alphas… the position result may be fully opaque or completely transparent.

I don’t know why it’s behaving like that for the grass… I’d need to take a deeper look at what your mist setup is currently doing. But if you’re using the Mist pass alone, there are known problems with alpha channels (transparent rays clamped by the Light Paths settings get pushed to infinity!).

As mentioned above, there are the Start and Depth parameters… and earlier you wrote about the “box approach”… so I don’t know if you used / can use these parameters…

Using Blender, I can only think of rendering a volume masked by a gradient. But as you mentioned, screen space is limited and it’s more expensive. You could separate the volume into another view layer, hold out everything else, and render only the volume with lower sampling, then denoise it.
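
Roughly like this, if you want to script the layer setup (just a sketch; the “Fog” collection name is a placeholder):

```python
import bpy

scene = bpy.context.scene

# Separate view layer just for the fog volume.
fog_layer = scene.view_layers.new("FogVolume")
fog_layer.samples = 32   # per-layer sample override (0 = use the scene setting)

# Hold out everything except the "Fog" collection on that layer, so the volume
# is still correctly occluded by the rest of the scene.
for lc in fog_layer.layer_collection.children:
    lc.holdout = (lc.name != "Fog")

# Denoising can then be applied in Render Properties or with a Denoise node
# in the compositor before mixing the fog over the main layer.
```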

Using Nuke there’s a third party tool, P_Matte, that can mask in 3D space using position data.

The grass is still very off, maybe it’s the normals. Check your shader, maybe there’s a node doing some bad math, producing negative or infinite numbers.

When you’re connecting nodes, internally you’re creating a script.

“Nodes” are literally a form of “visual computer programming.” You are not “writing a Python script” (although you can), but you are “programming” nonetheless.

I’m also thinking in terms of: “could you just use a gradient?” You said that “the top” should be softer than “the bottom” …

There must be something odd about that grass. Can you temporarily put anything else in there, with reasonably-complex geometry, to see how it is handled?

The transparency from the grass is cut off after a certain number of ray bounces…
And when the transparent bounce limit is reached, the renderer puts a holdout there and doesn’t propagate rays any further.

But the holdout is at an infinite distance from the camera (no matter where it happens in the scene), and that messes up the mist pass.

Here’s a scene where the grass problem is visible and understandable (Transparent bounces set to 5):

@Secrop Okay, that makes sense… So the solution is either to set the transparent bounces to a much higher amount, or to essentially scrap the mist pass, go back to using depth, and hope a bit of blur will smooth out the edges without creating artifacts?

Thanks!

@lucas.coutin, yeah, I could use a volume emission for that, but given the render times, I’d love to have control over the fog amount in compositing, so as to avoid having a client make me re-render everything because it’s too strong/not strong enough :stuck_out_tongue:

@sundialsvc4 Thanks for the clarification, I do understand that, it’s just that there were literally no nodes, I was simply displaying the mist pass on its own, which is why I was confused :slight_smile:

That would work with mist… but for the grass, there are so many overlapping transparencies that you might need a very high value for the max bounces.
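
If you do try raising it, the setting is Render Properties → Light Paths → Max Bounces → Transparent, or from Python (the value is just an example):

```python
import bpy

# Raise the transparent bounce limit so the stacked grass cards don't hit the
# cutoff and turn into "infinite distance" holdouts in the mist pass.
bpy.context.scene.cycles.transparent_max_bounces = 64   # default is 8
```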

It would also work… There’s an Anti-Aliasing node in the compositor, if you prefer; it’s not as strong as a blur.

Another possibility is to render the grass/mist as a secondary view layer using Eevee, which deals with overlapping transparencies a bit better.


@Secrop Yeah, that’s what I thought… I’m actually playing with compositing in Resolve Fusion for the first time, to work with more advanced color correction.
I’m not a fan of additional steps, especially since I have quite a few shots.

Here’s the first draft, by the way. I’m quite satisfied with the general look, but I’m afraid I’ll have to push the blur quite hard to lose the mist noise, which will definitely lead to some artifacts… :confused: I’ll have to see if I can balance that, and if I find some magical solution, I’ll share it here.
