We should have an AO map (not material) in Cycles

Hi,

one of the things I miss most in Blender is an ambient occlusion map. Cycles only has an AO material, which has a single, globally defined distance and cannot be used as a map. In 3ds Max, I’ve been using an AO map for years to create procedural material effects such as edge wear and dirt. Here are some examples:



None of these 3 models has any UVs. It’s all just box mapping projection, with AO maps driving the distribution of dirt and wear & tear effects.

Unfortunately, in Blender, I am unable to achieve that.

1, As I already said, there’s only an AO material in Cycles, not a map, and the AO radius of this material is defined globally, not per material. Even if it somehow worked, it’s still missing options such as inverting the ray casting normals to create edge wear effects.

2, There’s a curvature map and an option to convert curvature to a vertex channel. Both of these options are unfortunately not sufficient, as they rely on the model’s topology. I need to keep my polycounts reasonable and my workflow quick and flexible, so I cannot afford to modify my mesh topology specifically for materials. It would make my process much slower and my resulting models much heavier. A simple example: you can’t really use vertex colors or a curvature map to create procedural effects on a simple 6-quad cube, while you can easily do that with an AO map.

3, There is an OSL shader that does this, but unfortunately:
A, It does not seem to work with Blender 2.79
B, It seems very slow (compared to the speed of a native implementation)
C, It doesn’t work in GPU mode

I already found a proposal for this map here: https://wiki.blender.org/index.php/User:Gregzaal/AO_node_proposal

So my question is: what would be the best way to convince some of the Blender developers to implement this? AFAIK, an AO map is something so simple it could take a single developer just 1-2 days to implement. I would even be willing to pay for that time. But there do not seem to be any feature request channels. The bug tracker is specifically for bugs, not feature requests, and the only site for feature requests is Right Click Select (a name I have a very strong aversion to), which Blender developers supposedly don’t even visit.

Thanks in advance

I’m probably going to flop a bit technically in my explanation, but as I understand it, the problem with AO in Cycles-like render engines is the way the rendering process works in path tracers. Ambient occlusion is basically geometric coverage, meaning it expresses how much of the hemisphere over a surface point is occluded by other geometry within a set distance. If the whole hemisphere is uncovered, we get a value of 1.0; when everything is covered, we get 0.0. Now, how does the render engine get this result? It samples the hemisphere over that point, and the result is the ratio of rays that didn’t hit anything to the number of rays cast. And now we get to the problem of using AO in a path tracer like Cycles. We only reach the actual AO value after tracing all the rays, but we want to use the result as a texture map already on the first sample. We don’t know the AO value at that point yet!
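To make that estimator concrete, here is a small self-contained Python sketch (an illustration, not Cycles code): it samples the hemisphere above a point, tests each ray against a single hard-coded occluding sphere, and returns the ratio of rays that escape within the distance limit. The sphere position, radius and distance are arbitrary example values.

```python
import math, random

def ray_hits_sphere(origin, direction, center, radius, max_dist):
    """Return True if the (unit-direction) ray hits the sphere within max_dist."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (direction[0]*ox + direction[1]*oy + direction[2]*oz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4.0*c
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 0.0 < t < max_dist

def ao_estimate(point, normal, n_rays=256, max_dist=1.0):
    """Monte Carlo AO: ratio of hemisphere rays that escape within max_dist."""
    misses = 0
    for _ in range(n_rays):
        # Random direction, flipped into the hemisphere around `normal`.
        d = [random.gauss(0.0, 1.0) for _ in range(3)]
        length = math.sqrt(sum(x*x for x in d))
        d = [x / length for x in d]
        if sum(d[i]*normal[i] for i in range(3)) < 0.0:
            d = [-x for x in d]
        if not ray_hits_sphere(point, d, center=(0.0, 0.0, 0.5),
                               radius=0.3, max_dist=max_dist):
            misses += 1
    return misses / n_rays   # 1.0 = fully open, 0.0 = fully occluded

# Point on a ground plane directly below the example sphere:
print(ao_estimate(point=(0.0, 0.0, 0.0), normal=(0.0, 0.0, 1.0)))
```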

I think one possible solution is to simply use the boolean value of each ray as the map value for that sample. This should converge to the same result; for example, if we have 6 samples that hit geometry and 4 that don’t, and we use the AO map to mix two textures, the resulting texture should still be a 60% blend as expected. But what is impossible to achieve with this solution is the modification of the AO map values. For example, we can’t use any color manipulations on it (color ramp, simple gamma, clamp etc.) because these depend directly on the final accumulated AO value, not on the simple boolean value for each ray. And this, I believe, is the main catch in implementing AO as a map.
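To see why the non-linear operations break, here is a tiny numeric illustration (my numbers, matching the 6-hits / 4-misses example above): applying a gamma per boolean ray sample converges to a different value than applying the same gamma to the final accumulated AO.

```python
# 6 of 10 rays hit geometry, 4 escape -> converged AO value of 0.4.
hits = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1]   # 1 = ray escaped (unoccluded)

gamma = 2.2
mean = sum(hits) / len(hits)

# Gamma applied to the converged AO value (what the artist wants):
after_average = mean ** (1.0 / gamma)                         # ~0.66

# Gamma applied per boolean ray, then averaged (what per-sample values give):
per_ray = sum(h ** (1.0 / gamma) for h in hits) / len(hits)   # still 0.4

print(after_average, per_ray)   # the two do not agree
```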

Nope,

there’s nothing preventing a path tracer like Cycles from having an AO map that can be modified by color mapping operations afterwards. I do believe that Cycles is probably the only non-experimental path tracer that is missing an AO map. V-Ray has one, Corona has one, Fstorm has one, Octane has one and Arnold has one too. They are all path tracers, and they can all do it. There is no technical limitation anywhere. I could go into details, but there’s not much point, since the fact that all the other renderers can do it is sufficient proof.

Furthermore, Cycles can already do it using the OSL shader: https://www.youtube.com/watch?v=L5eH1JlXGNE

It’s just slow, doesn’t work on the GPU and doesn’t work in 2.79. But it’s proof that it is possible. All that’s really needed is a native implementation of a quite trivial map.

May I ask how it is done in these engines? You can go into technical details, I don’t mind. Because in my understanding, unless the AO value is probed beforehand, turning rendering into a two-pass method, it is the same as querying a shader result as a texture, which is also not possible in a path tracer.

As you can see both from the results above and from the fact that it works 100% in all the other renderers I named above, it is possible in a path tracer, even without any prepass or two-stage rendering.

It’s actually quite trivial. Imagine you have a diffuse material; in the Diffuse color you have a MixRGB node mixing red and green, with a Checker texture in the Fac slot. Every time a ray is shot from the camera onto a surface, it won’t know what color is there until it evaluates the Checker texture in the Fac slot of the MixRGB node. Once it does so, it knows what color is at the given intersection, and then continues by shooting, let’s say, 8 GI rays from that point, which, when they land, repeat the same process.

Now simply swap the Checker Texture node for an AO node. A ray is shot from the camera, hits a surface at an intersection point and proceeds to identify what color is there. It will therefore evaluate the AO map in the Fac slot, which triggers the shooting of, let’s say, 16 AO rays, returns the result, and that result then defines the diffuse color at that intersection point. It’s as simple as that. If it then spawns 8 GI rays, those rays will repeat the same thing once they hit their destination.

There is no need at all to know anything in advance. It’s pretty much the same thing as GI rays or glossy reflection rays: there, you also don’t know the illumination in advance. So simply, instead of hitting a surface and spawning, let’s say, 8 GI rays and 8 glossy rays, it will additionally spawn, for example, 16 AO rays, which are, however, much cheaper to calculate than GI rays or glossy rays, as they don’t need to trigger a shader evaluation on their hit.
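As a sketch of the order of operations being described (a toy Python shading step, not Cycles internals; the `occluded()` helper is a stand-in for whatever ray query a renderer already has): the AO “node” is traced on demand at the hit point, and its result is immediately usable as the mix factor for that camera ray’s sample.

```python
import math, random

def occluded(point, direction, max_dist):
    """Stand-in occlusion query: a wall fills the half-space x < 0.

    A real renderer would ask its BVH whether the ray hits any geometry
    within max_dist; only the hit distance matters, no shader evaluation
    happens at the AO ray's hit point.
    """
    if direction[0] >= 0.0:
        return False                        # ray moves away from the wall
    t = -point[0] / direction[0]            # distance to the plane x = 0
    return 0.0 < t <= max_dist

def eval_ao_node(point, normal, n_rays=16, max_dist=1.0):
    """The 'AO node': evaluated lazily, right when the shader needs its value."""
    open_count = 0
    for _ in range(n_rays):
        d = [random.gauss(0.0, 1.0) for _ in range(3)]
        norm = math.sqrt(sum(x * x for x in d))
        d = [x / norm for x in d]
        if sum(d[i] * normal[i] for i in range(3)) < 0.0:
            d = [-x for x in d]             # flip into the upper hemisphere
        if not occluded(point, d, max_dist):
            open_count += 1
    return open_count / n_rays

def shade(point, normal):
    """Toy diffuse shader: MixRGB(red, green, fac = AO node)."""
    fac = eval_ao_node(point, normal)       # traced on demand, per camera-ray hit
    red, green = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
    color = [red[i] * (1.0 - fac) + green[i] * fac for i in range(3)]
    # ...after this, the integrator would go on to spawn its usual GI rays.
    return color

# Camera ray hit a floor point 0.2 units away from the wall:
print(shade(point=(0.2, 0.0, 0.0), normal=(0.0, 0.0, 1.0)))
```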

Ok, I understand the logic and how it would work. But AFAIK there are no additional rays spawned in the current Cycles implementation. One sample is one ray and that’s it. How easy or hard it would be to change this behavior, I don’t know.

It seems that AO as a map raises some interesting chicken-and-egg problems. For example, if I use the AO map as a displacement texture, does the AO get re-evaluated after displacing the geometry or not…

This would be nice to have, and I know Brecht was considering implementing this recently in relation to the Bevel node. The Bevel node does the same ray multiplication as described above and does suffer a performance impact, but machine time is cheap compared to artist time.

You can abuse the Bevel node to get a rough edge mask, with some caveats. Most problematically, it doesn’t distinguish between an inside corner and an outside corner. But it can be pretty helpful nonetheless:


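For anyone who wants to script that trick, below is a small bpy sketch of one common way to turn the Bevel node into an edge mask: take the dot product of the Bevel node’s rounded normal and the true geometry normal, which drops below 1.0 near edges. Node and socket names are the ones used in Blender builds that include the Bevel node; treat it as a starting point, not a recipe (and note it shares the inside/outside corner caveat mentioned above).

```python
import bpy

# Build a material whose "edge mask" value can drive a mix factor.
mat = bpy.data.materials.new("EdgeMaskExample")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

bevel = nodes.new("ShaderNodeBevel")          # samples a rounded normal
bevel.inputs["Radius"].default_value = 0.05   # example radius, tune per asset

geometry = nodes.new("ShaderNodeNewGeometry")

dot = nodes.new("ShaderNodeVectorMath")       # dot(bevel normal, true normal)
dot.operation = 'DOT_PRODUCT'

links.new(bevel.outputs["Normal"], dot.inputs[0])
links.new(geometry.outputs["Normal"], dot.inputs[1])

# Near edges the two normals diverge, so the dot product drops below 1.0.
# Remap it with a color ramp to taste, then use it as a mix factor.
ramp = nodes.new("ShaderNodeValToRGB")
links.new(dot.outputs["Value"], ramp.inputs["Fac"])
```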
Hey, I did not know there was a Bevel node! I would never have thought to look for a shading node called Bevel, thank you! :slight_smile:

@kesonmis: Yes, you are right here; in none of the renderers can the AO map drive displacement or other geometric effects like that, but that’s fine. Most people are not bothered by such a limitation, since they don’t use an AO/dirt map for that kind of stuff anyway. V-Ray and Corona have a special “Distance” map for that. Driving geometric effects is really one of the few rare cases where it’s a chicken vs egg kind of problem, and if those cases don’t work, that’s completely fine :slight_smile:

But I cannot agree with the “one sample is one ray” thing, because:

1, Cycles can do branched path tracing.

2, Branched PT is not even required for things like GI and glossy reflections, so it’s not required for AO either. AO is basically the same thing as GI, with only two differences: 1, you don’t evaluate the actual shader at the hit point, you just evaluate the ray length; 2, you terminate the rays after the specified distance is reached. As long as GI in Cycles works, AO will work too (see the sketch after this list).

3, If that were the case, the AO material would not work either, yet it works. You can simply think of the Cycles AO material as a hardcoded combination of an emission node with an AO map plugged into its color slot.
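To spell out the difference claimed in point 2, here is a schematic Python comparison (again a toy, not Cycles code; the plane-only `Scene.intersect()` stands in for the renderer’s BVH query): a GI ray has to evaluate the surface shader at its hit point and keep bouncing, while an AO ray only has to answer whether it hit anything closer than the cutoff distance.

```python
import math, random

class Scene:
    """Toy scene: an infinite occluding plane at z = 0 (placeholder for a BVH)."""
    def intersect(self, origin, direction):
        """Return the hit distance along the ray, or None if it misses."""
        if abs(direction[2]) < 1e-8:
            return None
        t = -origin[2] / direction[2]
        return t if t > 1e-6 else None

def trace_gi_ray(scene, origin, direction, depth):
    """GI ray: must evaluate the full shader at the hit point and keep bouncing."""
    t = scene.intersect(origin, direction)
    if t is None:
        return 1.0                                 # ray escaped: "sky" radiance
    if depth == 0:
        return 0.0                                 # path terminated
    hit = [origin[i] + t * direction[i] for i in range(3)]
    albedo = 0.8                                   # stands in for a full shader evaluation
    bounce = [random.gauss(0.0, 1.0) for _ in range(3)]
    n = math.sqrt(sum(x * x for x in bounce))
    bounce = [abs(x / n) if i == 2 else x / n for i, x in enumerate(bounce)]
    return albedo * trace_gi_ray(scene, hit, bounce, depth - 1)

def trace_ao_ray(scene, origin, direction, max_dist):
    """AO ray: only the hit distance matters; no shader eval, no recursion."""
    t = scene.intersect(origin, direction)
    return t is None or t > max_dist               # True = unoccluded

origin, down = (0.0, 0.0, 1.0), (0.0, 0.0, -1.0)
print(trace_gi_ray(Scene(), origin, down, depth=3))
print(trace_ao_ray(Scene(), origin, down, max_dist=0.5))
```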

As Sterling Roth pointed out with his example image, the question of whether it’s possible is now moot, because Cycles is already capable of producing a map based on tracing rays (the Bevel node works by tracing rays inside the geometry itself). In fact, it’s a method similar to a key part of what makes raytraced SSS shading work.

All we really need now is an AO/curvature node designed for outputting color data (which the bevel shading code should provide a foundation for).

Yep, so does anyone know how to bring this to the developers’ attention and persuade them to implement it? :slight_smile: (BTW I would not mix the terms AO and curvature. AO outputs data based on cast AO rays, while curvature doesn’t cast any rays but evaluates neighboring topology :slight_smile: Apples and bananas.)

Thank you both for the clarification, this kind of stuff is always useful to know. Definitely a +1 from me to raise the attention of the developers.

This is the dev page for the bevel patch. In it, Brecht specifically mentions wanting an AO node that can output color to a shader node tree, but there is no specific resolution on that topic.

https://developer.blender.org/D2803