Hi guys. I’m making a stylized grass shader for a scene, and it works great in Eevee but not in Cycles. The workflow is to distribute a bunch of planes on a terrain using Geometry Nodes, sample the terrain normals, and output them to the shader. But these custom normals don’t work well in Cycles. The interesting thing is that for each face, one side always gives the correct result while the other side doesn’t. I thought it was just a backfacing issue, but it isn’t: for some planes the front face is shaded correctly, and for others the back face is.
Does anyone know if it’s possible to make the custom normals behave the same way in Cycles?
I’m uploading some images and the file showing my issue.
What you’re seeing in Cycles has nothing to do with normals and everything to do with the fact that Cycles uses indirect light bounces between objects to create global illumination and ambient occlusion. Objects close to each other, like your planes, shadow each other in Cycles by nature and by design. To get the same effect, you’re going to have to turn the max bounces down to zero for everything in your Cycles light path settings.
Thanks for the answer. Do you mean these values, or is there something else?
Even at zero the planes still have the same appearance.
I could be wrong, but I don’t think it’s the bounces, because bounces would actually make surfaces brighter, and even ambient occlusion wouldn’t make the planes this dark (they look completely black from at least one side, and some of them are black on both sides). I also tried making the density very low so the planes are far apart and can’t affect each other, and the issue persists.
edit: to me it looks like Cycles handles custom normals in a different way, and for these planes I don’t know why it isn’t working. Even the faces that aren’t black are still shaded the wrong way.
Setting bounces to zero doesn’t remove the first bounce, from the visible surface to the light.
And the problem with custom normals is that they are still limited by the geometry: if a custom normal produces a reflection vector that hits the geometry itself, you’ll get a shadow (the ray never reaches the light).
So, in sum, there’s nothing wrong with Cycles… You’re just asking the renderer to do exactly that.
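To make that self-occlusion concrete, here’s a small numeric sketch (plain Python, not Blender code; the orientations are made-up illustrative values). Take a vertical grass plane with geometric normal Ng = +Y, shaded with the terrain’s transferred normal Nt = +Z. Cycles samples diffuse bounce directions in the hemisphere around Nt, and every sample that points behind the real geometry starts inside the plane itself:

```python
import math

# Hypothetical setup: vertical grass plane, custom normal from the terrain.
Ng = (0.0, 1.0, 0.0)   # geometric normal of the vertical plane
Nt = (0.0, 0.0, 1.0)   # custom normal transferred from the flat terrain

blocked = 0
total = 0
steps = 32
for i in range(steps):
    for j in range(steps):
        # grid of directions over the hemisphere around Nt (+Z)
        theta = (i + 0.5) / steps * (math.pi / 2)   # angle from Nt
        phi = (j + 0.5) / steps * (2 * math.pi)
        d = (math.sin(theta) * math.cos(phi),
             math.sin(theta) * math.sin(phi),
             math.cos(theta))
        total += 1
        if sum(a * b for a, b in zip(d, Ng)) < 0:   # points behind the plane
            blocked += 1

print(f"{blocked / total:.2f} of bounce directions hit the plane itself")
# -> 0.50 of bounce directions hit the plane itself
```

By symmetry, exactly half of the hemisphere around the custom normal lies behind the geometric surface, so a large share of the rays die against the plane before ever reaching a light.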
On the other hand, if you bypass Cycles’ illumination algorithm entirely (no lights or raytracing at all), it works as expected.
Here’s the same scene in Cycles, but all objects have fake illumination (and material) by using a MatCap instead.
So if I understood correctly, these custom normals basically point toward the surface itself, so the geometry occludes itself? Then achieving the same effect in Cycles would only work by ignoring the lighting altogether. Thank you for the explanation; it’s very interesting and I would never have thought of that. I’m making a scene that I’d like to have working in both Eevee and Cycles, and it looks like I’ll have to use a different method in each.
It’s just because both engines do completely different things…
While Eevee is more about painting triangles on a canvas, Cycles is more about tracing rays through a 3D space and accumulating the results in the pixels of the canvas.
Some things can be done in either one, but as they deal differently with all the light and geometry aspects of the scene, some effects that work in one engine won’t work in the other.
Yes, I understand the differences between a path tracer and a rasterizer. Your answer makes sense, but I still struggle to understand some cases. For example, I did another test with just a plane, turned all bounces off, and this happens:
If the normals are pointing up, why don’t both sides become black? Is it the front face that’s occluding the back face?
edit: actually, if I plug a Combine XYZ node into the shader with Z set to 1, both faces become black. I don’t know why the Data Transfer modifier doesn’t give the same result, since the surface is perfectly flat.
When you use a normal vector that’s perpendicular to the surface, most of the functions of a ray tracer can behave unexpectedly.
Materials rely on a combination of factors to work as we want: the geometry, the surface data, etc. When those aren’t correct, a deviation of a single thousandth is enough to throw the whole calculation into garbage.
Just as an example, most closures use dot(Ng, Nt) as some sort of factor; but if Ng (the geometric normal) and Nt (the textured normal) are perpendicular, that dot product is 0. If, further down the algorithm, the kernel needs to divide some value by it, you get a problem (nothing can be divided by 0).
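A toy illustration of that dot product going to zero (plain Python with made-up numbers, not actual Cycles kernel code):

```python
# Hypothetical values: a vertical plane shaded with an upward custom normal.
Ng = (0.0, 1.0, 0.0)           # geometric normal of a vertical plane
Nt = (0.0, 0.0, 1.0)           # custom normal pointing straight up

cos_factor = sum(a * b for a, b in zip(Ng, Nt))
print(cos_factor)               # 0.0 -- the normals are perpendicular

try:
    weight = 1.0 / cos_factor   # the kind of division that goes wrong
except ZeroDivisionError:
    weight = 0.0                # a renderer typically clamps this to black
print(weight)                   # 0.0 -> the face renders black
```

Any closure weight that carries this factor in a denominator ends up as 0, inf, or NaN, which the kernel has to clamp away.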
To make things even more complicated, there are all kinds of floating-point problems everywhere, and sometimes a value cannot be represented exactly by the machine, so it has to pick a nearby value that may be illegal.
Cycles has plenty of safety checks in place to avoid most of these situations, but when you force them in your scene, there’s nothing it can do. (In the same way, my phone doesn’t work underwater, but I still take it for a swim.)
“There are all kinds of floating point problems everywhere” — I was thinking exactly about that possibility for the normals perpendicular to the surface.
You have a very good technical knowledge. Are you a programmer?
Yes, I am. But I normally work on totally unrelated stuff (mostly front- and backend intranets).
And the floating-point problem is happening in your example too: if instead of a [0,0,1] vector you use [0,0,-1] as the normal (which makes even less sense), you get some illumination (though not in all samples)!