shoreline with mesh/global distance or pixel depth offset for water

Hello,

Is it possible to create a shoreline in Blender like the global distance field or pixel depth offset features in Unreal Engine?

I have seen a possible solution in this video, but it is made with dynamic paint, which is not good for me:

Rather, I would like to create something like this…

Shoreline with global distance field in Unreal Engine:

(global distance field image data)

Shoreline with distance field:

Shoreline with pixel depth offset in Unreal Engine, using the Blueprint node system:

https://www.fantasyarchitectures.com/pixel-depth-offset

Creating a shoreline in 3ds Max:

Each render engine will require different methods… (all of them fakes, unless you fully simulate the fluid interaction with other objects).

So, in order to get a better answer, you need to tell us which engine you are planning to use.

Hello, thanks for the answers.

I would like to create this shoreline effect in the Cycles or Eevee render engine with the Blender node system, without using dynamic paint or a particle system. I think it may be possible to create something similar to Unreal/3ds Max/other software with the node system.

In Cycles, if you really want to avoid dynamic paint, you need to use OSL: trace rays to check for nearby geometry and act on the result.
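Something along these lines, for example. This is only a minimal OSL sketch of that idea (the shader name, parameters and ray pattern are made up for illustration), assuming a roughly flat, horizontal water surface:

```osl
// Cast a few rays outward from the water surface and turn the closest
// hit distance into a shoreline mask. Needs CPU rendering with
// "Open Shading Language" enabled in Cycles.
shader shoreline_distance(
    float MaxDistance = 1.0,   // foam fades out beyond this distance
    int   Samples = 8,         // number of horizontal probe rays
    output float Mask = 0.0)
{
    float nearest = MaxDistance;
    point origin = P + 0.001 * N;   // nudge off the surface to avoid self-hits
    for (int i = 0; i < Samples; i++) {
        float a = 2.0 * M_PI * i / Samples;
        vector dir = vector(cos(a), sin(a), 0.0);   // rays fanned out in the water plane
        if (trace(origin, dir, "maxdist", MaxDistance)) {
            float d = MaxDistance;
            getmessage("trace", "hitdist", d);
            nearest = min(nearest, d);
        }
    }
    // 1.0 right against the shore, fading to 0.0 at MaxDistance
    Mask = 1.0 - clamp(nearest / MaxDistance, 0.0, 1.0);
}
```

Plug Mask into a Mix Shader (or through a color ramp) on the water material to blend in foam near the shore. It is per-pixel raycasting, so expect it to be slow compared to a baked mask.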

For Eevee, dunno… it’s not finished yet.

If you need a really cheap waterline effect, you can just use the Z-coordinate from the Geometry > Position node. It won’t really work if objects move out of the water, or if your water is animated. But if you just want a wet effect on static objects near a static waterline, this type of setup will do the trick and has very little performance cost:


(The Add node changes the height of the waterline, and the Multiply math node adjusts the width of the waterline. You can also do that with a ramp node if you prefer.)
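For reference, this is roughly the same math those nodes compute, written out as a tiny OSL snippet (names are mine, just for illustration):

```osl
// Z-based waterline mask, equivalent to the node setup described above.
// WaterLevel plays the role of the "add" value, Sharpness of the "multiply" value.
shader waterline_z(
    float WaterLevel = 0.0,     // Z height of the waterline
    float Sharpness = 10.0,     // higher = narrower wet band above the waterline
    output float Wetness = 0.0)
{
    // P is the world-space shading position, same as Geometry > Position
    Wetness = clamp((WaterLevel - P[2]) * Sharpness, 0.0, 1.0);
}
```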

@J_the_Ninja, unfortunately that will only work for the objects that ‘touch’ the water, but not for the water itself.
For making some kind of foam, waves or ripples near other objects, the best way would be to use Dynamic Paint… But the OP doesn’t want it, so OSL is their only possibility…

^^ Could you not use an object to drive the foam masking (like a displaced plane which intersects with the water at certain points), but set the displaced plane to not render?

Thanks, guys, for the help.

I would prefer to use the Eevee engine and only the node system, with moving objects and animated water (a mesh with a normal map), not OSL and scripting. I found in the Eevee roadmap doc that the pixel depth offset feature will be implemented (or maybe is done already?); maybe this is an option for me?

https://wiki.blender.org/index.php/Dev:2.8/Source/Viewport/Eevee/Roadmap-SeptDec-2017

I don’t know how exactly pixel depth offset will be implemented, but AFAIK you won’t get nearby geometry distances; only the depth difference between the pixel you’re drawing and the pixel right below it.
This will be a view-dependent effect, giving different results from different points of view, and probably with some surprises depending on the way objects are sorted for drawing… So, most probably, you’ll need to use dynamic paint, or some other pre-render computation (a bake), in order to get what you want.

Again, as Eevee is still under (heavy) development, any support here is discouraged.

There are a lot of things Cycles simply cannot do without pre-computing or baking. Whenever I have to do something like that, I usually have to manually paint my own mask. Whenever I use dynamic paint, it’s either too slow or it locks up.

If I have memory to waste, however, I just use the Point Density texture to read the verts of the object; a decent radius and a little noise go a long way.

Obviously the wasted memory is extreme, since we are using it only for a 2D surface and not a 3D volume, but whatever works. In fact, I have had a pretty good workflow of baking the output of the Point Density texture to a mask texture.

Would be nice if we had some kind of “geometry intersection” node with a distance gradient/radius.

Now that you mention it… we sort of have it.

The Bevel node does more or less that: it throws rays in every direction and averages the normal vectors from the hits. If we take the dot product of that with the original normal…

It’s not perfect, and it won’t really be a distance to nearby geometry, but a difference in nearby normals (especially from a plane) indicates that there is nearby geometry.

The only thing is that all meshes must be joined into the same object, because the Bevel node only uses a private BVHTree. But we can still have different materials for each joined mesh.
(This can, however, influence the results of other shaders that use the same BVHTree, like SSS.)
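For anyone who wants the same behaviour without the Bevel node, here is a rough OSL sketch of what that trick approximates (names, ray pattern and angles are made up, and it assumes a mostly flat, upward-facing water surface):

```osl
// Gather normals from nearby surfaces and compare their average with the
// shading normal; where the average bends away, something is close by.
shader nearby_normal_difference(
    float Radius = 0.5,        // how far to look for neighbouring geometry
    int   Samples = 16,        // rays fanned around the up axis
    output float Mask = 0.0)
{
    normal avg = N;            // start with our own normal, like the Bevel node
    int    hits = 1;
    point  origin = P + 0.001 * N;   // nudge off the surface to avoid self-hits
    for (int i = 0; i < Samples; i++) {
        float a = 2.0 * M_PI * i / Samples;
        // rays tilted 45 degrees above the water plane, all around the point
        vector dir = normalize(vector(cos(a), sin(a), 1.0));
        if (trace(origin, dir, "maxdist", Radius)) {
            normal hn = N;
            getmessage("trace", "N", hn);
            avg += hn;
            hits += 1;
        }
    }
    avg = normalize(avg / hits);
    // 0.0 over open water, rising where nearby geometry bends the average normal
    Mask = clamp(1.0 - dot(avg, N), 0.0, 1.0);
}
```

Unlike the Bevel trick, trace() sees the whole scene, so the meshes don’t have to be joined into one object; the downside is that it only runs on the CPU with OSL enabled.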



Which is another nice workaround.
But again, there’s always a catch… normals instead of distance, and everything must be the same mesh. Sigh.

Nice progress :slight_smile:

Another thought: we have real-time ambient occlusion in Blender by default, in the 3D View properties panel under the shading options.
It is a very similar thing: it calculates the distance to nearby surfaces and darkens the surface accordingly. If we had an AO distance node, could we use that data to drive a gradient color/texture?

Similar to the signed distance field AO in Unreal:

https://docs.unrealengine.com/latest/INT/Engine/Rendering/LightingAndShadows/DistanceFieldAmbientOcclusion/

Yup… and that makes it particularly difficult to animate!!

That’s why OSL is (for now) a better solution: not only can we throw rays in any direction we want, but we can also query a specific set of objects and get positions, distances, normals, or other attributes from the hit. We can even do some fancy optimizations on the end result.
Though, as raycasting while rendering is still a heavy process, it’s better to bake that info and use the bakes for the final render.

First: it’s not progress. It’s a clumsy hack (a method I personally would never use!).

And that paper won’t work with Cycles (Cycles and OpenGL simply don’t work the same way).
It could work with Eevee, if someone implements it in the source code… but it’s too early to say.

I am kinda hoping that, with Eevee coming, we could have an easier way to do such things, like easy input of Eevee outputs to Cycles, for example some of the screen-space stuff/shader results (not compositing, but texture access). It may not be physically accurate, but as Unreal has proven, the rendered result is what matters.

I never use OSL for rendering because it’s so damn slow; however, using it for baking maps ahead of time is fine.

In the end, it would still be preferable to have a built-in “mesh intersection distance” as an additional option to “normal difference”.

It’s possible to use ScreenSpace in Cycles (both SVM and OSL)… Actually, some OSL stuff is only available in screen space, like the Dx() and Dy() functions; and it’s always possible to transform any coordinate system from and into ScreenSpace (with the Vector Transform node in SVM and the transform() function in OSL).
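For example (just a toy snippet, names are mine):

```osl
// Toy example of the two things mentioned above: transform() into
// normalized screen coordinates, and the Dx()/Dy() screen-space derivatives.
shader screen_space_demo(
    output point ScreenPos = point(0.0),
    output float PixelSize = 0.0)
{
    // world-space position of this sample expressed in NDC (roughly 0..1 across the frame)
    ScreenPos = transform("NDC", P);
    // roughly how far P moves between neighbouring pixels
    PixelSize = length(Dx(P)) + length(Dy(P));
}
```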

I never use OSL for rendering because it’s so damn slow; however, using it for baking maps ahead of time is fine.

Yes… unfortunately OSL is a bit slower than SVM alone on the CPU. But there are some tricks to speed it up… The most important thing is to pack all textures into the blend file!! This makes quite a difference in rendering times (though still slower than the SVM counterpart).

In the end, it would still be preferable to have a built-in “mesh intersection distance” as an additional option to “normal difference”.

That could be done… The code is not that different from the Bevel node (or even from the AO shader), and in practice, retrieving a normal or a distance doesn’t require anything fancy. The only problem with the current algorithm has to do with faces that go behind other faces, sometimes producing unwanted results… Fixing that would require a new method, and it would probably be slower.