GLSL Shader nodes for procedural snow generation?

Could anyone point me to a BGE-compatible shader node setup that does something similar to this:

Thx!

*** Moderation ***
moved from Game Engine Resources
because it is a support request
*** End of Moderation ***

Out of curiosity, is there a strong need for this to be real-time? It would probably save a lot of render cycles if the effect was simply baked in with geometry and spec/diffuse/bump/normal maps.

Actually yes, as I have a cool gameplay idea that would need to dynamically modify that shader.
Besides the idea is to be able to use multiple copies of the same mesh in different direction in the game and each time have the different snow pattern give it a unique look automatically.

But as the above example is realtime too, and such shaders have been use in commercial games already quite a bit, I don’t see why it can’t be realtime :slight_smile:

I didn’t have time to watch the entire 14-minute video. Do they provide the GLSL source for the shader they are using?

Sadly they don’t, or at least the link to the UDK sample file (which might or might not include the shader source in an easily accessible form) is broken.
A Google search gave me a lot of similar examples and papers explaining such techniques, so I guess it should be possible to find at least some sort of shader source file (though not necessarily GLSL).

What about using a displacement on each “clump”, made from a copy of the top of each “mesh chunk” extruded slightly?

Averaging the heights of the edge sections together would “blend” the sections…

???

You could use top-down ray casting to see which faces are facing up, or just use another material on the top faces to identify which faces will be “snow generating”.

:smiley:

Unless the snow is made by distorting the geometry that is already there, I don’t know how to do it.

If I load in a mesh, can I join it in-game?

Smooth shading doesn’t like my method…

Attachments

SnowBlend.blend (454 KB)

If you can find some kind of source for this effect then that is the place to start. It doesn’t matter if it is HLSL (DirectX), GLSL, or Cg, it is relatively easy to convert one to the other.

It’s just altering the texture, with no geometry displacement at all. Or did I misunderstand what your problem is?

http://www.zspline.net/blog/deposit-shader-in-mentalmill/
Might help? It’s even CC-by-SA :slight_smile:

From the article: “First the normals of the surface are converted to world space. […]”
That is the problem: the Blender material nodes don’t offer world-space coordinates.
(I also wanted to create the very same effect, opened a thread on it, and had to discover this bitter truth.)

But maybe the video on the UDK material is more helpful, as it describes what sounds more like a workaround. Maybe that can be translated to Blender nodes as well.

Hmm, yeah, that sure is a useful bit of (rather unfortunate) information. But the UDK implementation seems to be the more efficient one anyway, so maybe that could be replicated. Dunno… but I might try it out myself (though I am a total shader n00b).

OK, I played around with it a bit, but I can’t find an option to input the “object world position” (as it is called in the UDK editor) either. It seems like materials in Blender are totally “unaware” of the state of the object they are covering :frowning:
However, as I said, I’m a total beginner with this stuff, so I might have missed something.

Edit: Ahh, it seems world-space normals will be supported in the new Candy branch that is currently under development :slight_smile:

Yay :smiley:

Score another 1 for the moogle :smiley:

That seems odd to me. Usually, you wouldn’t pass in a world-space coordinate anyway. Vertex positions are usually specified in object-space and normal-maps are usually specified in tangent-space. The ModelView matrix is passed in to the system and this lets you move the other coordinates into world-space.

I think you need a few more matrices to do the normal mapping, but since Blender can do normal mapping in the BGE, these must be available to the system.

The world-space normals are used to determine whether an area of the model is facing up, as only those areas are to be covered with snow. The effect is supposed to be dynamic, i.e. you can rotate the mesh any way you like and it will always cover only the areas facing up in world space.

Obviously Blender knows about world-space normals internally, but they aren’t exposed through the material nodes yet.
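The technique described above can be sketched in plain Python (no BGE or shader API involved; `rotate_x`, `snow_factor` and the 0.5 threshold are just illustrative assumptions): rotate the object-space normal into world space, then use how strongly it points up as the snow blend factor.

```python
# Sketch of the world-space snow mask: rotate each object-space normal into
# world space, then blend snow in from how much it points toward world "up".
import math

def rotate_x(normal, angle):
    """Rotate an (x, y, z) vector around the X axis by `angle` radians."""
    x, y, z = normal
    c, s = math.cos(angle), math.sin(angle)
    return (x, y * c - z * s, y * s + z * c)

def snow_factor(world_normal, threshold=0.5):
    """1.0 where the surface faces straight up, fading to 0 below `threshold`."""
    up_dot = world_normal[2]  # dot((x, y, z), (0, 0, 1)), with Z as world up
    if up_dot <= threshold:
        return 0.0
    return (up_dot - threshold) / (1.0 - threshold)

# A face whose normal points straight up gets full snow:
n = rotate_x((0.0, 0.0, 1.0), 0.0)
print(round(snow_factor(n), 2))  # 1.0

# Rotate the object 90 degrees and the same face gets none:
n = rotate_x((0.0, 0.0, 1.0), math.pi / 2)
print(round(snow_factor(n), 2))  # 0.0
```

In a real shader the rotation would of course come from the object’s world matrix rather than a hand-rolled X rotation, but the dot-with-up blend is the core of the effect.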

Ah, I figured that this was going to be a full custom GLSL shader as opposed to trying to do this with the Blender material nodes.

I could do that with rays right now :smiley:

If a ray hits a face, change the material on that face.

http://www.tutorialsforblender3d.com/GameModule/ClassKX_PolygonMaterial_1.html

or at least I think it’s possible to use this + polyProxy and rays to “rain snow”

have 3+ levels of “whiteness” for each face on a UV map,

apply the UV-mapped texture one face at a time

:smiley:
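As a rough illustration of the per-face “whiteness levels” idea (plain Python only — no polyProxy or BGE ray calls; `snow_hit` and `SNOW_LEVELS` are made-up names for this sketch):

```python
# Each face keeps a snow level; a downward "snowflake" ray hitting it bumps
# the level up one step, capped at the number of whiteness textures available.
SNOW_LEVELS = 3  # e.g. dusted, covered, deep

def snow_hit(face_levels, face_id):
    """A snowflake ray hit face `face_id`: raise its level, capped at SNOW_LEVELS."""
    face_levels[face_id] = min(face_levels.get(face_id, 0) + 1, SNOW_LEVELS)
    return face_levels[face_id]

levels = {}
snow_hit(levels, 7)
snow_hit(levels, 7)
print(levels[7])  # 2
```

In the BGE the `face_id` would come from the polygon proxy returned by the ray cast, and the level would pick which whiteness texture (or vertex colour) the face gets.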

I use polyProxy for the 3d logic system I just got up and running… but I link each face material to a property to pass data instead :smiley:

Space bar = 3d Keyboard Sensor

all nodes are technically controllers

Spawned cubes = Edit Object / Add Object actuator

Attachments

3dLogicNodeRotate.blend (900 KB)

Hmm, that sounds like a computationally expensive solution that doesn’t only use GLSL on the GPU.
It’s cool that you found some sort of workaround though.
As for me, I’d rather wait and implement it with world-space normals.

The thought here was that each snowflake could be randomly generated (on a “cloud plane”).

Snow spawn - cast a ray down
distance / fall speed
after X frames

color = color + (.1, .1, .1) on the face that was hit…

It would at least be interesting :smiley:

really snowing in a game…

You couldn’t sink into it, though…

have the sun hitting the texture do color = color + (-.01, -.01, -.01)
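The accumulate/melt loop above can be sketched like this (toy Python, one face; `accumulate` and the step sizes are just the numbers from the posts above, not a real BGE API):

```python
# One frame of the snow loop: a snowflake hit brightens the face colour by 0.1,
# a frame of sunlight darkens it by 0.01, clamped to the 0..1 range.
def accumulate(color, hit, in_sun):
    step = (0.1 if hit else 0.0) - (0.01 if in_sun else 0.0)
    return tuple(min(1.0, max(0.0, c + step)) for c in color)

c = (0.2, 0.2, 0.2)
c = accumulate(c, hit=True, in_sun=False)   # a flake lands
print(round(c[0], 2))  # 0.3
c = accumulate(c, hit=False, in_sun=True)   # the sun melts a little
print(round(c[0], 2))  # 0.29
```

The clamp matters: without it, heavy snowfall would push the colour past white, and long sunlight would drive it negative.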