Here’s my implementation of Eric Lengyel’s halfspace fog (link to PDF article), originally published in Journal of Graphics Tools, Vol. 12, No. 2 (2007).
The GLSL filter is paired with a script that updates the uniforms, as the filter needs the camera position and orientation to render the fog correctly in world space.
While it may look like just a height fog, it’s called halfspace fog because the fogged halfspace can actually have any orientation:
The position and orientation are given by the empty containing the filter, as you would expect.
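The core idea of halfspace fog can be sketched outside GLSL: the fog amount depends on how much of the camera-to-fragment ray lies inside the halfspace defined by a plane. Here is a minimal Python sketch under that assumption (the function and parameter names are mine, not from the filter, and the real shader integrates density rather than just returning a length):

```python
import math

def fogged_length(camera, point, normal, d):
    # Signed distances to the plane; the negative side is the fog halfspace.
    a = sum(n * c for n, c in zip(normal, camera)) + d
    b = sum(n * p for n, p in zip(normal, point)) + d
    length = math.dist(camera, point)
    if a <= 0 and b <= 0:       # both endpoints inside the fog
        return length
    if a > 0 and b > 0:         # both endpoints outside: no fog on this ray
        return 0.0
    # One endpoint on each side: only the part inside the halfspace is fogged.
    return length * (min(a, b) / (min(a, b) - max(a, b)))

# Horizontal plane z = 0, fog below it; camera above, fragment below:
fogged_length((0, 0, 5), (0, 0, -5), (0, 0, 1), 0.0)  # → 5.0
```

Because the plane can have any normal, tilting `normal` tilts the whole fog volume, which is what makes this more general than a plain height fog.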
Made in Blender 2.79, works with UPBGE 0.2/0.3.
The option ‘UseMistColor’ allows using scene.world.mistColor as the color source for the fog, which is convenient for changing the color in-game rather than changing each RGB game property manually. By default, the mist color is the same as the background/horizon. This option doesn’t work in UPBGE 0.3.
Now it is possible to change the color of the fog with the color of the empty:
from bge import logic
scene = logic.getCurrentScene()
FogEmpty = scene.objects['HalfspaceFog']
FogEmpty.color = [0.4, 0.5, 0.6, 1.0]  # RGBA; alpha is required but not used.
It is initialized with the game properties ColorR, ColorG, ColorB.
WOW… this is really cool.
I guess though there’s no start or fade in this type? Meaning I can’t set how far from the camera the fog starts.
A kind of best-of-both-worlds scenario: what the UPBGE_0.3_glsl_mist does, but with height control like this one.
I don’t know if that kind of setting is possible with this fog. Maybe try enabling Squared and setting the density really low, like 0.1. I’m personally not very fond of fog that starts at a certain distance from the camera; it makes me feel like I can never reach the fogged area.
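The Squared suggestion can be illustrated with the two classic exponential fog curves; this is a generic sketch of exp vs. exp² fog, not the filter’s exact code, and the function name is mine:

```python
import math

def fog_factor(distance, density, squared=False):
    # Exponential fog: fraction of light transmitted through the fog.
    # The squared variant stays nearly clear close to the camera.
    x = distance * density
    return math.exp(-(x * x)) if squared else math.exp(-x)

# With a low density, squared fog barely touches nearby geometry:
fog_factor(1.0, 0.1)                # ≈ 0.905 transmitted
fog_factor(1.0, 0.1, squared=True)  # ≈ 0.990 transmitted
```

That slow start near the camera is why Squared plus a low density can mimic a “fog start distance” without a hard cutoff.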
Just getting some time to play with this… I think the fog density value works great for the distance-from-camera thing. This looks so good… can’t wait to drop it into my project. Looking forward to sharing it with y’all.
Thank you for these! It looks much smoother than using Volume Scatter (which always flickers, at least on my PC).
Do I understand it correctly that this is a kind of elegant workaround for the Fragment- and Vertex-shaders like they used to work in previous BGE?
It’s likely that Eevee uses ray marching or a similar method for its volumetrics. This filter is not volumetric, but it’s called a fog volume because the fog is limited to the volume of a shape. This is how it works: a tracing function traces the shape with a ray from the camera and returns the entrance and exit intersection points. The distance between the exit point (or the geometry, if it’s closer) and the entrance point is used to calculate the fog. If the distance to the geometry is less than the distance to the entrance point (which is always greater when the ray doesn’t intersect the shape), there is no fog.
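That entrance/exit logic can be illustrated with an axis-aligned box as the shape (the actual filter traces whatever shape it is given; this simplified Python sketch assumes a unit-length ray direction, and the names are mine):

```python
def ray_box_fog_distance(origin, direction, box_min, box_max, geom_dist):
    # Slab test: find the ray's entry/exit parameters against the box.
    # t_near starts at 0 (the camera), t_far is capped by the geometry hit.
    t_near, t_far = 0.0, geom_dist
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:
            if o < lo or o > hi:
                return 0.0          # ray parallel to this slab and outside it
            continue
        t0, t1 = (lo - o) / d, (hi - o) / d
        if t0 > t1:
            t0, t1 = t1, t0
        t_near, t_far = max(t_near, t0), min(t_far, t1)
    # Exit point (or the geometry, if closer) minus the entrance point;
    # a negative span means the geometry is in front of the fog: no fog.
    return max(t_far - t_near, 0.0)
```

For example, a ray starting 5 units in front of a unit cube and passing through it gets a fogged span of 2 units, and a wall at distance 3 (in front of the cube) kills the fog entirely, just as the post describes.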
A volumetric fog (sun lamps shadows only) with the traditional ray marching technique.
These are the game properties you should pay attention to:
Sun Lamp: the name of the sun lamp whose shadows you want in the fog.
Max_Distance: the maximum distance from the camera to ray march the fog. Beyond this distance, the fog stops existing (more on this later). If the sun follows the player/camera, you shouldn’t set this to a value greater than the visible shadow distance in front of the camera.
samples: this determines the quality of the fog; more samples = better quality, worse performance. It’s the number of steps or layers used to make the fog; dividing Max_Distance by samples gives you the distance between steps: 25.0 / 70 ≈ 0.36 meters or BU. The steps are dithered with random noise to reduce flickering. More samples make the noise less visible, but if a patch of fog between shadows is smaller than the step size, the noise will still be visible.
Full fog: enabling this option extends the fog beyond the maximum distance with a basic ranged fog. This covers the sky/background, more or less depending on the camera’s clip end. Alternatively, you can use the default mist and set its start to the Max_Distance value; you may need to experiment with the falloff type to preserve continuity between the two fogs.
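The relationship between Max_Distance, samples, and the dither can be sketched in a few lines of Python; this is a toy version of the march loop (the `in_shadow` callback and function name are placeholders of mine, not the filter’s API):

```python
import random

def march_fog(max_distance, samples, in_shadow):
    # Step size is Max_Distance / samples, e.g. 25.0 / 70 ≈ 0.36 BU.
    step = max_distance / samples
    # A random per-ray offset dithers the step positions to hide banding.
    offset = random.random() * step
    # Count the steps that are lit; shadowed steps contribute no in-scattering.
    lit = sum(1 for i in range(samples)
              if not in_shadow(offset + i * step))
    return lit / samples    # fraction of the ray receiving light
```

This also shows why a sliver of lit fog narrower than `step` stays noisy: on some rays the dithered step lands inside it and on others it doesn’t, no matter how the offset is chosen.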
The color of the fog is given by the color of the cube object’s material, but it is also affected by the color of the sun lamp: a green sun will tint the fog green.
The Mie scattering can be disabled or adjusted with the corresponding properties.
For performance reasons, samples, Mie Scattering, Full fog and Debug fog are “hard-coded” into the shader when it’s first created and won’t directly affect the fog if they are changed in-game. To apply the changes you need to execute the script again; resetting the Always sensor will do.
It supports the Simple and VSM shadow types. The resolution of the shadow buffer affects performance.
About UPBGE compatibility

In order to get the shadow buffer working in the shader, it’s necessary to bind it with the OpenGL wrapper module (bgl). In UPBGE 0.2, the filter works in the embedded player, but in the standalone player glActiveTexture, essential for the buffer binding, raises a TypeError: 'NoneType' object is not callable, meaning the filter won’t work. Maybe it’s just me, but I get this on both Windows and Linux. Overall, I found the bgl module to be less reliable in UPBGE, which is why I do practically everything in BGE. Check the next post for a UPBGE 0.2 compatible blend.
In UPBGE 0.3, you can’t get the shadow bind ID through the sun lamp: shadowBindId was probably removed along with the game engine, and since sun lamps use cascaded shadow maps, there’s more than one shadow buffer per lamp.
So I remembered that UPBGE has a filter manager, which allows binding textures by bind ID without using the OpenGL wrapper, thus making the filter fully supported in the standalone player. Check the previous post for all the info about the filter.
As stated, it’s only for sun lamps. Spotlights use a different type of shadow matrix: sun lamps use an orthographic projection, while spots use a perspective projection. But even if the matrix is changed to the right type:
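The orthographic-vs-perspective distinction comes down to the perspective divide after the matrix multiply. A toy Python sketch with deliberately simplified matrices (these are not actual BGE shadow matrices):

```python
def project(matrix, point):
    # Apply a 4x4 matrix to a 3D point in homogeneous coordinates,
    # then do the perspective divide by w.
    x, y, z, w = (sum(row[i] * c for i, c in enumerate((*point, 1.0)))
                  for row in matrix)
    return (x / w, y / w, z / w)

# Orthographic (identity here): w stays 1, so the divide is a no-op.
ortho = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
# Perspective-style last row: w = -z, so distant points shrink.
persp = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, -1, 0]]

project(ortho, (2.0, 4.0, -2.0))  # → (2.0, 4.0, -2.0)
project(persp, (2.0, 4.0, -2.0))  # → (1.0, 2.0, -1.0)
```

So shadow lookups written for a sun lamp’s orthographic matrix can’t simply be fed a spotlight matrix: the per-fragment divide (and the resulting non-linear depth) changes the math inside the fog loop too.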
Conventionally in BGE, they talk about “environment lighting” for that global base lighting, and there’s no such thing yet as ambient lighting, which is just indirect lighting (light bouncing off objects). (Maybe I’m wrong.) I ask because I see that you can do more complex things than simply applying an artistic 2D effect on the rendered picture… which, to me, was all they were able to do.
I’m really looking forward to seeing how far UPBGE can be hacked to achieve good-looking results with good performance (playing with layers, 2D filters, material nodes, baked textures, Python shaders… etc.)
I will be happy the day UPBGE can reach the performance/quality of a game like The Stanley Parable (many lamps and vertices in a scene).
A while back I tried implementing the Reflective Shadow Maps technique for indirect lighting. I sort of got it working, but it was really slow. Unlike ambient occlusion, where you sample within a rather small radius to get occlusion in corners or near objects, with indirect light you want it to travel a bit further. That means a bigger radius, which results in worse performance, plus more samples to get rid of the noise, and it’s only for one light source. It was rather a disappointment, or maybe it was my basic implementation’s fault.
Later I checked indirect lighting with Radiance Hints, which is used in Tesseract. It is based on the reflective shadow maps method but requires rendering 3D textures, and while that’s technically not impossible in BGE, I don’t know if Python and 2D filters are the best way to implement it.
It appears to be a macOS-related issue. While texture2D has been deprecated, its usage is usually met with a deprecation warning, not a full-blown error.
Anyway, try replacing texture2D with texture in the shaders.
Do you think you can merge yours with the light scattering 2D filter? Yours follows the sun direction, which is more realistic and doesn’t give awkward results (unlike the light scattering, which is a camera projection and so moves with the camera… which is cool to some extent, but will look stupid if your character doesn’t move while your camera does)…
But do you think you can somehow use the sky dome normals to make your shader do the same job as the light scattering?
Note that the filter is an implementation of the Volumetric Light Scattering technique from NVIDIA’s GPU Gems 3. It actually requires the position of the light in screen space, but that wasn’t added to the filter; the position coordinates are hard-coded.
For my volumetric fog I use the shadow matrix, which is why the shadows are properly aligned, but I do use the position of the sun projected into screen space for the Mie scattering.