If I remember correctly, I included a refine function in the shader (I can't remember whether it is actually used); it does a binary search along the ray to improve the result when a hit is registered. As a first step, that reduces staircasing quite a bit.
Note that a hit needs to be registered for this refine step to take place, so edges of geometry will keep looking like a staircase; it is on smooth surfaces that the discontinuities are reduced.
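To illustrate the idea, here is a minimal 1D Python sketch (a stand-in, not the actual shader code): a coarse linear march overshoots the surface, then a binary search between the last two samples narrows down the hit point.

```python
# 1D stand-in for the refine step: surface_depth plays the role of a
# depth-buffer lookup along the ray's screen-space path (a hypothetical
# flat surface at depth 0.5).

def surface_depth(t):
    return 0.5

def ray_depth(t):
    return t  # the ray's depth grows linearly along the march

def march_and_refine(step=0.2, refine_steps=8):
    # Coarse linear march: stop at the first sample behind the surface.
    t_prev, t = 0.0, 0.0
    while ray_depth(t) < surface_depth(t):
        t_prev, t = t, t + step
    # Binary search between the last two samples; each iteration halves
    # the remaining uncertainty, which is what smooths the staircasing.
    lo, hi = t_prev, t
    for _ in range(refine_steps):
        mid = 0.5 * (lo + hi)
        if ray_depth(mid) < surface_depth(mid):
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

t_hit = march_and_refine()  # close to the true intersection at t = 0.5
```

Eight refinement iterations shrink the coarse 0.2 step down by a factor of 256, which is why the refined hit lands far closer to the surface than the raw march ever could.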
I went all out and implemented GGX sampling and the Cook-Torrance BRDF, but the cheaper Blinn-Phong sampling and BRDF would probably not look much worse.
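For reference, GGX importance sampling of the half-vector is only a few lines. This is a hedged Python sketch in tangent space (normal at (0, 0, 1)), not the shader's actual GLSL; the function name is my own:

```python
import math

def sample_ggx_half_vector(roughness, u1, u2):
    """Importance-sample the GGX normal distribution in tangent space.

    Returns a unit half-vector around the normal (0, 0, 1); u1 and u2
    are uniform random numbers in [0, 1).
    """
    a = roughness * roughness  # common "alpha = roughness^2" remapping
    # Invert the GGX CDF to get the half-vector's polar angle.
    cos_theta = math.sqrt((1.0 - u1) / (1.0 + (a * a - 1.0) * u1))
    sin_theta = math.sqrt(max(0.0, 1.0 - cos_theta * cos_theta))
    phi = 2.0 * math.pi * u2
    return (sin_theta * math.cos(phi), sin_theta * math.sin(phi), cos_theta)

h = sample_ggx_half_vector(0.1, 0.5, 0.25)  # low roughness: h hugs the normal
```

For low roughness the sampled half-vectors cluster tightly around the normal (mirror-like reflection); higher roughness spreads them out, which is what produces the blurry reflections.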
Using a geometry-aware blur is the most common way to reduce noise, so that could be used.
The raymarching algorithm I used is honestly quite naive. There are many optimizations that could be applied (multiplying the step size by the ray's current end z coordinate on each step is an easy one that reduces the number of steps needed to hit something and cuts redundant pixel sampling), but this basic approach can't be improved much further (it might run twice as fast at most).
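To show what the step-size trick buys you, here is a rough Python sketch (step counting only, under the assumption of a simple march in view-space z; the depth test is omitted):

```python
# Compare a fixed-step march against one whose step is scaled by the ray's
# current z each iteration, so distant stretches take fewer, larger steps.

def count_steps(z_start, z_end, base_step, scale_by_z):
    z, steps = z_start, 0
    while z < z_end:
        z += base_step * (z if scale_by_z else 1.0)
        steps += 1
    return steps

fixed = count_steps(1.0, 100.0, 0.05, scale_by_z=False)   # ~2000 steps
scaled = count_steps(1.0, 100.0, 0.05, scale_by_z=True)   # ~95 steps
```

The scaled march is effectively geometric (z grows by 5% per step), so the step count grows with the logarithm of the depth range instead of linearly, at the cost of coarser sampling far from the camera.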
State of the art for raymarching height maps (or, in this case, depth maps) involves building max-mipmaps of the height map every frame (which, as far as I know, is not something that can be achieved easily or performantly in Blender) and doing a binary search on that.
That algorithm is many times faster and gives far higher quality results than what I have implemented (I believe it is what most AAA games use nowadays).
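For what it's worth, the max-mipmap construction itself is simple; here is a pure-Python sketch (on a GPU this would be one downsampling pass per mip level):

```python
# Build a max-mipmap pyramid over a square, power-of-two depth map.
# Each mip texel stores the maximum of the 2x2 texels below it, so a
# raymarcher can skip whole regions the ray provably cannot hit.

def build_max_mipmaps(depth):
    levels = [depth]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        n = len(prev) // 2
        levels.append([[max(prev[2*y][2*x], prev[2*y][2*x+1],
                            prev[2*y+1][2*x], prev[2*y+1][2*x+1])
                        for x in range(n)] for y in range(n)])
    return levels

base = [[0.1, 0.2, 0.3, 0.4],
        [0.5, 0.6, 0.7, 0.8],
        [0.9, 1.0, 0.2, 0.1],
        [0.3, 0.4, 0.5, 0.6]]
mips = build_max_mipmaps(base)  # mips[1] is 2x2, mips[2] is the 1x1 maximum
```

During the march, the ray is tested against a coarse mip first; only where the ray dips below that conservative maximum does the search descend to finer levels, which is where the big speedup comes from.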
Those are pretty much all the improvements i can think of right now.
To be honest, I had kind of abandoned this, but you guys made me want to come back and continue working on it, so you might see an update with some of these changes in the next few days.
GGX sampling sounds great. Much better than Blinn-Phong.
You can use mipmaps in GLSL filters in UPBGE.
Also, you do want to use 16-bit normals for optimal quality and less stepping. If https://github.com/UPBGE/blender/issues/718 gets done, you may get a way to read normals at a higher bit depth.
That sounds like a great idea, but since I am a student and training for IOI2018, I don't have much time to keep up with development.
Also, I wouldnāt want to hold anything or anyone back with my sloppy code.
I agree, it can be a bit more expensive though.
Does that include max mipmaps (i.e. each sample (pixel) in a mipmap is the maximum of the samples it covers in the previous mipmap), or is it limited to only average mipmaps?
If the former is true, I'd be pleasantly surprised!
Thanks a bunch for your detailed reply, Sebastian. Even though I have been programming graphics for quite a while, this is the first time I have used ray marching. I always wanted to try my luck with SSR and your shader seemed like the perfect way to learn more about it.
I know about AAA screen-space reflections using fallbacks like SH probes and parallax-corrected cube maps, and I will probably try to implement them as well, but they are notoriously difficult to get right. First, though, I'd like to improve this shader as much as I can, and more robust ray marching is a good first candidate for optimization. I'll see if I can find more documentation on the techniques you described. Thanks for pointing them out.
Also would love to see some of the changes you have in mind
Awesome! I really should have thought of making a Sponza render!
I might be wrong, but it looks like you didn't match the settings to your camera. It looks so much better when you do! You just have to change the numbers on lines 18, 19, and 20. (Yeah, it's pretty annoying; I haven't figured out a way to do it automatically.)
In official Blender BGE, you can pass integers and floats with game properties. For example, you can add a game property named znear and set it via a Python script. Alternatively, in UPBGE you can use the BL_Shader API to send uniforms of any type to 2D filters, and you can also send textures (all of this is also doable with bgl in official BGE). I'll retry setting the other vars correctly (sky color…), but just copy/pasting the script and pressing P already gives pretty nice results : )
If you press play, the camera will move in the scene (pingpong animation)
I have set the parameters of the cam to the ones of the actual viewing cam
I think something is wrong with the reflections. If you look at the reflection of the visible test image above on the floor, it is weirdly moving upwards. This doesn't look right to me.
I am running BGE 2.79b.
The reflection on the right wall looks cut off; I have been tweaking some values without effect.
It doesn't seem to work well with objects that have smooth shading enabled (see the red icosphere in the scene).
Maybe I have set some values out of range; perhaps you can have a look and tell me whether I am just using it wrong.
I tried to have only the ground reflect, using ImageRender and the UPBGE 2D filter API, but the sky is still (slightly) visible in reflections in other parts of the scene.
Later, to get a real roughness map or a specular map, it may be possible with the Render Attachments patch from the main UPBGE developer in 0.2.4. It might also be possible using some hacks with ImageRender, I guess.
Oh, wow. That looks off. I'll look into it. It might be messed up since I changed the raymarching code and I wasn't all that sure about the math…
EDIT: Hey! I looked at your file and I know what the problem is! It's not really explained anywhere, but this filter expects the fov to be given in degrees. It was just a matter of changing 35.0 to 49.1 and it looked better.
But that is not the only problem. Fov in general is not handled too well in this filter: if your fov is a lot smaller than 90 degrees, the image gets a bit distorted. (The reason is that I'm using an approximation instead of the proper math in one of the functions, which causes everything to skew a little.) It is something I have to fix in the future.
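Incidentally, the 35.0 → 49.1 change above is just the camera's focal length converted to a field of view. A small sketch, assuming Blender's default 32 mm sensor width (the function name is my own):

```python
import math

# Blender cameras store a focal length in millimetres, but this filter
# wants a fov in degrees: fov = 2 * atan(sensor_width / (2 * lens)).
def lens_to_fov_degrees(lens_mm, sensor_mm=32.0):
    return math.degrees(2.0 * math.atan(sensor_mm / (2.0 * lens_mm)))

fov = lens_to_fov_degrees(35.0)  # default 35 mm lens -> about 49.1 degrees
```

With a script like this the value could be derived from the camera instead of being edited by hand in the filter source.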