Screen-Space Global Illumination Filter v1.5

I mean no disrespect, but this comment is pretty useless.

Besides the fact that averaging all the items in a list is something I already know how to do (your code does seem a bit odd, though), I said that I don’t know much about Blender, not about coding and machine logic in general.

This is what I understand from your code:


#define Samples[] (idk what exactly goes inside it though)
Samples = [list of samples]

#if the current object doesn't have the property "colorSampleList"
if "colorSampleList" not in own:

    #Then define a property of own called "colorSampleList" as equal to Samples[]? idk what [Samples] means though
    own['colorSampleList'] = [Samples]

#if the current object does have the property "colorSampleList"
else:

    #Then iterate through all elements in "colorSampleList"
    for i in range(len(own['colorSampleList'])):

        #and set each one of them to the average between itself and the element of Samples[] with the same index
        own['colorSampleList'][i] = (own['colorSampleList'][i] + Samples[i]) / 2

# why do you average like that? shouldn't it average all the samples in own['colorSampleList'] into a single variable? maybe I'm just not getting something


For instance: I have no idea where that code should go (I assume it should be in a text datablock referenced by a Python logic brick, though).

I also don’t know how to translate and store the result that my filter spits out into something that your code could refer to as own['colorSampleList'].

And, as a mainly non-Python programmer, I don’t really get this line: own['colorSampleList'] = [Samples]. I get that you are setting the colorSampleList property of the current object, but I don’t understand what you mean by [Samples].

If you could clarify that for me, I would gladly implement it.

Thanks for your time!

“I don’t really get this line: own['colorSampleList'] = [Samples]”

The first line defines Samples. That assignment just sets the property with the Samples (default values) list that you created on line 1.

On frame zero, there is no list to average against, so instead you take your samples and set the list directly from the samples you take on frame 0.

Once a list exists, you can average each new sample against the previous sample.

(I assumed you were taking multiple samples per frame.)

So for each sample, you average the sample against the previous sample from that point on screen.
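
A minimal sketch of that idea, as it might sit in a text datablock run by a Python controller in module mode (the controller gets passed in as cont); gather_samples() here is purely a stand-in for however your filter’s per-frame results reach Python:

def gather_samples():
    # Stand-in for the real sampling step: returns one value per sample point.
    return [0.0] * 16

def average_samples(cont):
    own = cont.owner
    samples = gather_samples()

    if "colorSampleList" not in own:
        # First frame: nothing to average against yet, so store the
        # current samples directly.
        own["colorSampleList"] = list(samples)
    else:
        # Later frames: blend each stored sample 50/50 with the new sample
        # taken at the same index, which smooths the values over time.
        stored = own["colorSampleList"]
        for i in range(len(stored)):
            stored[i] = (stored[i] + samples[i]) / 2
        own["colorSampleList"] = stored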

It seems to me like you misunderstood what this program does.

On every frame and for each fragment, many samples are taken and added together. These samples come from VPLs (Virtual Point Lights); this is a pretty decent approximation of indirect light.

I am looking for a way to store the indirect light of any given frame (which is the result of these calculations) in an image and keep it around for a few frames.

Then I want to average the indirect illumination of the current frame with the indirect illumination of previous frames, and add that to the rendered image.

This should help get rid of the flickering that occurs due to low sample counts when moving the camera around.
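
To pin the idea down, a rough sketch of that averaging, assuming each frame’s indirect light ends up in Python as a flat list of values (the 0.1 blend factor and the stand-in data are made up):

def blend_with_history(current, history, blend=0.1):
    # Running average: keep most of the history and mix in a little of the
    # new frame; this is what suppresses frame-to-frame flicker.
    if history is None:
        return list(current)
    return [(1.0 - blend) * h + blend * c for h, c in zip(history, current)]

history = None
for frame in ([0.0, 1.0, 0.5], [0.2, 0.8, 0.4]):  # stand-in per-pixel values
    history = blend_with_history(frame, history)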


Now for my issue: I don’t know how to store a frame in an image, and I don’t know how to do post-processing on stored images.

I hope this helps explain my program.

The video texture module can store images, I think*
So can PIL

You need to use an image buffer, I think*
https://docs.blender.org/api/blender_python_api_2_76_0/bge.texture.html
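
As a minimal sketch of the bge.texture route (assuming a Python controller in module mode on an object whose material is named "frameMat"; the material and property names are just placeholders), you can keep the current viewport around as a dynamic texture:

import bge

def capture_frame(cont):
    own = cont.owner

    if "frameTex" not in own:
        # Hook a dynamic texture up to this object's material and feed it
        # from the viewport, so the rendered frame is kept as an image.
        mat_id = bge.texture.materialID(own, "MAframeMat")
        tex = bge.texture.Texture(own, mat_id)
        src = bge.texture.ImageViewport()
        src.whole = True  # capture the entire viewport, not just a region
        tex.source = src
        own["frameTex"] = tex  # keep a reference so the texture isn't freed

    # Refresh once per frame so the texture holds the latest render.
    own["frameTex"].refresh(True)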

Hi SebastianMestre, very interesting approach to indirect lighting. I see huge potential. It looks fine for static scenes, but because the samples are static (relative to the viewer) there is a lot of aliasing and flickering. Also, the samples are gathered evenly across the view surface, so when they are projected onto a horizontal surface in perspective the intensity decreases. This causes an uneven distribution of reflected light intensities: closer surfaces are brighter than distant ones.
I think it could be solved with a system that distributes the samples based on camera movement and reduces the intensity based on the linear depth buffer.
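
For the depth part, a small sketch of turning a depth-buffer sample back into a linear distance, assuming a standard perspective projection (the near/far defaults are just placeholders):

def linearize_depth(d, near=0.1, far=100.0):
    # Convert a [0, 1] depth-buffer value back to a view-space distance;
    # that distance can then be used to scale the sample intensity.
    ndc = 2.0 * d - 1.0  # back to normalized device coordinates
    return (2.0 * near * far) / (far + near - ndc * (far - near))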

Oh thank you!

I already have a solution for the perspective-related issues; I’ll put it up tomorrow.

As for the flickering, I have an idea, but it will take me some time to implement.

Cheers!

Are you still working on this? It has a lot of potential!

Here is a little experiment: indirect lighting fully raytraced in screen space


It is hella slow and full of choppiness, but it can light up entire scenes and produces both indirect lighting (e.g. from an emissive plane, or a wall with a light being shone down upon it) and soft shadows from said indirect lighting.

This is just a silly and useless experiment, but a similar technique could be used to add shadows to the existing VPL-based filter to make it look better. It would still not look as good as raytracing, but it would be much, much faster.

Also, this filter, unlike the VPL one, is temporally coherent (i.e. no flickering apart from some noise shifting around the screen).
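
In case it helps picture it, a very rough sketch of the core loop (the depth_at and project helpers, the step size, step count and bias are all stand-ins, not the actual filter code):

def march_ray(origin, direction, depth_at, project,
              steps=32, step_size=0.1, bias=0.01):
    # Walk along the ray in view space; at each step, project back to the
    # screen and compare the ray's depth with what the depth buffer stored.
    pos = origin
    for _ in range(steps):
        pos = tuple(p + d * step_size for p, d in zip(pos, direction))
        x, y, ray_depth = project(pos)  # screen coords plus the ray's depth
        if ray_depth > depth_at(x, y) + bias:
            # The ray went behind recorded geometry: treat it as occluded,
            # which is what produces the indirect shadows described above.
            return True
    return False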

Let me know what you think.

My main project right now is screen-space reflections. However, I do still work on this from time to time.
And although school takes priority over everything else I do, especially now that we’re going through finals week, I do find a few hours each weekend to work on various projects. Vacations are coming soon, too, so I will have plenty of spare time.

That said, I found some time and updated this filter the other day. It is now less physically correct in a sense, but it can look much better. I also added more proper perspective correction.

The filter is not working properly.

In the World settings (the earth-globe icon on the right side), did you set…:

Since this uses samples that tend to flicker, can it be improved by using the sample data from the last frame?

Here is an example of an offscreen buffer.
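
Something along these lines with bge.texture.ImageRender (only a sketch; the "screenMat" material name and the property name are illustrative assumptions):

import bge

def render_offscreen(cont):
    own = cont.owner
    scene = bge.logic.getCurrentScene()

    if "offscreenTex" not in own:
        # Render the scene from the active camera into a dynamic texture on
        # this object's material, instead of relying on the visible framebuffer.
        mat_id = bge.texture.materialID(own, "MAscreenMat")
        tex = bge.texture.Texture(own, mat_id)
        tex.source = bge.texture.ImageRender(scene, scene.active_camera)
        own["offscreenTex"] = tex  # keep a reference so it isn't freed

    own["offscreenTex"].refresh(True)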

edit:
we should have render attachments in UPBGE soon also*

Your screen-space reflection shader should get a significant speed boost.
Do you want to work more closely with the UPBGE devs?

Ohh, that is certainly something I have to look into. Thanks for the link!

I don’t know about working with the UPBGE devs; I’m honestly not very reliable, so I think I would just slow down anyone who intended to rely on me :stuck_out_tongue:

Yes, I did.

Not sure if it is related to your issue, but it seems I messed up gamma correction big time.
I just noticed this today, so bear with me while I upload a fix.

EDIT: All done and dusted. The update is up. It should maybe run faster and look a bunch nicer.

I got one problem:

you can see the polygons,

and whenever my character moves, the GI just glitches.

Yes, I am aware of that. It is just an intrinsic property of how this filter works: it reconstructs normals from the depth buffer, so surfaces look “flat shaded”. In the future I guess I’ll be targeting UPBGE, so I have to look into whether UPBGE provides some means of getting a normals buffer that I can sample directly (this would also improve performance tremendously; it could run roughly 3x faster).
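
To illustrate where the flat look comes from, a rough sketch of depth-based normal reconstruction (view_pos is a hypothetical helper that rebuilds a view-space position from a pixel coordinate and its depth sample; this is not the filter’s actual code):

def reconstruct_normal(view_pos, x, y):
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    center = view_pos(x, y)
    right = view_pos(x + 1, y)
    up = view_pos(x, y + 1)

    # Two screen-space tangents; their cross product is the surface normal.
    # Neighbouring pixels on the same triangle lie in one plane, so the
    # result is constant per face, which is where the "flat shaded" look
    # comes from.
    n = cross(sub(right, center), sub(up, center))
    length = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5
    return (n[0] / length, n[1] / length, n[2] / length)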

As for your second issue, I know about that. It is just caused by the number of samples available to the filter. Its effects can be alleviated by increasing the sample count or through some temporal filtering; I just haven’t gotten around to implementing it. Again, if I could get access to mipmapped colour and normal buffers, this could be further alleviated.

https://github.com/UPBGE/blender/tree/ge_render_attachment/source/gameengine

This is due to be applied to master this release.
