Using depth in a custom shader

Hello everyone,

I’m using the bge.texture.ImageRender module to render a scene to an FBO, and I would like to add some depth of field. I’ve been searching and have found only scarce information and no good examples… What I want to achieve can easily be done in a 2D filter with bgl_DepthTexture if you render the whole viewport… there are lots of good examples of that, like Martins Upitis’s.
What I’m looking for is the same thing, but for just one object with a GLSL shader.
I found this bit of code, but I don’t actually know how to pass the information to the shader to blur the image depending on the values obtained…

render = VideoTexture.ImageRender(scene, camera, fbo)  # VideoTexture is the legacy name of bge.texture
render.depth = True                                    # expose the depth buffer as an array of floats
VideoTexture.imageToArray(render, 'F')                 # copy the float depth values into a Python buffer

Does anyone know a better way to do this?
Is it possible to pass the bgl_DepthTexture to the object shader somehow? I’ve been reading about glBindTexture(GL_TEXTURE_2D, bindID), but that only binds textures to the shader…

Should I render the depth to a texture (and lose lots of precision) and bind it like that, or can I use the full float values in the shader?
Sorry for any confusion (newbie programmer) and/or English errors (not a native speaker)…
Thanks in advance
Nuno Estanqueiro

It seems my first post is a bit confusing… maybe I can upload an example to see if I can get some help on the subject…

In the meantime I figured out that I can initialize the FBO as a texture (with an ID to bind to the shader), so half of my problem is solved, but I’m still missing the depth part…
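
In case it helps anyone searching later, this is roughly the setup I mean (a minimal sketch, assuming the 2.77+ offscreen API; the object name “Plane”, the sampler name “renderTex” and the VERTEX_SRC/FRAGMENT_SRC sources are placeholders):

import bgl
from bge import logic, render, texture

scene = logic.getCurrentScene()
cam = scene.objects["Camera"]

# create the FBO with a texture target so it gets a GL bind ID
fbo = render.offScreenCreate(512, 512, 0, render.RAS_OFS_RENDER_TEXTURE)
img = texture.ImageRender(scene, cam, fbo)

mat = scene.objects["Plane"].meshes[0].materials[0]
shader = mat.getShader()
if shader is not None:
    if not shader.isValid():
        shader.setSource(VERTEX_SRC, FRAGMENT_SRC, True)  # placeholder GLSL sources
    shader.setSampler("renderTex", 0)                     # sampler uniform on texture unit 0

def update():
    img.refresh()  # render the scene into the FBO
    bgl.glActiveTexture(bgl.GL_TEXTURE0)
    bgl.glBindTexture(bgl.GL_TEXTURE_2D, fbo.color)       # bind the FBO's color texture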

When I do the ImageRender() I set zbuff to True (instead of depth=True, so it is usable as a texture), but I don’t get a greyscale image… I still get the full RGBA… should I be doing something else to the FBO? It’s probably very obvious, but I don’t see what I’m missing…

The example has two scenes: one called “alpha”, which I render to the FBO and apply to the shader (although as RGBA), and the main scene, called “Scene”, with just a plane that has one material with two textures.

The first strange thing is that if you turn the shader off, you don’t see the texture on the material updating… and second, in the “alpha_render” script, although zbuff is on at line 12, I still get RGBA on the texture, and therefore cannot use it to do DOF…

thanks in advance
Nuno Estanqueiro

Attachments

customTex2Dfilter_render_scene.blend (152 KB)

Well, it actually seems like a bug in Blender, or I’m really missing something…

The example breaks as soon as you add the uniform sampler2D bgl_RenderedTexture to combine this with the rendered image: you get an all-black image instead of the rendered one… I call it a bug because if you turn on Framerate and Profile in the game engine, it does show the rendered image, but all the letters are messed up… should I upload the example with the bug?

If anyone wants to take a look at this, just add

uniform sampler2D bgl_RenderedTexture;

at the beginning of the shader and replace the last line with

gl_FragColor = texture2D(bgl_RenderedTexture, gl_TexCoord[0].st);

Then compare the results with the stats display on and off…
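
A bare pass-through version of the fragment shader (with the rest of my code stripped out) should be enough to show the issue:

uniform sampler2D bgl_RenderedTexture;

void main()
{
    gl_FragColor = texture2D(bgl_RenderedTexture, gl_TexCoord[0].st);
}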

For 2D filters (shaders) there is a built-in variable for the depth texture.

uniform sampler2D bgl_DepthTexture;

There are also:

uniform float bgl_RenderedTextureWidth;
uniform float bgl_RenderedTextureHeight;
uniform sampler2D bgl_LuminanceTexture; // e.g. for bloom
uniform vec2 bgl_TextureCoordinateOffset[9];

Also, if you want to visualize the depth buffer, you need to linearize it first.
I quickly wrote this code; I haven’t tested it.


uniform sampler2D bgl_DepthTexture;

float LinearizeDepth(vec2 uv)
{
    float n = 0.1; // camera z near
    float f = 100.0; // camera z far
    float z = texture2D(bgl_DepthTexture, uv).x;
    return (2.0 * n) / (f + n - z * (f - n));    
}


void main()
{
    float d;
    if (gl_TexCoord[0].x < 0.5) // left part
        d = LinearizeDepth(gl_TexCoord[0].st);
    else // right part
        d = texture2D(bgl_DepthTexture, gl_TexCoord[0].st).x;

    gl_FragColor = vec4(d, d, d, 1.0);
}
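
To test it, put the code in a text datablock and point a Filter 2D actuator in Custom Filter mode at it. If you’d rather trigger it from Python, a minimal sketch (the actuator name “Filter2D” is an assumption) would be:

from bge import logic

cont = logic.getCurrentController()
cont.activate(cont.actuators["Filter2D"])  # actuator set to Custom Filter mode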

Thanks HG1, I already know those uniforms, and everything is fine as long as you don’t use a rendered texture of a scene as one of the inputs…
I have some working examples with the filters, including the linearized depth… but I need to add a scene render as an input to the shader, and that is where it breaks, because you only get black instead of the rendered image… Have you downloaded the example? Can you please do so, to check that I’m not misinterpreting something or making some rookie mistake?

The effect I’m trying to achieve would be similar to two render passes: one with the objects of the background scene and one with the objects of the other scene, with some additional 2D filter passes applied to both layers… if you use an overlay scene to do this, you have to repeat all the 2D filters on the overlay scene, and that slows things down a lot (I’m using DOF, lens distortion and some other video FX).

If you download the example and change the shader code to include the bgl_RenderedTexture uniform, please check with Framerate and Profile on and off to see if you experience the same issues I do…

Thanks for replying

A while back I did this, forwarding a depth render-to-texture into a node shader so I could get funky mist colors:

Sample blend is included in that post, tested working in 2.78c
There’s some extra stuff in the code, but you seem to know your way around Python. Let me know if you have issues.

Multiple passes are a lot of fun. At work I have a setup in UPBGE with seven or eight of them. It means we can control post-processing effects more easily, and pass heaps of data into a super-shader…

Thanks… I’ll examine your file with great attention… It’s interesting the way you use nodes… I keep forgetting about them and use GLSL shaders instead, yet it’s so much easier with nodes… I’ll try to have a go with your file as a basis, but I notice a one-frame delay between your render and overlay, only evident if you move too fast… I don’t think it will be a problem, because I think I’m going to have to render the scene as well… so both will be in sync…

But it seems like a better way to go… much less prone to errors like the ones I was getting in the uploaded file…

I would love to move to UPBGE, but not all the things I need have been ported… It still lacks the Decklink module, so it’s a no-go… But I’ll keep an eye on it!
thanks for replying!

Yup, there is always a one-frame delay in a render-to-texture. One solution is to have the actual view also go through a render-to-texture; this way both are delayed by one frame. Doing that also means you can apply normal fragment shaders to the view.
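
As a minimal sketch of that idea (the camera names are placeholders; both refreshes happen in the same logic tick, so the passes stay in sync):

from bge import logic, texture

scene = logic.getCurrentScene()
main_pass = texture.ImageRender(scene, scene.objects["MainCamera"])
overlay_pass = texture.ImageRender(scene, scene.objects["OverlayCamera"])

def update():
    # both lag the live view by one frame, but by the same frame
    main_pass.refresh()
    overlay_pass.refresh()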

I would love to move to UPBGE, but not all the things I need have been ported… It still lacks the Decklink module, so it’s a no-go… But I’ll keep an eye on it!

UPBGE started as BGE, but they have added/removed things from it. I suspect they thought no one was using Decklink and removed it. Perhaps ask them to add it back (either on the UPBGE issue tracker or in #upbgecoders on Freenode).

Which features are you talking about that haven’t been ported? I’m confused here. UPBGE is just BGE, but with optimizations and cleaner code. I’m not an UPBGE dev, but I build and test it too, and I see Decklink in the CMake files.

The UPBGE binaries I downloaded did not contain the Decklink module, hence my files would not run… It’s been a while since I last compiled Blender; ever since the inclusion of the Decklink code in master, I no longer needed to build from source… Maybe it’s time to download the source code and try to compile UPBGE with the Decklink module… Will it compile with MinGW? I know Blender compilation was broken on MinGW for some time… (I don’t actually know if it’s working again…)

By the way… doing the passes by hiding all the objects and rendering to texture avoided having an extra scene, and I actually managed to get it working, although I had to fiddle with the delay when passing the rendered texture to the 2D filter… Thanks all for your help!
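
For reference, the per-pass hiding boils down to something like this (a sketch; the object lists and the ImageRender instance are whatever you maintain yourself):

def render_pass(img_render, show, hide):
    for ob in hide:
        ob.visible = False  # drop these objects from this pass
    for ob in show:
        ob.visible = True
    img_render.refresh()    # render only the visible objects into the FBO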

You need to use VS2013 to compile. I’ve made a new build and selected Decklink in the CMake options; link -> https://drive.google.com/open?id=0B6yNvVGWiWysYmlYMS1iSzdoQTg You could also test unlimited planar reflections. :)

Thanks Akira… I’m downloading right now… I also downloaded the latest release to see if the Decklink module was included, and I confirm that it now is… in fact…
Before testing the unlimited planar reflections I’ll have to read some documentation first, but as soon as I do, I’ll post my test results.

Once again, thank you for the information!

You are welcome. :)
For planar reflections, you set them up the same way as a cubemap; you just choose Plane instead of Cube.
I see there is a bug fix (UPBGE: Fix Delink in multithreade call to UpdateWorldDataThread): https://github.com/UPBGE/blender/commit/186d3657deda73b7fb271bcbec7cb0892ece2aea
Some of the interesting commits in 0.1.5 are:
UPBGE: Implement infinite planar maps and refactor cube maps (https://github.com/UPBGE/blender/commit/acdbcc2c091187024b533581f94ca1e1541a539b)
UPBGE: Implement custom filter off screen (https://github.com/UPBGE/blender/commit/265daf9b6e6ba8a0c76b6f56d2d666fa67b22106)

I’ve been trying these latest UPBGE versions, and planar reflections look awesome… It makes me want it even more… but I still struggle with some features I need… maybe someone here knows a workaround?
In official Blender, you can start the game engine with the “-a” option to get full 32-bit color depth while rendering to texture, which allows a render-to-texture to have 0 alpha where there are no objects… is there a way in UPBGE to set the world color to (0, 0, 0, 0)?
The “-a” option in UPBGE just crashes it before it even starts…
This whole thread started because I was doing an “alpha over” pass… so I’ll have to find some way to achieve the same result…
Thanks

@nestanqueiro: Normally you don’t need this option for ImageRender or ImageMirror, because they use an offscreen buffer which supports alpha in its textures. Anyway, we will debug the crash with the -a option.

Bug fixed in a68564c1e0871e2f7ec51ecaf6c999c3c4e2bdc5.

@nestanqueiro: Hi, you can report upbge issues here: https://github.com/UPBGE/blender/issues

You can try the UPBGE ImageRender API with the hdr argument for 16- or 32-bit colors: https://pythonapi.upbge.org/bge.texture.html?highlight=imagerender#bge.texture.ImageRender

Wow… that was fast… Thanks!!!

I’ve been trying to make it work with an offscreen framebuffer object, but I could not get it to work… when I bind the texture, some other texture just goes black… usually the bgl_RenderedTexture… or the video I’m inputting from the Decklink card…

I’ll keep trying with UPBGE, but some of my shaders are also failing in this version… (surely my fault, but I think UPBGE is a bit stricter about interpreting GLSL than official Blender)
I’ll have to clean them up and get them working before trying the (new to me) bind code used in UPBGE…

I’ll have to set up a build environment to test this, but I think as soon as I have this feature working, or manage to get the framebuffer object working, I’ll definitely move to UPBGE!

New build to test -> https://drive.google.com/open?id=0B6yNvVGWiWysSjlUVUd0dDY0d1U

@youle: I’ll try it as soon as possible… If I find any issues, I’ll post them on GitHub… after making sure it’s not something really dumb I’m doing… I’m going to take some time to read some documentation to get more comfortable with the UPBGE API.
Thanks