Access the rendered image from a material shader

I would like to be able to access the bgl_RenderedTexture and the bgl_DepthTexture of the previous frame from a GLSL material pixel shader. How can this be done?

Thanks in advance

I once wanted the same thing and ended up failing. If you want the image of the previous frame, however, there is a way to do it with render textures: render the scene from the camera's perspective into a texture, then sample that texture in your shader.

But beware that you can’t access the current frame’s texture this way; it will always lag one frame behind. On fast movements that can be a serious problem, so depending on your use case this may not be the optimal choice.
Also, note that this means rendering the scene twice, which isn’t very efficient (though it’s essentially what happens in 2D filters anyway, except that in filters the actual on-screen render is just a shader-processed quad).

If you want to know how to make render textures, try looking here:
https://pythonapi.upbge.org/bge.texture.html#bge.texture.ImageRender
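
To give you an idea, here is a minimal sketch of that setup. The camera name (‘RenderCam’) and the image name (‘IMscreen.png’) are placeholders for whatever is in your scene; attach the script to the object whose material should receive the render and run it every frame:

```python
import bge
from bge import logic, texture

def render_to_texture(cont):
    obj = cont.owner
    scene = logic.getCurrentScene()
    cam = scene.objects['RenderCam']  # placeholder: the camera to render from

    if 'renderTex' not in obj:
        # Find the texture channel to replace: 'IM' + the image name
        # assigned to the object's material.
        mat_id = texture.materialID(obj, 'IMscreen.png')  # placeholder image name
        render_tex = texture.Texture(obj, mat_id)
        render_tex.source = texture.ImageRender(scene, cam)
        # Store the Texture object in a property so it isn't garbage-collected.
        obj['renderTex'] = render_tex

    # Re-render the scene from the camera and upload the result.
    obj['renderTex'].refresh(True)
```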

When scrolling through documentation I also found this:
https://pythonapi.upbge.org/bge.texture.html#bge.texture.ImageViewport

I have no idea what it is and have never tried it, but it could be worth checking out, as it looks like it might be an easier way to get the rendered texture of the scene into a shader.
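
From a quick look at the docs (untested on my end), using it would be something like this; since it grabs what is currently in the viewport, there is no second scene render involved:

```python
from bge import texture

# ImageViewport captures the viewport as it was last drawn,
# so you still get a one-frame lag, but no extra render pass.
source = texture.ImageViewport()
source.whole = True  # capture the entire viewport

# Use it as the source of a Texture object, exactly like ImageRender:
# render_tex.source = source
```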

Note that there are also depth/zbuff options available for rendered textures, so you can access both the rendered texture and its depth texture in a shader.
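
From the docs, those are just flags on the image source, along these lines (camera name is a placeholder again):

```python
from bge import logic, texture

scene = logic.getCurrentScene()
cam = scene.objects['RenderCam']  # placeholder camera name

depth_source = texture.ImageRender(scene, cam)
# zbuff delivers the depth component as a grayscale image,
# which can be uploaded and sampled like any other texture.
depth_source.zbuff = True
# Alternatively, depth = True delivers raw float depth values; as far
# as I know that is meant for reading back in Python, not for sampling.
# depth_source.depth = True
```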

I hope this helps.
Greetings,
Adrians - the penguin

Oh, wow. That looks like just what I need. Thank you!

I know very little Python (I’m more of a C++ person), so if it’s not asking too much, a code example from someone would be much appreciated.

Again, thank you. And thanks in advance to whoever helps me this time!

It’s Python: the video texture module.

https://pythonapi.upbge.org/bge.texture.html#video-texture-bge-texture
I use render-to-texture for my minimap.

You can probably render to texture, save the image in a property, and grab it the next frame.

Well, you just have to render the image, store it somewhere in the object (for example, in a property) and use a uniform sampler to pull it up? I don’t remember, to be honest. Let me check some of my old .blends first to see how I did it, then I’ll share it with you :wink:
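
In the meantime, roughly from memory, the shader side should look something like this. The sampler name ‘previousFrame’ is arbitrary, and the sampler index must match the material texture channel that the Texture/ImageRender object replaced (channel 0 here):

```python
import bge

VERTEX = """
void main() {
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position = ftransform();
}
"""

FRAGMENT = """
uniform sampler2D previousFrame;  // arbitrary sampler name

void main() {
    gl_FragColor = texture2D(previousFrame, gl_TexCoord[0].st);
}
"""

def apply_shader(cont):
    obj = cont.owner
    for mesh in obj.meshes:
        for mat in mesh.materials:
            shader = mat.getShader()
            if shader is not None and not shader.isValid():
                shader.setSource(VERTEX, FRAGMENT, True)
                # Bind material texture channel 0 (the one the Texture
                # object replaces) to the sampler in the fragment shader.
                shader.setSampler('previousFrame', 0)
```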

I noticed those links take me to the UPBGE documentation. Do those functions work in regular old BGE?

Most of them do, but with reduced functionality (e.g. no HDR), and vanilla BGE has bugs that made them terrible to use and that UPBGE has since fixed (e.g. mipmap generation not working, and textures that in some cases couldn’t be accessed in shaders). I highly suggest using UPBGE for this (and for nearly any game in Blender) due to its up-to-date bugfixes and some neat features.

“Vanilla” BGE page: https://docs.blender.org/api/blender_python_api_current/bge.texture.html