Capturing Camera-view Pixels - "Multitexture Shading" Dims Textures

Greetings!

I’ve recently started working with the Game Engine as a means of prototyping and demoing GUIs on a small USB-attached color display. I’ve written a Python script that, on startup, procures a section of shared memory into which frames of pixels can be placed; the script then streams those frames to the USB-attached display.
Inside my .blend file I’ve written another script that repeatedly captures the view of an inactive camera and returns a list of pixel values. Those values are placed in the shared memory that the non-Blender program procured, and from there they are forwarded to the USB display.
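(For context, the handoff on the non-Blender side looks roughly like this; the backing-file path and frame size below are illustrative placeholders, not my actual values.)

import mmap

FRAME_BYTES = 128 * 128 * 4  # e.g. one 128x128 RGBA frame

# The streaming script creates a file-backed buffer once...
with open('/tmp/frame_buffer', 'wb') as f:
    f.write(b'\x00' * FRAME_BYTES)

# ...and both processes can then map it and read/write frames.
with open('/tmp/frame_buffer', 'r+b') as f:
    shm = mmap.mmap(f.fileno(), FRAME_BYTES)
    shm[:FRAME_BYTES] = bytes(FRAME_BYTES)  # stand-in frame of zeros
    shm.close()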

While I’ve managed to get everything running, now that I’m starting to include file/image textures in what is displayed in-game, the shadeless image textures appear “dim”/“dark” compared to how they should look. They look fine in the viewport, and when I run the BGE with GLSL shading, but not with Multitexture shading. Additionally, I’ve streamed the textures directly to the screen with my non-Blender Python script, and they too look as bright as they should. So somewhere between importing the textures into Blender and dumping the pixels to the shared memory, something is dimming the textures.

As noted, switching to GLSL shading appears to fix the dimness issue. However, my current Blender Python code cannot capture the inactive camera’s view under GLSL; details to follow shortly.

With that said, I’m looking for a solution that will let me capture an inactive camera’s view in real time, without the dimming, so the pixel values can be shuttled out to the external program. From where I stand, this could be done either by resolving the Multitexture shading issue or by migrating to GLSL shading and updating my Python code to work with that shading mode. But I don’t discount that there could be other ways as well.

I’m hoping this will be a simple problem for someone who’s been working with the BGE for longer than I have.
If you’ve read this far, thank you for taking the time to read through all this :slight_smile:

Python Code Snippets:

import bge
import numpy as np
from bge import texture

scene = bge.logic.getCurrentScene()
controller = bge.logic.getCurrentController()
own = controller.owner

# One-time setup: attach an ImageRender source to the object's material
if 'texReady' not in own:
    cam = scene.objects['NOT ACTIVE CAMERA']
    matID = bge.texture.materialID(own, 'MARTT')
    renderToTexture = bge.texture.Texture(own, matID)  # line 15 of InitAndTransfer.py; this is where the GLSL traceback below originates
    renderToTexture.source = bge.texture.ImageRender(scene, cam)
    renderToTexture.source.capsize = (128, 128)
    renderToTexture.source.flip = True
    own["RenderToTexture"] = renderToTexture
    own['texReady'] = True

Then, if the texture is established:


    own["RenderToTexture"].refresh(True)    
    imageData = np.asarray(texture.imageToArray(own["RenderToTexture"].source), dtype=np.uint8)

If I run this with Multitexture shading, it works without issue, apart from the dimming.

But if I switch to GLSL I get this:

Blender Game Engine Started
Python script error - object 'Plane', controller 'Python':
Traceback (most recent call last):
  File "InitAndTransfer.py", line 15, in <module>
RuntimeError: Texture material is not available
Blender Game Engine Finished

Sounds a bit like post-process gamma correction. That means the final image is adjusted after rendering, before being displayed.

But this is just off the top of my head. I don’t have details on that.

Btw. I suggest using [code][/code] tags to post code snippets.

Hey there Monster,

So are you thinking that there’s gamma-correction going on inside of Blender when it’s processing the textures?

And thanks for the tip about the “code” formatting. I looked for a button for it, but didn’t see it. I’ve since updated the original post’s formatting.

I agree with Monster, it’s most likely a gamma issue.

Speculation:
Multitexture is fixed-function, and the GPU will (probably) apply gamma correction automatically. As a result, the image the BGE sends to the GPU is not gamma corrected.
GLSL allows a lot more customization, so the gamma correction has to be done in software. So when the BGE sends an image to the GPU, it is already gamma corrected.

(Side note: “sends an image” is a poor description, but hopefully you get the idea.)

So the solution is to apply the gamma correction manually. Gamma correction is raising to the power of either 2.2 or its inverse, 1/2.2. In this case I think you need to raise it. However, because you have uint8s rather than floats, it’s a little more complex than just running:


np.power(imageData, 2.2)

However, now that you know what the issue is, you should be able to correct for it somewhere. If all else fails, it shouldn’t be too hard to do in the Blender source.
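As a rough sketch of what that correction might look like on uint8 data (the function name and the direction of the exponent are illustrative, not a drop-in fix):

import numpy as np

# Illustrative sketch only: normalize the uint8 data to [0, 1], apply the
# power, then scale back. Whether you need 2.2 or 1/2.2 depends on which
# direction the image is off.
def gamma_correct(image_data, gamma=2.2):
    normalized = image_data.astype(np.float32) / 255.0
    corrected = np.power(normalized, 1.0 / gamma)  # 1/2.2 brightens, 2.2 darkens
    return (corrected * 255.0).astype(np.uint8)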


Regarding why it isn’t working in GLSL: it should. I did this successfully in my BGMC26 game. (Terrible gameplay, but quite cool from the technical side: it does light-pressure calculations based on the normals of the objects in view.) Here’s the code I used:


class RenderCamera:
    def __init__(self, cam, obj, res):
        self.cam = cam
        self.obj = obj
        self.tex = bge.texture.Texture(obj, 0)
        self.image = bge.texture.ImageRender(cam.scene, cam)
        if hasattr(bge.app, 'upbge_version'):
            # UPBGE expects float colors; 0.5 ** 2.2 is roughly the same
            # mid-grey as the 127 used below
            self.image.horizon = [0.5 ** 2.2] * 4
            self.image.zenith = [0.5 ** 2.2] * 4
        else:
            self.image.background = [127] * 4
        self.image.capsize = res
        self.tex.source = self.image

        self.data = bytearray(res[0] * res[1] * 4)

    def update(self):
        self.tex.refresh(True)

    def refresh_buffer(self):
        '''Refreshes the render into self.data and returns it (flat RGBA bytes)'''
        # Similar to imageToArray, but refreshes directly into an existing
        # buffer, avoiding extra memory allocations.
        self.image.refresh(self.data, "RGBA")
        return self.data

Run it with something like:


def get_data(cont):
    if 'CAMERA' not in cont.owner:
        cont.owner['CAMERA'] = RenderCamera(<camera_object>, cont.owner, [128, 128])
    else:
        # cont.owner['CAMERA'].update()  # Draws the render in-game; only needed for debugging what the camera sees
        cont.owner['CAMERA'].refresh_buffer()  # Renders into the buffer inside the RenderCamera object

        data = cont.owner['CAMERA'].data  # Always contains the most recent result of the refresh_buffer call

The data is already uint8-formatted, and in my game I just passed it straight into a C function through Python’s ctypes interface. Because arrays in C are just pointers to the first element, there is very little data-transfer overhead in doing so. I suggest looking at the source for the game linked above if you get stuck.
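Roughly like this (the library name and function are placeholders for whatever your C side actually exposes):

import ctypes

# Placeholder names: 'libdisplay.so' and 'push_frame' stand in for your
# actual C library and entry point.
lib = ctypes.CDLL('./libdisplay.so')

data = bytearray(128 * 128 * 4)  # the buffer filled by refresh_buffer
buf = (ctypes.c_uint8 * len(data)).from_buffer(data)  # zero-copy view of the bytearray
lib.push_frame(buf, len(data))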


Note that any method of extracting the image into CPU-side data is rather slow due to the GPU-to-CPU bottleneck, so if you can, try outputting to the screen directly.

Hey there sdfgeoff,

Thanks for the detailed information! I hadn’t realized it would be relatively simple to compensate for the brightness being off. And thank you for the GLSL code snippets. If I were developing such a game, I don’t think I would have immediately thought to use the graphics engine to calculate the received light; I probably would have just tried to hand-code something to emulate the same effect and over-complicated it.

About the GPU-CPU bottleneck, in my particular situation I need to export the pixel data to another program. Are you saying that there’s a more efficient way to do this by first rendering to the screen, and then capturing the pixels that way?

I just tested my updated code with the gamma correction factor in play, and it definitely brightened up the images. Thanks Monster & sdfgeoff!

It does look like the darker pixels have also been boosted away from black and towards grey, but now that I know what’s going on I can work to accommodate that.

Also, for the record, I had to raise the pixel values to the power of 1/2.2 to brighten them. Gotta love sub-1.0 values, which get smaller the higher the power they’re raised to.
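Concretely, on normalized values:

0.5 ** (1 / 2.2)  # ~0.73: brighter
0.5 ** 2.2        # ~0.22: darker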

Aaaaaand after realizing I needed to create a texture on the material for the script to access (which is what the “Texture material is not available” error was complaining about), I’m successfully using GLSL for rendering the views that I export. It appears to have sidestepped the gamma issue, too!

Thanks for the help sdfgeoff & Monster :smiley:

Now it’s time for me to start digging into how to construct animated BGE menus.