Hi. I’m running Blender on Linux under an X server. I’m trying to grab a screenshot of a running application and use it as a texture (through the VideoTexture module) in the BGE. I have a crude method that uses ImageMagick to capture a screenshot to a file and then reads the file back, but is there a way to do this directly, to cut out the middle man, so to speak? Could I get the pixel data straight from the X server somehow? Is there a lower-level method, such as reading it from a buffer, from GPU memory, or via OpenGL?
I’m grabbing the window image every frame, so writing to a file each time is inefficient. I know there must be a better way to do this!
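For what it’s worth, here is a rough sketch of the file-free X route using the third-party python-xlib package. The window id, the 24-bit BGRX pixel layout, and the helper names are my assumptions, not a tested recipe:

```python
# Hypothetical sketch: pull a window's pixels straight from the X server
# with python-xlib (the XGetImage path), skipping the temp file entirely.
def capture_window(win_id):
    # python-xlib is a third-party package; import kept local to the sketch
    from Xlib import display, X
    d = display.Display()                       # connects to $DISPLAY
    win = d.create_resource_object('window', win_id)
    geom = win.get_geometry()
    # Equivalent of XGetImage: fetch the raw pixels over the X connection
    img = win.get_image(0, 0, geom.width, geom.height,
                        X.ZPixmap, 0xffffffff)
    return img.data, geom.width, geom.height    # BGRX bytes on 24-bit visuals

def bgrx_to_rgb(raw, width, height):
    # Reorder 4-byte BGRX pixels into packed RGB for use as texture data.
    out = bytearray(width * height * 3)
    for i in range(width * height):
        b, g, r = raw[4 * i], raw[4 * i + 1], raw[4 * i + 2]
        out[3 * i:3 * i + 3] = bytes((r, g, b))
    return bytes(out)
```

The resulting RGB buffer could then, I believe, be handed to VideoTexture without touching disk (e.g. via an ImageBuff-style object), though I haven’t verified that end of it.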
I found this:
To extract a rendered image from a rendering server:
Extract or retrieve the image to be displayed from the rendering server. Typically, an application would do an XGetImage for X applications or a glReadPixels for OpenGL applications. The application decides the frequency and type of actions that will result in the extraction of the image from the rendering server. One criterion might be each time the buffer is swapped using the XdbeSwapBuffers or glXSwapBuffers subroutine.
I could use something like PyOpenGL.
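The glReadPixels side of that quote might look something like this in PyOpenGL. This is only a sketch: it assumes a current GL context already exists (inside the BGE there is one), and the function names below are mine:

```python
# Hypothetical sketch: read back the framebuffer with PyOpenGL's
# glReadPixels, as the quoted text suggests for OpenGL applications.
def read_front_buffer(width, height):
    # Requires a current OpenGL context; will fail without one.
    from OpenGL.GL import (glReadBuffer, glReadPixels,
                           GL_FRONT, GL_RGB, GL_UNSIGNED_BYTE)
    glReadBuffer(GL_FRONT)                      # read the displayed buffer
    # Returns raw RGB bytes, bottom row first (GL's origin is bottom-left)
    return glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE)

def flip_rows(raw, width, height):
    # glReadPixels delivers rows bottom-up; flip for a top-down texture.
    stride = width * 3                          # 3 bytes per RGB pixel
    rows = [raw[i * stride:(i + 1) * stride] for i in range(height)]
    return b''.join(reversed(rows))
```

The catch is that glReadPixels only sees my own GL context’s framebuffer, not another application’s window, so for grabbing a foreign window the XGetImage route above seems like the right one.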