Your own virtual monitor INSIDE the game?

Okay, here’s the thing:
Would there be a way to display what is currently shown on your monitor on a texture inside the game (one that moves and animates depending on what is happening on the monitor)?

I’m on Ubuntu 14.04. I was thinking of something involving the X configuration, or maybe an add-on inside Blender that takes what is currently displayed in a window and copies it into Blender?
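One route on X11 would be to capture the screen with ffmpeg’s x11grab device, stream it locally, and then open that stream inside the game with `bge.texture.VideoFFmpeg`. This is only a sketch: the helper below just builds the capture command line, and the region, port, and function name are all my own assumptions, not something from a working setup.

```python
# Hypothetical helper: build an ffmpeg command that captures a region of the
# X11 screen (x11grab) and streams it over local UDP. The game could then
# open "udp://127.0.0.1:1234" with bge.texture.VideoFFmpeg as a texture
# source. All defaults here (display, size, port) are assumptions.
def x11grab_cmd(display=":0.0", size=(1024, 768), offset=(0, 0), fps=30):
    w, h = size
    x, y = offset
    return [
        "ffmpeg",
        "-f", "x11grab",                # X11 screen-capture input device
        "-video_size", f"{w}x{h}",      # region of the screen to capture
        "-framerate", str(fps),
        "-i", f"{display}+{x},{y}",     # display name plus top-left offset
        "-f", "mpegts",                 # wrap the capture as an MPEG-TS stream
        "udp://127.0.0.1:1234",         # arbitrary local endpoint
    ]
```

You would run the resulting command outside Blender (e.g. with `subprocess.Popen`) and point the in-game video texture at the UDP URL; latency and performance would need testing.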

Thanks for any suggestion :slight_smile:

Edit: Something like that:

Mahalin did something along these lines already; it might be worth contacting him or poking at his GitHub.

Getting the window (pixmap) as a texture is easy - the hard part is handling input. That limitation is what prevented Compiz from letting the user properly interact with a window while an animation was occurring. There were some experimental patches for X11 to allow input redirection, but they never went mainstream.
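To make the input problem concrete, here is a minimal sketch of the redirection half: mapping a hit point on the textured quad (expressed as UV coordinates in [0,1]²) back to a pixel inside the captured window, where a synthetic click could then be sent with something like xdotool or the XTest extension. The function name and the V-flip convention are my assumptions.

```python
# Hypothetical helper: convert a UV hit on the in-game quad back into
# window-local pixel coordinates. X11 puts the origin at the top-left,
# while UVs typically have V growing upward, hence the flip.
def uv_to_window_pixel(u, v, win_w, win_h):
    # clamp UVs so hits just outside the quad still map into the window
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    x = int(u * (win_w - 1))
    y = int((1.0 - v) * (win_h - 1))
    return x, y
```

With the target pixel in hand, the remaining (and genuinely hard) part is injecting the event into the real window without the window manager fighting you, which is exactly what those experimental X11 patches tried to solve.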

I think Wayland has been able to address these issues:

As for my project (Flow UI), I have it working with X11, but I’m in the process of implementing a Wayland backend. I’m still learning the API, so I can’t provide much information yet. I think you need to use EGL (Qt is one way to get it) in lieu of GLX. I’ve been looking at Weston, the reference compositor, so that might be a good area to investigate.

This provides some insight (the title is not what you think):