Question about a camera thing

I was wondering if an image seen through a camera can be projected onto a plane mesh?

Sorry about the subject line; I didn’t really know what to put. :-?

Do you mean mapping the rendered output (what you see through the camera) onto a plane somewhere in the scene? If yes… let’s be a bit crazy :o and assume we have very fast machines, especially HDDs:

Via Python, you can write each rendered frame to disk. Again via Python, you can change the texture of your plane to that newly rendered and saved image. The drawback is that the image on the plane will lag at least one frame behind.
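For what it’s worth, here is a rough sketch of that disk round-trip. It uses the later 2.4x-era Blender Python API (the rendering context plus the Image/Texture modules); the 2.2x API this thread is about exposed less, so treat the exact calls and the output path as assumptions rather than a working recipe.

```python
# Rough sketch of the "render to disk, reload as texture" idea, written
# against the 2.4x-era Blender Python API. The calls and the output path
# are assumptions, not something the 2.2x game engine actually offered.
from Blender import Scene, Image, Texture

scn = Scene.GetCurrent()
ctx = scn.getRenderingContext()

def capture_frame_to_plane(frame, tex_name='PlaneTex'):
    # render the requested frame and write it out
    ctx.currentFrame(frame)
    ctx.render()
    path = '/tmp/capture_%04d.png' % frame      # hypothetical output location
    ctx.saveRenderedImage(path)

    # load the file back in and point the plane's texture at it;
    # the plane therefore always shows the previous frame (one-frame lag)
    img = Image.Load(path)
    tex = Texture.Get(tex_name)                 # texture already assigned to the plane
    tex.setImage(img)
```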

However, there are better ways to do this in OpenGL (without going through the hard drive). You would be best off either:

  a. changing the Blender source (though a new feature like this will quickly become overused and we will get questions as to why it is so slow)
  b. dealing without it
  c. not using Blender

Hang on, if I interpret you right, Falconis, you’re talking about rendering ONE frame and pasting that image, as seen from the camera, onto a plane, right? All you do is render the image (or just take a screenshot if you want), then select the plane, hit F, hit U, then select ‘From Window’ and load the rendered image in to use as a texture.

Hope that helps

Keith. 8)

I actually was referring to doing it in realtime.

Oh, you mean like turning a plane into a TV that would display the image from a camera? I’m afraid that’s not possible in Blender yet; it’s just another cool feature that would be nice to implement in the future. :frowning:

Keith. 8)

I know OpenGL can render to a texture. And basically you can have that texture on any kind of mesh (plane, sphere, etc.)…

Don’t ask me how, I just saw a tutorial on it somewhere.

I also remember, a long time ago while looking at something in Blender, seeing a feature where you can play a movie like an AVI file on a plane. Maybe it somehow relates to this, where you could just take a snapshot from a camera and then put it on the plane.

It is not possible to display an AVI animation on a plane. OpenGL might support rendering to a texture, but this feature is not implemented in Blender yet.

Keith. 8)

Blender does have the feature to use an AVI as an animated texture, but only in the rendering engine, not in realtime.
NeHe has a tutorial on using AVI files as animated textures in realtime: http://nehe.gamedev.net

AFAIK render-to-texture requires an OpenGL extension, but you can render to the backbuffer and then copy the pixels into a texture (glCopyTexSubImage2D()).
This is how it’s done in the “RenderToTexture” tutorial on http://www.gametutorials.com

Unfortunately there’s no way to access advanced OpenGL commands from inside Blender’s game engine, so either of these effects would require modifying Blender’s source code :frowning:

My post above is in error. I’ve since found out that in Blender 2.23 you can access the OpenGL functions necessary to render to a texture, using the BGL module. (Unfortunately, the BGL module has been crippled in the game engine for Blender 2.24 and up.)
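For reference, here is a minimal sketch of the backbuffer-copy trick from the earlier post, written with BGL (which mirrors the raw OpenGL 1.1 calls). It assumes the game engine lets you call BGL (as in 2.23), that the texture size is a power of two, and that the scene has already been drawn when the update function runs; it is not a drop-in script.

```python
# Minimal sketch of copying the backbuffer into a texture via BGL.
from Blender import BGL

TEX_SIZE = 256   # plain OpenGL 1.1 textures must be a power of two

# create an empty RGB texture once
_id_buf = BGL.Buffer(BGL.GL_INT, 1)
BGL.glGenTextures(1, _id_buf)
capture_tex_id = _id_buf[0]

BGL.glBindTexture(BGL.GL_TEXTURE_2D, capture_tex_id)
BGL.glTexImage2D(BGL.GL_TEXTURE_2D, 0, BGL.GL_RGB, TEX_SIZE, TEX_SIZE, 0,
                 BGL.GL_RGB, BGL.GL_UNSIGNED_BYTE,
                 BGL.Buffer(BGL.GL_BYTE, [TEX_SIZE, TEX_SIZE, 3]))
BGL.glTexParameteri(BGL.GL_TEXTURE_2D, BGL.GL_TEXTURE_MIN_FILTER, BGL.GL_LINEAR)
BGL.glTexParameteri(BGL.GL_TEXTURE_2D, BGL.GL_TEXTURE_MAG_FILTER, BGL.GL_LINEAR)

def update_capture_texture():
    # grab the lower-left TEX_SIZE x TEX_SIZE pixels of the backbuffer
    # into the texture, without going through the hard drive
    BGL.glBindTexture(BGL.GL_TEXTURE_2D, capture_tex_id)
    BGL.glCopyTexSubImage2D(BGL.GL_TEXTURE_2D, 0, 0, 0, 0, 0, TEX_SIZE, TEX_SIZE)
```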

However, since Blender doesn’t render the scene until after the scripts have run, I don’t think it’s possible to capture Blender’s actual rendering output. You’d have to render whatever you wanted in the texture yourself with Python, which would probably be very slow.

I don’t know exactly what is needed, but you might try just making a duplicate scene in the background and covering it up everywhere except one place, or look at the split-screen scripts from this forum.

OK, after a little experimenting, I’ve found that Blender runs the game logic for each active scene after the previous scene has been rendered.

So it would be possible to put a script in a background scene that captures the rendered image from the main scene and saves it as a texture, which an object could then use to display a constantly updated image of the scene. (You’d also need to resize the image you capture from the window so it fits in the texture.)
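Purely hypothetically, such a background-scene controller could look something like the sketch below. It assumes a texture created as in the BGL sketch above, with its ID stashed on the GameLogic module (capture_tex_id is an invented name), and that Rasterizer.getWindowWidth()/getWindowHeight() are available; since glCopyTexSubImage2D can’t scale, it only copies a cropped region rather than resizing the whole window.

```python
# Hypothetical Python controller for the background scene: each logic tick it
# copies part of the already-rendered main scene into the capture texture.
# GameLogic.capture_tex_id is an assumed property set up elsewhere.
import GameLogic
import Rasterizer
from Blender import BGL

TEX_SIZE = 256

win_w = Rasterizer.getWindowWidth()
win_h = Rasterizer.getWindowHeight()

# glCopyTexSubImage2D can't scale, so clamp the copied region to the texture;
# showing the whole window would need a real resize (or adjusted UVs)
w = min(win_w, TEX_SIZE)
h = min(win_h, TEX_SIZE)

BGL.glBindTexture(BGL.GL_TEXTURE_2D, GameLogic.capture_tex_id)
BGL.glCopyTexSubImage2D(BGL.GL_TEXTURE_2D, 0, 0, 0, 0, 0, w, h)
```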

But that’s still not possible in the current version of the game engine, as textures are only loaded at the start of the scene and you can’t gain direct access to the OpenGL functions to amend the texture.

Keith. 8)

Actually, most of BGL does work in the 2.25 game engine; only drawing to the screen is disabled. (I suspect that’s because they clear the screen before rendering, but after doing the game logic.) So the way I see it, there are three options.

  1. Go with an older version of Blender that’s more extendable with Python.

  2. Use the current version of the game engine as it is and live without some features.

  3. Get the Tuhopuu2 source code and see if you can add the features directly in the program, or just bring back drawing with BGL.

The 3rd option is definitely best for the long term, but option 1 is the most convenient for short-term projects or just for experimenting.

Is there a split-screen script? I don’t suppose you could post a link to it, could you?

The split screen script was made by z3r0 d. AFAIK it was the first script to use BGL in the realtime engine, but it only works in Blender 2.23.

Here’s a link to the post, I found it using the forum search.
https://blenderartists.org/forum/viewtopic.php?t=3895&highlight=split+screen

why yes, there is one

https://blenderartists.org/forum/viewtopic.php?t=3895&highlight=split+screen

Only works in 2.23, and not too well at that.