Question

Is it possible to make a multiple camera thing where one cam is looking at the player and the other is looking at the player from a distance, like race games? And when the other cam from a distance is running, it sends the images it sees to a screen in the game. Like NASCAR, where there are cameras all around to view the cars, but you still have the camera behind the player as the main screen.
http://tk.files.storage.msn.com/x1pNWjjkHJ3o_x3WubfHjeF3ErWLNdue_kYuheAE-6n7Sgq5sahfDVuShbFLW_t2tNk56tk9pn2t0lAL-VTP_f7LJ_v3ksHCjWvtFO7yOEfw51IciM0HegDnKo_S1PVBUVXHQ9Lzx4aI78
If this is possible, can someone tell me?

You can try this:

Split Screen and multi-viewports

Split-screen effects are now possible thanks to multiple viewports. To use them, you attach a Python script to a second or third (or any number of) cameras:


import GameLogic
from Rasterizer import getWindowWidth, getWindowHeight

# attach this to the extra camera; the viewport covers the top-left quarter of the window
own = GameLogic.getCurrentController().getOwner()
own.enableViewport(1)
# setViewport(left, bottom, right, top) in window pixels
own.setViewport(0, getWindowHeight()/2, getWindowWidth()/2, getWindowHeight())
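For a full side-by-side split screen, the other camera needs the complementary region of the window. A minimal sketch, assuming a second camera carrying its own copy of the script (the coordinates here are just an example, not taken from the demo file):

import GameLogic
from Rasterizer import getWindowWidth, getWindowHeight

# viewport for the second player's camera: the right half of the window;
# the first camera would use setViewport(0, 0, getWindowWidth()/2, getWindowHeight())
own = GameLogic.getCurrentController().getOwner()
own.enableViewport(1)
own.setViewport(getWindowWidth()/2, 0, getWindowWidth(), getWindowHeight())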

You can also have views that are embedded over the top of another view. See the example blend for code (note that the example is incomplete since the embedded window needs to be drawn last…).
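If you can't find the example file, a rough sketch of an overlay (picture-in-picture) viewport could look like this; it assumes the script is attached to the secondary camera, and the corner coordinates are only placeholders:

import GameLogic
from Rasterizer import getWindowWidth, getWindowHeight

cam = GameLogic.getCurrentController().getOwner()
w = getWindowWidth()
h = getWindowHeight()

cam.enableViewport(1)
# setViewport(left, bottom, right, top): a small window in the top-right corner,
# drawn over the main camera's full-screen view
cam.setViewport(w * 3 / 4, h * 3 / 4, w, h)

As noted above, the embedded window still needs to be drawn last, so this sketch alone may not be enough; the drawing order has to be handled as well.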

I think what you want is something like a giant television in the scene that watches and displays your character as it moves about the screen. Is that right?

That’s right. I want a big projector or whatever placed in the scene that shows images from another camera, like a helicopter view or something, but still keep the main camera that views the player full screen. So the only time you see the giant TV is when you drive/walk/fly past it.

That’s known as rendering to a texture, which isn’t possible in the engine afaik. It may be possible by programming a GLSL shader, but I’m not very knowledgeable in that area :frowning:

ST150

That’s known as rendering to a texture,

It’s NOT known as rendering to a texture… which is actually either rendering a procedural texture to a bitmap or baking light/shadow/global illumination passes etc. to a texture. The term is also used for rendering multiple maps that have been comped together, e.g. a graffiti tag with alpha over a brick wall, into one combined picture.

He wants more than one realtime camera view embedded over the scene…

Shn275

You can also have views that are embedded over the top of another view. See the example blend for code (note that the example is incomplete since the embedded window needs to be drawn last…).

Although I didn’t see a link to a file… do you mean just use the code? Or did you forget to put in a link to the file? :confused:

Either way, could you post a working file? I’ve been using the demo from Erwin for split screen, but I don’t know how to do the embedded/overlay screen.

For the game engine, render to texture means producing a texture from a camera’s view and displaying it as the texture of a mesh (not just in a viewport).

Baking is usually done offline and is already present when you start the engine. Combining multiple maps is multitexturing. If the textures never change, it is not render to texture.

To answer the question:
- render to texture is not supported in the current version
- multiple viewports are supported
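(For reference, later releases did add this through the VideoTexture module. A minimal sketch, assuming Blender 2.49+, a screen mesh named ‘Screen’ with a placeholder image ‘screen.png’ on it, and a camera named ‘HeliCam’ — all names invented for this example:)

import GameLogic
import VideoTexture

scene = GameLogic.getCurrentScene()
screen = scene.objects['OBScreen']    # the in-game TV/projector mesh
helicam = scene.objects['OBHeliCam']  # the distant "helicopter" camera

if not hasattr(GameLogic, 'tvTex'):
    # replace the placeholder image on the screen's material with a live render
    matID = VideoTexture.materialID(screen, 'IMscreen.png')
    GameLogic.tvTex = VideoTexture.Texture(screen, matID)
    GameLogic.tvTex.source = VideoTexture.ImageRender(scene, helicam)

# refresh every logic tick so the screen shows a live view from the other camera
GameLogic.tvTex.refresh(True)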

kirado: The example blend ST150 was referring to is available for download at blender3d.org on the main Blender download page. Scroll down below the 2.41 downloads to the section called “Regression Files.” One of the listings there is “Blender 2.41 Game Engine Demos.” The multiple viewport demo is in that download.

We’re talking about an animated texture rendered from another camera within the scene and updated in realtime. To avoid confusion, we’re not talking about multiple viewports/picture-in-picture/splitscreen here; those can all be done with ease.

Rendering to a texture is not just shadow or lightmap passes. It could be the entire scene, which is what’s required here. I’m sure someone could write a shader to do it.

Blendenzo
Yeah, I’ve seen that demo… I was curious about how to overlay the multiple viewports, e.g. a thumbnail viewport over the main camera viewport. I assume to do that you have to add screen coordinates somewhere in the script… not sure where?

ST150
The term “rendering to texture” in most 3D apps usually refers to what I mentioned… you mean realtime rendering of a camera to the UV coordinates of an object’s face. Yes, I understand exactly what you’re talking about, and yes, you can do it with GLSL… my point is that calling it “rendering to texture”, when that term refers to something else, is going to confuse people who are trying to learn. That’s all I meant.

Well, most of the time I don’t know what you guys are talking about. One reason is that I’m just 13. So if it’s that hard to do, then I’ll just wait till I’m a little more into the big games. Don’t worry too hard about it. I thought it was simpler than that.