Is it possible to use a camera as a source for a texture in Cycles? I know this can be done with video textures in the BGE, but what about in renders? Could this be done as part of a material's node setup, so that you would not have to pre-render and then load an image sequence? i.e. in real time in the viewport.
There are a few ways to do it, but the simplest would be to use the 'Window' output of the Texture Coordinate node as the vector of the texture.
That just gives a vector, though. I mean using the output of a camera, i.e. the image rendered from that camera, as a texture in a material.
Ohh… then the answer is no. You need to pre-render the image (or image sequence) from the camera, and then use that render as a texture.
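For what it's worth, that pre-render-then-load workflow can be scripted. A minimal sketch using Blender's Python API (must be run inside Blender; the file path, material name, and frame count below are placeholder assumptions, not anything from this thread):

```python
# Sketch: wire a pre-rendered image sequence into a Cycles material
# as an image texture. Assumes the frames were already rendered out
# from the other camera to //renders/cam_####.png (hypothetical path).
import bpy

mat = bpy.data.materials.new(name="CamFeed")  # placeholder name
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# Load the first frame, then treat it as a sequence.
img = bpy.data.images.load("//renders/cam_0001.png")
img.source = 'SEQUENCE'

tex = nodes.new('ShaderNodeTexImage')
tex.image = img
tex.image_user.frame_duration = 250     # assumed sequence length
tex.image_user.use_auto_refresh = True  # advance the frame as the scene plays

# Feed the texture into the default Principled BSDF's Base Color.
bsdf = nodes.get('Principled BSDF')
links.new(tex.outputs['Color'], bsdf.inputs['Base Color'])
```

With auto refresh enabled the texture at least tracks the current frame in the viewport, but the source frames themselves still have to exist on disk first; there's no live render-to-texture in Cycles.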