Feature request > game render to other render engine

Well, I've been Blendering for a short while now.
I like the game engine a lot, and I tend to use it more like a real physics engine (or rather, near-physics).

It would be nice if the game engine's output could also be rendered with an image render engine, to make a movie.
I know there is an option to record and then render, but that doesn't always work:
it fails for scripted objects or objects with logic bricks in them.
So I'd like the game engine to hand the whole scene to another render engine (I'm not talking about screen recording).

It doesn't need to happen in real time, but having real render options would be so nice.
Basically, everything I see in the game engine view is already there:
the software knows where to place things and what they look like, it just won't use that information in render mode.

I will include a file I also posted elsewhere on the forum,
just to show that a simple game engine scene will not show up in other render engines.
PS: As I know that rendering works a bit differently, I'm not asking for real-time rendering here.
If a two-minute video takes six hours to render, that's fine by me.

I would like to use Blender for future presentations of mechanical ideas, so a good look would be nice.
Here is my sample file: https://dl.dropbox.com/u/54767531/NEWfactory.blend (updated once in a while).
And this is how it looks using the game renderer; the treads won't render in the other render engines,
and the milk packs won't render either.
https://dl.dropbox.com/u/54767531/newfactory.jpg

I don't really understand what you actually need here, and why. If you want to record an animation, there are Blender-agnostic tools for that (such as FRAPS). If you want to play back the animation without Blender, there's the blenderplayer.
EDIT: Do you mean you'd like to render it with any Blender-compatible renderer? I don't think that's going to happen. The game engine data format is different from the Blender format, and so far it is only a one-way conversion. If you want it to go both ways, you'll have to develop that yourself, I'm afraid. Depending on what you need, it might be enough to just record the transformation matrices every frame.
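The per-frame recording idea could be sketched like this (a minimal, engine-agnostic sketch; inside the BGE you would call `record_frame()` once per logic tick from a Python controller, passing each object's name and its flattened world matrix, e.g. taken from `KX_GameObject.worldTransform` — the object names and file path here are illustrative):

```python
import json

# Minimal sketch of a per-frame transform recorder.  Inside the game
# engine you would call record_frame() once per logic tick, passing a
# dict of each object's name and its 4x4 world matrix flattened to
# 16 floats (e.g. from KX_GameObject.worldTransform).
frames = []  # one dict per recorded frame: {object name: 16 floats}

def record_frame(transforms):
    """transforms: dict mapping object name -> flat 16-float matrix."""
    frames.append({name: list(mat) for name, mat in transforms.items()})

def save_recording(path):
    """Dump all recorded frames to a JSON file for later playback."""
    with open(path, "w") as f:
        json.dump(frames, f)

# Dummy data standing in for real game objects:
identity = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  0, 0, 0, 1]
record_frame({"TreadSegment.001": identity})
record_frame({"TreadSegment.001": identity})
save_recording("recording.json")
```

The JSON file can then be read back by a second script that rebuilds the animation on the Blender side.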

He wants to play in the game engine, record everything that goes on (player controls, for instance) while doing so, and then backport that into the renderer and render it as an animation.

Oh, that is great. I am really new to Blender, but I already knew Python and some other 3D environments from the past.
I have only been using it for three days or so; how actively is it updated?

In reply to Zalamander:
Well, what I do in the example is this: the treadmill band is made out of parts that are moved by game object logic and (as of today) Python code. So the game engine does a terrific job with the above, but I know the other engines can render much better. So I was thinking: how about sending the scene, frame by frame, to another render engine, making the picture, storing it as a PNG, then going to the next game frame?

I was not thinking of keyboard commands, because I know this would be a slow process, but it would be nice for a real-looking physics simulation.

Looking at your file, I think the simplest way to go about it would be to save, every frame, a list of the objects that exist (and their relevant properties, such as their transformations).
Then have another (Blender) script recreate the scene from that list. You can batch render from the command line, and you can also run a script on startup.
The game engine and Blender APIs are somewhat different, so you'd need to learn a bit of both.

Martinsh usually attaches the .blend files along with the script in his thread. You can download and try it, so basically it's all already usable. You just need to record your game.
It's just that his scripts were all done in Python. His project with Mokazon is to implement those scripts inside Blender (in C++, I guess). As far as I can remember, subsurface scattering is also already usable; I've seen someone post some images using that feature in the BGE. I know it's not as perfect as a rendered result, but it's getting closer and closer.

Yes, I believe such a solution would work, provided the other engines don't have a problem with script-generated objects?
It reminds me a lot of a program I once invented inside Second Life, where I had an object listening to other objects.
The other objects reported their position and rotation; it could store them and later replay them.
That way one could create buildings as temps and go beyond Second Life's limits.
It became quite popular, and after a while other scripters even sold it on the SL exchange… oh well, I created it just for fun there.

However, for Blender my scripting level is not as good at the moment as it once was in Second Life.
Have people maybe already created such scripts?
In Second Life they became popular and free after a while
(as I shared my code freely there).

You'd just create Blender objects. There would be no difference between them and user-created objects. Once you have your data in Blender, you can use any Blender-integrated rendering engine (such as Cycles, Blender Internal, Yafray, LuxRender, Mitsuba…).