How do I get my 3D animations as close to real-time rendering as possible?

The usual argument I see for why offline rendering is so much slower than real-time rendering is that offline rendering is much more detailed. While that’s a worthwhile trade-off for big-budget projects, I’m still learning all this stuff. I just need to be able to render something without paying beaucoup bucks for a render farm. Nowadays, video games look really good already. Look at Uncharted 4: that game is gorgeous, and it all renders in real time, not at four hours a frame.

As I am just a hobbyist at this point in time, I’m wondering: how do video games get such good graphics while keeping the render time so insanely low? How would I go about making an animation like that and getting a similarly quick render time, or something close?

I don’t need hyper-photorealism, but how do I get video-game quality, at that render speed, for non-interactive content? How would the models need to be made, the lighting, the physics, the settings, etc.? If it’s possible in an unpredictable game world, it must be possible in a preset animation.

Well, I’ve managed to use various tricks to get a lot of mileage out of “OpenGL” renders of things. (I wish that Blender had more explicit support for OpenGL as a render engine than it does now.)

These days, OpenGL can (IMHO) go 90% or more of the way to where you actually need to go. You can then use compositing: the software renderers (Cycles, Blender Internal) produce the details, which are overlaid onto the basic OpenGL-produced renders.
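Mechanically, that overlay step is just an alpha-over composite. Here is a minimal sketch of the idea in plain Python — the pixel values are made up for illustration, and in Blender you would wire this up with an Alpha Over node in the compositor rather than writing code:

```python
# Alpha-over: blend a detailed render pass over a fast base render.
# Pixels are (r, g, b) floats in 0..1; alpha is the overlay's coverage.

def alpha_over(base_px, over_px, alpha):
    """Blend one overlay pixel onto one base pixel."""
    return tuple(o * alpha + b * (1.0 - alpha) for o, b in zip(over_px, base_px))

def composite(base, over, alpha):
    """Apply alpha_over across a whole image (rows of pixels)."""
    return [[alpha_over(b, o, alpha) for b, o in zip(brow, orow)]
            for brow, orow in zip(base, over)]

base = [[(0.2, 0.2, 0.2)]]    # fast OpenGL pass: flat grey
detail = [[(1.0, 0.0, 0.0)]]  # slow Cycles detail pass: red highlight
result = composite(base, detail, 0.5)
```

The win is that the expensive renderer only has to produce the small detail pass, while the cheap OpenGL pass covers the rest of the frame.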

It is possible to model and animate an entire scene in Blender, export it to Unity, UE4, or Lumberyard, create the materials, set up post-processing, set the animations to play at startup, and record your screen. Here’s an example:

It looks great and renders in real time. No baking necessary.

You might be able to do it with this branch: Hopefully, it will get merged into the main branch.

Could you go into more detail on these various tricks you use?

I haven’t done this myself, but with Unity you can import .blend files directly and use those meshes and animations. Then you have to create materials to replace the ones you used in Blender, since Unity doesn’t import the actual material properties if it’s a Cycles material. Here’s how. Then you can add some post-processing effects. Here’s how. Then you use a screen recorder like Fraps or OBS to record the animation. Another way (this should work better, but might be harder to set up) is to write a script that tells Unity to go to the next frame, take a screenshot at a defined resolution, and repeat.
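That second, frame-stepping approach looks roughly like this in outline. This is engine-agnostic Python rather than Unity C#, and `engine_render_frame` is a hypothetical stand-in for whatever screenshot call your engine actually exposes (Unity has `ScreenCapture.CaptureScreenshot`, for instance):

```python
# Offline-style capture from a real-time engine: advance one frame,
# grab the pixels, repeat. Because you capture frame by frame, the
# output is deterministic and not limited by playback speed.

def engine_render_frame(frame, width, height):
    """Hypothetical stand-in: render the given frame, return raw RGB bytes."""
    return bytes(width * height * 3)  # placeholder black frame

def capture_animation(start, end, width=1920, height=1080):
    """Step through every frame and collect one image per frame."""
    frames = {}
    for frame in range(start, end + 1):
        frames[f"frame_{frame:04d}"] = engine_render_frame(frame, width, height)
    return frames

clips = capture_animation(1, 24, width=4, height=4)  # 24 tiny frames for the sketch
```

In a real setup you would write each buffer to a numbered image file and assemble the sequence into a video afterwards.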

Stop me if I’m wrong, but real-time renders need baking. If the lights don’t move, you can easily bake all the textures in your scene. See this:

Then, if something is moving in the scene, like a character, you just have to add its shadow to the scene. The goal is to give your render engine as little work as possible.
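The economics of baking can be sketched in a few lines of Python. `expensive_lighting` below is a made-up stand-in for a costly global-illumination pass; the point is simply that it runs once, not once per frame:

```python
# Baking: pay for expensive static lighting once, then reuse the result
# every frame, leaving only cheap dynamic work (like one moving shadow).

def expensive_lighting(surface_points):
    """Hypothetical costly pass: one light value per surface point."""
    return [min(1.0, 0.5 + 0.1 * p) for p in surface_points]

baked = expensive_lighting(range(4))  # done once, since the lights never move

def shade_frame(lightmap, character_shadow=0.0):
    """Per frame: just darken the baked result where the character stands."""
    return [max(0.0, v - character_shadow) for v in lightmap]

frame_1 = shade_frame(baked)                       # no dynamic shadow yet
frame_2 = shade_frame(baked, character_shadow=0.3) # character now blocks light
```

Per frame, the engine only does the cheap subtraction, which is why baked scenes can look far more expensive than they are to draw.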

Baking everything isn’t an absolute necessity, and it definitely isn’t recommended in an interactive experience:

If you’re just rendering an animation in a real-time engine, spending time baking would be a waste. Most stuff can be cheated with shaders, like reflections. Depending on your shader’s optimization, moving lights with realistic shadows aren’t a big deal at all. Just play with the variance shadows in the BGE.
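For what “variance shadows” actually buy you, the math is compact: the shadow map stores mean depth and mean squared depth per texel, and Chebyshev’s inequality gives an upper bound on how lit a fragment can be, which filters to soft edges instead of hard binary shadows. A sketch in plain Python rather than a BGE shader, with made-up depth values:

```python
# Variance shadow mapping: the shadow map stores E[depth] and E[depth^2];
# Chebyshev's inequality bounds the fraction of light reaching a fragment.

def vsm_visibility(mean_depth, mean_depth_sq, fragment_depth, min_variance=1e-6):
    """Upper bound on visibility (1.0 = fully lit, ~0.0 = fully shadowed)."""
    if fragment_depth <= mean_depth:
        return 1.0  # fragment is in front of the average occluder: fully lit
    variance = max(mean_depth_sq - mean_depth ** 2, min_variance)
    d = fragment_depth - mean_depth
    return variance / (variance + d * d)  # Chebyshev upper bound
```

Because the stored moments can be filtered like any texture, the shadow edges come out soft without extra per-frame cost, which is why moving lights stay cheap.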

I’m a little late, but thanks everyone for the tips. I’ll make sure to test them out to find the best approach. I’m gonna use Unreal 4, because I’ve seen some shorts already made with it. I just need to figure out how to do it myself.