renderer for film

Plumiferos was one of the first animated features to use Blender. I have several questions:

a) Which renderer is best for rendering a feature animation? The internal renderer, LuxRender, or some other renderer? Free and open source, please.

b) Is there a real-time renderer for Blender, à la FurryBall for Maya and StudioGPU?

Thanks.

For A I would recommend Blender Internal. If you are rendering a still image, use Lux, but for an animation Blender will save you hours. If it takes 10 hours to render a frame in Lux, and 10 seconds of animation uses 250 frames, a full-length feature could take you years. You could, however, look into SmallLuxGPU or whatever it is :stuck_out_tongue:
It can render at some bizarre speed, but I could never get it to work.
Not sure about B.
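The rough arithmetic behind that claim can be sketched out; the 25 fps frame rate and 90-minute feature length below are assumptions for illustration, not figures from the thread:

```python
# Back-of-the-envelope render-time estimate (illustrative numbers).
fps = 25                 # assumed frame rate; 10 s of animation = 250 frames
hours_per_frame = 10     # the per-frame cost quoted for Lux above

clip_seconds = 10
clip_frames = clip_seconds * fps             # 250 frames
clip_hours = clip_frames * hours_per_frame   # 2,500 hours for a 10-second clip

feature_minutes = 90     # assumed feature length
feature_frames = feature_minutes * 60 * fps
feature_years = feature_frames * hours_per_frame / (24 * 365)

print(f"{clip_frames} frames, {clip_hours} hours for the 10 s clip")
print(f"a {feature_minutes}-minute feature: about {feature_years:.0f} machine-years")
```

A render farm divides that figure by the number of machines, which is why per-frame render time matters so much more for animation than for stills.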

A - Blender Internal. However, there are short films made using YafaRay too. I forget which, but there was one featuring a mannequin; a search in BN would probably reveal it.

B- Usable ones? As of now, no.

Hmmm… so two people have suggested Blender Internal. Did anyone here follow the making of Plumiferos and/or Sintel closely enough to know which renderer they used?

Sintel used the Blender Internal renderer, though with a build from the render development branch (the version of Blender used comes with the Sintel files).
Plumiferos also used Blender Internal for rendering.

There’s also YafaRay, but the 2.5 plugin for it is still very alpha.

SmallLuxGPU is very nice and renders fast (at interactive speeds), but it’s more of a “proof-of-concept” type software with limited functionality.

Hmm… I forgot about YafaRay - I can’t get it to work - it always crashes :stuck_out_tongue:

V-Ray would also be a good option; the developer of the plugin updates it quite often.

Yes, it crashes at times, but YafaRay is the tool I use for virtually every render needing GI.

I’ve wanted V-Ray for Blender ever since I used it on Max. It’s a very powerful renderer - pretty much the best one next to Indigo if you can shell out the cash. I don’t know about animations, though.

Hi,

Are you rendering an animated cornell box or something a bit more complex like, let’s say Avatar? When choosing the renderer that suits your needs it really comes down to the complexity of the scenes and type of lighting used. Before making the decision one might want to specify what features are required.

  • Style: photorealism, ‘typical cgi-look’ or npr perhaps?
  • Are there characters or other objects that would really benefit from the use of subsurface scattering material?
  • What kind of lighting situation? Outdoor or indoor scenes? Is GI required?
  • Camera? Is there heavy use of depth of field or motion blur that needs to be rendered instead of being done in post?
  • What can be done/faked in post-production anyway? Still backgrounds etc.
  • Particles and smoke, are those playing a big role?

Just some random examples to begin with.

After seeing the trailer of Plumiferos I’d say that you had some challenging stuff to render and the results are ‘ok’. Above average, great in some context, not really my cup of tea. But overall, pretty good. :wink:

As far as Blender-compatible, production-ready renderers go (whatever that means), I’ve only used V-Ray with 3ds Max and would say it would be a good choice, even though to get the most out of it one really needs to study the software. Another option would be Thea Render, which is quite new but has a pretty fast biased engine (alongside its unbiased solutions) and a working Blender 2.5 exporter.

I really can’t say much about free alternatives. YafaRay seems powerful, but I haven’t used it. Mitsuba is really promising but perhaps not quite usable in its current state. And of course there’s LuxRender; if you have a couple of truly powerful machines, you can consider it as an option.

And then there is BI.

How about RenderMan-ish renderers, like Pixie or Aqsis?

http://www.aqsis.org/
http://www.renderpixie.com/

I have no idea how stable they are or if they even work, but there is an Aqsis plugin for Blender AFAIK, and the results look quite nice. I haven’t tried them, though.

And even though it’s not open source, Octane seems really cool as well if you have an Nvidia graphics card.

There’s also the possibility of mixing rendering engines, something which is often overlooked.

For example, let’s say you use the V-Ray plug-in to render a character with all its passes (diffuse, normal, specular, etcetera) but no fur. Then you use the internal renderer to render a fur pass, and finally you use the multiple passes you get out of each engine to composite the two images together. That’s just a simple example, but you can do it for a multitude of things, ranging from special effects (smoke, particles) to background and foreground passes in different engines. As long as the final composite looks good, that is really all that matters, and if you have a powerful compositing package like Nuke, you should have very little trouble combining outputs from different engines. You should be fine with Blender’s compositor as well, but I’d really only use that for combining passes, which would then be composited into a scene in Nuke or After Effects.
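The core of that workflow is the standard “over” operation applied to passes from different engines. As a minimal sketch (not any particular compositor’s API; it assumes straight-alpha RGBA passes already loaded as NumPy float arrays in [0, 1], with an opaque background), the fur pass could be composited over the character render like this:

```python
import numpy as np

def alpha_over(fg, bg):
    """'Over' composite of a straight-alpha RGBA foreground onto an opaque background.

    fg, bg: float arrays of shape (H, W, 4) with values in [0, 1].
    out_rgb = fg_rgb * fg_alpha + bg_rgb * (1 - fg_alpha)
    """
    fa = fg[..., 3:4]                 # foreground alpha, kept 3-D for broadcasting
    out = bg.copy()
    out[..., :3] = fg[..., :3] * fa + bg[..., :3] * (1.0 - fa)
    out[..., 3] = 1.0                 # background assumed opaque
    return out

# Tiny 1x1 example: half-transparent red "fur" over an opaque blue "character" pass.
fur = np.array([[[1.0, 0.0, 0.0, 0.5]]])
char = np.array([[[0.0, 0.0, 1.0, 1.0]]])
comp = alpha_over(fur, char)
```

In practice Nuke’s Merge (over) node or Blender’s Alpha Over compositor node does exactly this; the point is that the math doesn’t care which engine produced each pass.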

Also, I believe Pixie isn’t in active development. Aqsis is, but it is undergoing a core rewrite which, if anyone hasn’t seen, introduces real-time REYES rendering - which I hear is a very big thing. As for the exporter for 2.5x, I’m not sure what the current development status is. I know there is a version for download, but it’s still in the alpha stage, although from what I’ve read and seen, when it’s complete it will be the most extensive and feature-rich open-source RIB pipeline/exporter in existence. So that is certainly something to look out for, especially since you can combine it with 3Delight (which is an awesome engine!).

I think that on big films they debate this a lot - which engine to use for this and that, etc. The only real constraints are processing power, style, and deadline. If you want photorealism but have a short deadline and little processing power, you need to compromise. They will also adapt a rendering engine to suit their needs; with something like RenderMan, they might modify it so it runs a bit faster at the cost of GI quality, or whatever.

Blender Internal is quite capable and tends to be faster than true raytracers such as YafaRay for a similar level of quality.
As for real-time preview, you can come fairly close if your computer supports GLSL.