Network rendering and compositing bug, seven years unresolved?

I was pretty excited to discover the network renderer last week, and even went so far as to quickly buy a second machine to help me out with my work. Imagine my frustration when I discovered the compositor apparently doesn’t work with the network renderer. :frowning: A little online research revealed that this is a well-known bug, but to my astonishment, its reporting goes back to 2010. After seven years, I have to conclude that it isn’t actually a bug at all, but rather that the network renderer is simply not intended for the sort of work that would require such a feature, and that to expect it on my part reveals a misunderstanding of what the network renderer is for, right?

Thus my question: what is the network renderer for? It must serve some use case that I don’t understand. Is there some other aspect of distributed rendering that it handles perfectly well without this feature? I admit that I am probably using Blender in only one specific way, and I’m sure it can be used for lots of different kinds of projects (I’m quite new to the whole thing). So, what is the intended use of the network renderer? Does it have something to do with single frames? I am only rendering animations, since my project is an animation, and I know a lot of 3D artists focus on still art. Is that what I’m misunderstanding about this issue? Is the network renderer meant for still art, and would it work perfectly well for that purpose? (I admit I haven’t tried.)

That question aside (and I really am curious as to the correct use case for the network renderer), what is the current “official” recommended way to distribute animation rendering over a network? I would love to be able to do this, as it would vastly improve my workflow if I could scale animation production horizontally.

Thanks.

It’s possible to launch Blender via the command line. Using the correct command-line switches, one may specify a range of frames, an individual frame, or even subdivisions of a single frame to render. The documentation for brenda (a set of Python scripts for rendering on Amazon AWS) provides examples for each of these options; see it on GitHub. I’ve tested brenda and it still works (as long as you have the AWS dependencies and a working copy of Python 2).
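As a minimal sketch of those switches (file names, frame numbers, and border coordinates here are just placeholders; run `blender --help` for the full list):

```
# render frames 1-250 of an animation in background mode (PNG, padded file names)
blender -b shot.blend -o //frames/shot_#### -F PNG -s 1 -e 250 -a

# render a single frame
blender -b shot.blend -o //frames/shot_#### -F PNG -f 17

# render a sub-region of one frame by enabling the render border via a
# Python expression (border coordinates are fractions of the image size)
blender -b shot.blend --python-expr \
  "import bpy; r = bpy.context.scene.render; r.use_border = True; r.border_min_x = 0.0; r.border_max_x = 0.5; r.border_min_y = 0.0; r.border_max_y = 1.0" \
  -o //tiles/shot_left_#### -F PNG -f 17
```

Note that Blender processes its arguments in order, so -s/-e must appear before -a, and -o/-F before the render switch.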

Blender absolutely will scale jobs across a cluster using typical batch software. Can confirm.
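For example, here’s a minimal job-array sketch assuming SLURM as the batch system (the job name, paths, and frame range are my own placeholders), with one array task per frame:

```
#!/bin/bash
#SBATCH --job-name=shot01
#SBATCH --array=1-250        # one task per frame
# each array task renders the frame matching its task ID
blender -b /shared/shot01.blend -o /shared/frames/shot01_#### -F PNG \
        -f "$SLURM_ARRAY_TASK_ID"
```

Submit it with sbatch and the scheduler fans the frames out across whatever nodes are free.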

Does your “specific way” require you to use anything in the compositor that can’t be saved to a file? Is there even such a thing? I don’t know, since I don’t use the compositor that much. If not, you could still render your animation with all the needed passes and later do the compositing on a single machine from the already-rendered frames.
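For instance (file names are placeholders), multilayer EXR output saves every enabled render pass into each frame’s file, so a compositing pass can be run from those files later:

```
# write all enabled render passes into multilayer EXR files, one per frame
blender -b shot.blend -o //passes/shot_#### -F MULTILAYER -s 1 -e 250 -a
```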

Well, the compositing seems to take nearly as long as the rendering, so if I could only distribute the rendering and then had to composite every frame on a single machine, that would severely undermine the benefits of distributed processing. Thanks anyway.

brenda: I’ll look into it, thanks. Since I first posted this last night, I discovered the “placeholders” option; I’ll test that and see if it gets me to my goal. These two approaches (brenda and placeholders) would still require reassembling the animation from the frames after rendering/compositing them (as opposed to simply specifying a video file as Blender’s render output and getting the final result back from a single issued command). But that’s okay. If that is an effective way to build an animation, I’m okay with it.
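In case it helps anyone later, here’s a sketch of the placeholders approach, assuming a shared directory every node can see (the mount path is made up). With Placeholders on and Overwrite off, each node claims the next unrendered frame by writing an empty placeholder file first, so the same command can be run on every machine:

```
blender -b /mnt/shared/shot.blend \
        --python-expr "import bpy; r = bpy.context.scene.render; r.use_placeholder = True; r.use_overwrite = False" \
        -o /mnt/shared/frames/shot_#### -F PNG -s 1 -e 250 -a
```

The finished frames can then be stitched into a video with Blender’s sequencer or something like `ffmpeg -framerate 24 -i shot_%04d.png -c:v libx264 -pix_fmt yuv420p shot.mp4`.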

A few points.

None of us here knows what you’re trying to do. For example, are you compositing animation against motion-tracked live footage? Have you broken a final shot into multiple animations across layers in order to import and composite footage from a different program? How do you plan to handle the sound and dialog mix? And so on.

So, if you lay down audio tracks and footage in the NLE, those won’t be packed into a .blend file. But you can still reference those external files and act on them via batch processing, as long as they remain in their relative locations within a directory structure common to all nodes (via NAS), OR they are copied to each node beforehand (say, via a .zip or .tar.gz archive).
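A sketch of the copy-to-each-node route (hostnames and paths are made up for illustration):

```
# pack the whole shot directory so relative paths to audio/footage survive
tar -czf shot01.tar.gz shot01/

# push and unpack on every render node
for node in node01 node02 node03; do
    scp shot01.tar.gz "$node":/render/
    ssh "$node" 'tar -xzf /render/shot01.tar.gz -C /render/'
done
```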

You’ll have to set up your compositing nodes and configuration beforehand and save the .blend. That .blend file will have to be available to each node as well. Then each node executes a blender command with the correct switches and the correct environment variables set. Once you can do this manually, you can start to figure out how to make your batch master launch the jobs across the cluster. That’s really not the most difficult part of the task.
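As a rough sketch of what a scheduler-less “batch master” might look like (node names, paths, and the frame range are assumptions), splitting the frame range evenly over SSH:

```
#!/bin/bash
# naive batch master: split frames 1-240 evenly across the listed nodes
NODES=(node01 node02 node03)
START=1; END=240
CHUNK=$(( (END - START + 1) / ${#NODES[@]} ))
for i in "${!NODES[@]}"; do
    s=$(( START + i * CHUNK ))
    e=$(( s + CHUNK - 1 ))
    [ "$i" -eq $(( ${#NODES[@]} - 1 )) ] && e=$END   # last node takes any remainder
    ssh "${NODES[$i]}" \
        "blender -b /shared/shot.blend -o /shared/frames/shot_#### -F PNG -s $s -e $e -a" &
done
wait   # block until every node finishes its range
```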

Also, you might find the Morevna Project’s RenderChan set of Python scripts helpful. It integrates with Blender very seamlessly. BUT you’ll still have to dig into the nitty-gritty to make it work, especially if you want to use it with a batch system.

Also, forget brenda. Instead, go read the official Blender documentation on command-line switches. My account is too new to post links, so do a Google search for:

blender wiki “command line switches”

And read the official docs at wiki.blender.org.

Okay, thanks. In short, I’m creating an all-CGI short film. No real-world camera footage, motion-captured character animation, or anything like that, just models, particles, some mild mesh deformations, materials/textures, path animations, and final compositing for any visual effects that aren’t easily accomplished via materials (glows, blurs, etc.). The storyline is divided into acts, which are divided into scenes, which are divided into shots. Each shot is a separate .blend file. I’ll produce a series of isolated animations, one per shot, and then combine them somehow (iMovie, unless it proves inadequate to the eventual needs of the project… or perhaps I’ll combine the shot animations into a single giant .blend file… I haven’t researched that final combining step yet).

Thanks.

I think you’ll find iMovie maddening. But Blender has a serviceable editor (NLE) built in. Otherwise, Premiere or Final Cut Pro.

I also think everything you mentioned can be automated by batch processing if you set it up right.