I actually have a very old idea about how to distribute rendering of a big movie (really big: 100+ GB of textures per frame, 4K+ resolution, all of that). The idea is simple: a torrent-like network where a main node assigns pieces of work (like tiles, though I think there is a better unit of work). It comes from when I participated in a few renderfarm.fi rendering sessions a few years ago; the manually handled queue was obviously a bottleneck for many people.
I think that for now the best money-per-frame solution is two passes, the same way a two-pass compression algorithm works. The first pass collects draft data on how many samples each frame needs and stores that information in a file: just the frame number and some noise metric.
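A minimal sketch of that first pass, assuming a crude stand-in noise metric (luminance standard deviation of a low-sample draft render; a real implementation would use something better, e.g. variance between independent sample buffers). The function names, the JSON file format, and the proportional budget rule are all my illustration, not an existing API:

```python
import json
import math

def noise_metric(pixels):
    """Crude noise estimate for a draft render: standard deviation of
    luminance. pixels is a flat list of grayscale values."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    return math.sqrt(var)

def first_pass(draft_frames, out_path="noise_budget.json"):
    """First pass: one noise score per frame, stored to a file.
    draft_frames maps frame number -> draft pixel buffer."""
    scores = {frame: noise_metric(px) for frame, px in draft_frames.items()}
    with open(out_path, "w") as f:
        json.dump(scores, f)
    return scores

def allocate_samples(scores, total_samples):
    """Second-pass budget: hand out samples proportional to noise,
    so noisy frames get more and clean frames get fewer."""
    total_noise = sum(scores.values())
    return {frame: max(1, round(total_samples * s / total_noise))
            for frame, s in scores.items()}
```

The point of storing only (frame number, noise score) is that the file stays tiny even for feature-length shots, so it is cheap to ship to every node before the second pass.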
The second and main pass offloads the actual work to all available nodes. For now I think an MCMC/MLT sampler can show the best performance at that task. It can redistribute samples inside frames, across motion-blur sub-"frames" (there are no actual sub-frames in real motion blur, it is more of a continuous time domain, but I think you get the idea), and, more importantly, between frames. We can run MCMC/MLT on a package of frames, say 8 frames at a time, and offload to the network nodes not square tiles but initial MCMC seeds. Double gain: it is more cache friendly, and because samples get redistributed we get an even noise level across the whole movie, with no need to stare at every frame and adjust samples manually.
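The scheduling side of that idea can be sketched like this: split the shot into 8-frame packages, and make the unit of work a (package, seed) pair instead of a tile, so each node runs an independent Markov chain over the whole space-time domain of its package. Everything here (job format, chain count, the round-robin assignment) is a hypothetical sketch, not how any existing farm works:

```python
import itertools

FRAMES_PER_PACKAGE = 8  # one MCMC "package" spans 8 consecutive frames

def make_packages(first_frame, last_frame, size=FRAMES_PER_PACKAGE):
    """Split the shot into packages of consecutive frames."""
    frames = range(first_frame, last_frame + 1)
    return [list(frames[i:i + size]) for i in range(0, len(frames), size)]

def make_jobs(packages, chains_per_package):
    """One job = (frame package, chain seed). Nodes receive seeds, not
    tiles: each chain may move samples anywhere in the package's
    space-time domain (pixel, lens, motion-blur time, frame)."""
    jobs = []
    for pkg_id, pkg in enumerate(packages):
        for chain in range(chains_per_package):
            seed = pkg_id * 100003 + chain  # any deterministic unique seed
            jobs.append({"frames": pkg, "seed": seed})
    return jobs

def assign_round_robin(jobs, nodes):
    """Spread jobs across network nodes, torrent-style swarm scheduling
    reduced to its simplest form."""
    assignment = {node: [] for node in nodes}
    for job, node in zip(jobs, itertools.cycle(nodes)):
        assignment[node].append(job)
    return assignment
```

Since each job is just a seed plus a frame range, losing a node costs only that chain's contribution; the main node can reissue the same seed elsewhere, which is exactly the property that makes a torrent-like swarm practical.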
We need to redistribute samples in any high-dynamic-range scene: an actor exiting a dark cave into a bright valley, or an action fight with extreme sub-frame light bursts all over the place.
Just an idea; no code exists except early experiments with an MCMC sampler in Cycles.