Resumable renders, faster progressive refine, and more.

I’ve come up with a way to solve a few minor annoyances I’ve had using Blender & Cycles, namely:

  1. Cycles’ “progressive refine” mode takes about 40% longer than a normal render to reach the same image quality.
  2. It isn’t possible to pause a render and resume it at a later time.
  3. I can do CPU rendering, or GPU rendering, but not both at the same time.

Anyway, the solution to all three problems is the same process. It’s ugly, I’ll warn you now, but it does work. I’ll also warn you that the script I use is a total hack, so you may just end up hating me for getting your hopes up, but maybe this post will inspire someone with better coding skills to provide something better.

The whole reason I use progressive refine is because I don’t know how long I’ll want to render. I know it will be a long time – if it wasn’t, it wouldn’t bother me – but I don’t know how long, and if I simply guess how many samples I want, then use non-progressive rendering, I’m potentially wasting hours of rendering time since that render will end up going into the recycle bin. As such, it isn’t so much that non-progressive mode is bad as it is that I can’t look at the result of a non-progressive render and say “OK, that’s good, but keep going…” It either turns out perfect or it turns out to be a waste of time.

So what I’ve created is a way to save multiple renders and merge them with an external program. I just set Blender rendering an animation so that it spits out a new render of the same scene every 10 or 15 minutes, then use the external program to merge each new frame with all of the previous ones, combining all of my samples into a single image. Then I can look at it and decide whether I need to continue rendering. If I need to stop the render so that I can use my computer, I can, and later I can start it back up at the frame number where I stopped.

Using the OpenEXR format, each color channel is saved in a 32-bit float, preserving all of the information Cycles has about which pixels are way over-saturated, allowing them to be merged with under-saturated pixels in other images to average out to the correct color value. Indeed, because there is essentially no loss in the 32-bit float format, what you end up with is the same as what you would have had if you’d rendered all of the samples in one go.
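To make the merging concrete: it’s just a per-pixel weighted average, with each frame weighted by the number of samples it contains. My script does this in Perl, but here’s a minimal sketch of the same math in Python with numpy, assuming the EXR pixel data has already been decoded into float arrays:

import numpy as np

def merge_frames(frames):
    # frames: list of (pixels, sample_count) pairs, where pixels is a
    # float32 array of shape (height, width, channels) decoded from an
    # EXR file. Weighting by sample count is what lets frames rendered
    # with different sample counts combine into the correct average.
    total = sum(n for _, n in frames)
    merged = np.zeros_like(frames[0][0], dtype=np.float64)
    for pixels, n in frames:
        merged += pixels.astype(np.float64) * n
    return (merged / total).astype(np.float32), total

So a 100-sample frame and a 400-sample frame merge into a 500-sample image in which the second frame carries four times the weight, exactly as if those samples had been rendered in one go.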

So, here’s my process:

  1. In the output pane, choose the OpenEXR file format, with RGBA output, full floats, and “none” for the compression type. I’d also avoid checking the z-buffer or preview check boxes. The script I wrote only knows how to read this one specific format, so if you save in the default half-float format, or with any type of compression, the script won’t be able to read the files. Indeed, it might make sense to do a quick test render at first just to make sure everything is set up properly before you commit to more rendering time. (If you prefer to script this step, see the snippet below.)
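If you’d rather set all of that from Blender’s Python console than click through the UI, something like this should do it (a sketch against Blender’s Python API; double-check the property names in your version):

import bpy

settings = bpy.context.scene.render.image_settings
settings.file_format = 'OPEN_EXR'  # OpenEXR output
settings.color_mode = 'RGBA'       # include the alpha channel
settings.color_depth = '32'        # full 32-bit floats, not the default half floats
settings.exr_codec = 'NONE'        # no compression, which is what my script expects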

  2. In the samples pane, choose somewhere between 100 and 1000 samples. I aim for something that makes each render take about 15 minutes, since I’ll usually let renders run overnight and waking up to 32 files seems like a good number. My script is kind of slow to process files, especially incredibly high-resolution ones, so having too many files just means more waiting. Besides, every 15 minutes is already more often than I’ll care to stop, look at the render, and decide whether to continue.

  3. Using the timeline, set up key frames for the “seed” value in the “sampling” pane for Cycles: frame 1 gets a seed of 1, and frame 1000 gets a seed of 1000. This way each rendered frame is different, and averaging them together results in a better image. If you skip this step, every frame will have identical noise, and averaging them together will be pointless.
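If you’d rather script the keyframes than set them by hand, a sketch like this should work from Blender’s Python console (again, verify the property path in your version):

import bpy

scene = bpy.context.scene

# Keyframe the Cycles seed so every frame renders with different noise.
# Interpolation between the two keys gives every frame a different
# seed, which is all that matters here.
scene.cycles.seed = 1
scene.cycles.keyframe_insert(data_path="seed", frame=1)
scene.cycles.seed = 1000
scene.cycles.keyframe_insert(data_path="seed", frame=1000)

I believe newer versions of Blender also have a little clock icon next to the seed value that animates the seed for you, which accomplishes the same thing.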

  4. In the output pane, choose a file name that is in the form “word-number-” such as “/some/folder/sweet-1000-” which will then cause Blender to append the frame number after the hyphen at the end of the name. The word can be anything other than “average” and “final” which the script treats as special, but the number should be the number of samples you’ve configured to be rendered into each frame. The script will read this number out of the file name, which allows you to later decide that you want a different number of samples in each frame and the script will still be able to merge all of the images with the correct weight. Also check the “file extensions” box so that you get extensions on your files.
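For what it’s worth, pulling the sample count back out of a name like sweet-1000-0001.exr is just a pattern match. The real script does it in Perl; a hypothetical Python equivalent would look roughly like this:

import os
import re

def samples_from_filename(path):
    # "sweet-1000-0001.exr" -> 1000
    m = re.match(r'[^-]+-(\d+)-\d+\.exr$', os.path.basename(path))
    return int(m.group(1)) if m else None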

  5. Start rendering by clicking the “animation” button. If you’re continuing a previous render, it’s especially important to remember to update the start frame number on the timeline first. I always make sure to set it when I stop rendering and immediately save the file so that I don’t forget when I start rendering again.

  5.1. Optionally, start up a second copy of Blender, with one configured for GPU rendering and the other for CPU rendering. Just make sure they aren’t rendering the same frame numbers, e.g. start one at frame 1 and the other at frame 501. Also, I recommend using one fewer CPU thread than usual for the CPU instance, since the GPU instance will require some CPU and you probably don’t want to slow it down. (For me, GPU rendering is about 3x faster than CPU rendering: fast enough that I want it to have priority, but not so fast that I don’t still get a significant gain by rendering on the CPU simultaneously.)

  5.2. Optionally, start up more copies of Blender on other computers. Again, make sure each machine renders a different range of frame numbers: frames rendered with identical seed values are identical, and merging identical frames is pointless.

  6. When you want to see what your image looks like so far, get all of your output frames into a single folder, then from within that folder, run my script twice, like this:

./Merge_EXR_Frames.pl sweet-*.exr
./Merge_EXR_Frames.pl average-*.exr

The first invocation will merge all of the sweet-*.exr files into a single image named average-1000-0001.exr, with the “1000” replaced by the total number of samples in that file. The “0001” will usually stay “0001” unless the script detects that it needs to change it to avoid overwriting a file, which won’t happen often, since each merge usually ends up with a different total number of samples and so the file name is unique. The script also moves all of the sweet-*.exr files into a subdirectory named “originals”, where they will be ignored by future invocations but remain available in case you need them later; since they’ve all been merged into the average-*.exr file, they aren’t needed anymore.

The second invocation combines all of the average-*.exr files into a final-1000-0001.exr file (with the numbers serving the same purpose) and also uses ImageMagick’s “convert” command to make a PNG out of the EXR so that you can view it in some non-exotic graphics program.

The point of the intermediate average-*.exr files is speed: the script takes a moment to read each file, particularly if they’re incredibly high-resolution (which they might be if you commonly spend days rendering an image), so if it can merge 100 files into 1 file, it runs 100 times faster the next time it reads those samples. If you want, you can re-merge everything by simply deleting all of the average-*.exr files, restoring the original files from the “originals” subfolder, and running the commands again. In particular, you’ll need to do this if you decide to change the “exposure_adjustment” or “clamp” variables at the top of the script, since they’re applied as each original image is read.

Finally, it turns out there are a few other benefits of this method that are worth mentioning:

  1. You can adjust the exposure setting after the render is complete. Since the raw 32-bit floating point values are stored in the files, the adjustment gives you 100% of the quality you’d have gotten by rendering with the desired exposure to begin with, whereas brightening an 8-bits-per-channel image leaves holes in your histogram. I find this especially useful since, with very noisy renders, it’s hard to tell exactly what the final brightness is going to be without rendering thousands of samples. (There’s a sketch of this math after this list.)

  2. You can kind of choose a clamping value after the render is complete. I say “kind of” because what you’re actually clamping isn’t the value of any individual sample, but the average of 100 or 1000 samples (or however many you put in each frame). Thus, clamping to 1 (the minimum value you can use) when each frame contains 100 samples is in reality clamping to 100, since a single blown-out sample gets diluted by the other 99 before the clamp is applied. Still, it’s better than nothing, and I have seen it remove annoying bright pixels from a 100k-sample render without me having to re-render the image, so it’s definitely useful.

  3. You can render on multiple computers at once. I’ve never done this, and probably never will, but it’s possible.
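To make the first two of those points concrete, here’s roughly what post-render exposure adjustment and clamping look like on the float data, in numpy terms. My script does something equivalent in Perl as it reads each original frame; the exact knobs differ (I’m expressing exposure in stops here purely for illustration), but the idea is the same:

import numpy as np

def adjust(pixels, exposure_stops=0.0, clamp=None):
    # pixels: float32 RGBA array of shape (height, width, 4).
    # Scaling floats by 2**stops is lossless, unlike brightening an
    # 8-bit image, which leaves gaps in the histogram.
    out = pixels * np.float32(2.0 ** exposure_stops)
    if clamp is not None:
        # Clamp only the color channels, not alpha. Remember this
        # limits the average of a frame's samples, not any one sample.
        out[..., :3] = np.minimum(out[..., :3], clamp)
    return out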

Anyway, here’s a pastebin of the script:

http://pastebin.com/QhcbtzH6

Given that it uses no libraries, the script should run under Strawberry Perl on Windows without much difficulty. I think the only place Windows users will run into trouble is the ImageMagick call at the end, which converts the EXR file to a PNG, and that’s easily solved by finding any program capable of viewing EXR files, which makes the failure to generate the PNG irrelevant. That said, I certainly haven’t tried to run it on Windows, so who knows what will happen.

On Linux, ImageMagick interestingly doesn’t come with EXR support by default, but on Debian-based systems it can be installed with apt-get:

apt-get install imagemagick libmagickcore5-extra

Anyway, enjoy the script, and by all means feel free to share any improvements you make with everyone.
