🚀 Turbo Tools - Faster Cycles Renders & Compositing!

Actually, you don't need to enable any passes manually. When you render, the required passes are automatically enabled before rendering and then automatically reverted to your settings afterwards.
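For anyone curious, the general pattern is simply to snapshot the pass settings, switch on whatever is needed, render, then restore. Here's a minimal Python sketch of that idea (not the addon's actual code, and the pass list is purely illustrative):

import bpy

# Illustrative list of pass properties to force on for the render.
REQUIRED_PASSES = [
    "use_pass_diffuse_direct", "use_pass_diffuse_color",
    "use_pass_glossy_direct", "use_pass_glossy_color",
]

view_layer = bpy.context.view_layer

# Snapshot the user's current pass settings.
saved = {name: getattr(view_layer, name) for name in REQUIRED_PASSES}

try:
    # Temporarily enable the passes needed for denoising/compositing.
    for name in REQUIRED_PASSES:
        setattr(view_layer, name, True)
    bpy.ops.render.render(write_still=True)
finally:
    # Revert every pass to whatever the user had before.
    for name, value in saved.items():
        setattr(view_layer, name, value)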

I'm unable to replicate a situation where the automatic enabling of the necessary passes doesn't work, so I suspect your fix may actually have been something else. If you do notice this behaviour again, please send the scene to the support email and I'll take a look :+1:

1 Like

Thanks Michael, I will work through some more test cases and see if I can get a file for you that best demonstrates my original problem.

1 Like

Great news: 3D World magazine has asked me to write a Turbo Render tutorial for the magazine :smiley:

4 Likes

I just finished a project where I used various denoising techniques, including motion-channel-based ones. I have to say that it sometimes works perfectly, but it can also fail. I tried the OptiX denoiser, but it was far worse than the classic compositing tools.
If someone could only bring the fantastic 3D outputs and the compositing tools together, it would be optimal. But honestly, Blender needs to get a lot better at pre-denoising in order to do animations.

1 Like

Yes, there are various motion-channel-based techniques, but unfortunately they all have limitations. I'm still working on adding a temporal stabilizer, so once that's done you shouldn't need to employ your own techniques. It will still have inescapable limitations of course, but with foresight most of these can be bypassed (for example, putting moving parts on their own view layers so you can give only small parts of the scene more samples and drop the static elements' samples dramatically).
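As a rough example of that last point, Cycles lets you override the sample count per view layer, so only the layer with the moving objects gets the expensive samples. The layer names below are just placeholders:

import bpy

scene = bpy.context.scene
scene.cycles.samples = 512  # scene-level default

# Per-view-layer overrides (0 means "use the scene setting").
# "Moving" and "Static" are hypothetical layer names for this example.
scene.view_layers["Moving"].samples = 512  # animated objects keep full samples
scene.view_layers["Static"].samples = 64   # static set can be dropped dramatically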

2 Likes

The interesting part would be having that in compositing, so either before saving out the EXR channels or afterwards (which is more logical).
I'd even be interested in seeing that in other software like Fusion or Nuke.

Yes, that's the bit that's holding me up. I can temporally denoise the render layer cache for all passes, which is extremely slow but means that if the ultra denoising mode is selected, each pass can be temporally stabilized before going out to other software. Alternatively, I could just do it for the end result (whatever goes into the composite node), which would mean it could be used with any render engine's results (Redshift, Octane, FStorm, etc.), but would also mean it would need to be re-calculated every time the tree is re-published.

I might just go with a standalone operator which initially works on any image sequence or movie file; otherwise I'm going to end up going down the rabbit hole of also having to stabilize the standard caches to ensure they don't add any flicker when publishing in fast mode. This way you'd render, do the compositing either in Blender or some other software, and then load in the movie file or image sequence to temporally stabilize it afterwards.
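To give an idea of what I mean by temporal stabilization on an image sequence, here's a toy illustration using a per-pixel temporal median over neighbouring frames. This is nothing like the actual operator, just the concept, and the folder names and imageio dependency are assumptions:

import glob
import numpy as np
import imageio.v2 as imageio  # assumes imageio is installed

# Hypothetical input folder for this example.
frames = sorted(glob.glob("renders/frame_*.png"))
images = np.stack([imageio.imread(f).astype(np.float32) for f in frames])

RADIUS = 1  # blend each frame with one neighbour either side

for i, path in enumerate(frames):
    lo, hi = max(0, i - RADIUS), min(len(frames), i + RADIUS + 1)
    # A per-pixel median over the small temporal window suppresses
    # frame-to-frame flicker at the cost of some temporal detail.
    stabilized = np.median(images[lo:hi], axis=0)
    # Output folder ("stabilized") is assumed to exist already.
    imageio.imwrite(path.replace("renders", "stabilized"),
                    stabilized.astype(np.uint8))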

Well, the workflow I've been practising for years:

  1. Denoise the Diffuse total, then multiply it with the Diffuse color.
  2. Denoise the Transparent total output.
  3. Denoise the Reflection/Glossy total output.
    Then comp these together.

Indeed it takes time, but you really need that control, because the noise is always different for each of the three types: GI is more of a flat noise, Transparent is a complex noise, and Reflection often has fireflies from speculars.
So I need to take care of each one individually.
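For reference, the diffuse branch of that setup looks roughly like this if you build it with Blender's compositor nodes from Python. It's only a bare-bones sketch (one branch, using the standard Denoise and Mix nodes, and assuming the usual Cycles light passes are enabled):

import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
nodes, links = tree.nodes, tree.links
nodes.clear()

rl = nodes.new("CompositorNodeRLayers")  # render passes input

# Denoise the diffuse light (direct shown here; direct + indirect would
# normally be added together first to get the "diffuse total").
denoise_diff = nodes.new("CompositorNodeDenoise")
links.new(rl.outputs["DiffDir"], denoise_diff.inputs["Image"])

# Multiply the denoised light back with the (already noise-free) albedo.
mult = nodes.new("CompositorNodeMixRGB")
mult.blend_type = "MULTIPLY"
links.new(denoise_diff.outputs["Image"], mult.inputs[1])
links.new(rl.outputs["DiffCol"], mult.inputs[2])

comp = nodes.new("CompositorNodeComposite")
links.new(mult.outputs["Image"], comp.inputs["Image"])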

I have a workstation with a fast RTX card where my comp work is rendered, but that can take a few minutes per second of footage, depending on the complexity of the node tree.

1 Like

Yes, Turbo Render already does that, dependent on the denoising mode chosen, as well as a load of other stuff. It's whether to temporally stabilize the passes individually after denoising, or to just offer it as a process that works on the single-pass end result (image sequence/movie file). It will probably have to be the latter initially, but I'm still trying out solutions to make the former fast.

Hi, hope you're all well. Just wanted to let you know that version 2.1.3 of Turbo Tools is now available for download from your library (link on your receipt).

This update has several performance improvements under the hood, and also fixes the recently introduced bug where cache files were kept for every rendered frame even when the Turbo Comp 'animation' option was disabled. Now, if you disable the 'animation' option, only the most recently rendered frame's cache files will be kept on disk. This option is useful for ensuring you don't use disk space unnecessarily when rendering still images, and also lets you save disk space when rendering animations if you don't need the full animation's cache files in Blender's compositor or a 3rd-party one.

This reduced hard drive space requirement also means you can use a smaller external storage device to cut down on SSD reads/writes if you don't have a mechanical drive available.

Cheers
Michael

Just testing the addon with Blender 3.3. All seems to be fine, but if anyone finds an incompatibility please drop me an email to the support address :+1:

Decided I needed to do something a bit cooler than a kitchen for the 3D World magazine Turbo Render tutorial. My first robot! The full scene will be available with the magazine :+1:



4 Likes

Hi, hope you're all well. Turbo Tools version 2.1.4 is now ready for download with the following changes:

  • Volume detail improvements.
  • Further improvements to diffuse quality in the high and ultra denoise modes.
  • Support for Blender 3.3 Alpha.

Cheers
Michael

2 Likes

Extremely excited. Temporal stabilization is almost complete, and the results are way better than I expected. It can also temporally stabilize individual passes, which means it'll even work with complex compositor set-ups.

Here's a test with @RobertLe's incredibly difficult-to-render train scene. Originally the scene needed 1024 samples to get decent results with Neat Video. Below is 64 samples with Turbo Render and the Temporal Stabilizer (coming soon) enabled.

Very fast to process, even on my old i7-7700K CPU. Turbo Tools is available from: https://3dillusions.gumroad.com/l/turbo_tools

Noisy render 64 samples (too big to embed full res here):

Turbo Render 64 samples no temporal (lots of flicker at such low samples):

Turbo Render 64 samples with temporal stabilization (less than 1 second per frame):

Turbo Render 64 samples individual passes temporally stabilized before being processed by the upstream comp nodes. (5 seconds per frame):


8 Likes

Moved thread from #support:lighting-and-rendering to #coding:released-scripts-and-themes, as the thread's purpose is for distribution, not help.

Cool, thanks :+1:

Hi Michael, I'm curious about your tool, since you claim it works with features already in Blender. I guess the core of it is denoising single passes, which of course is great. But how does it work with reflections?
I mean, I can get great results with that technique, but as soon as I look at a diffuse surface reflected in a mirror (which has no glossy albedo detail to be multiplied in), all the bells and whistles are gone.

Hi, it works great with reflections:

I would be really curious about a setup or tutorial on how to use that if you do the rendering on a farm and just the post-processing with Turbo Render.

I mean, you click one button and all the settings are set, then you click another and the comp is prepared, etc.

On a farm you can use command-line expressions, with the render called by the Turbo Tools operator rather than the standard render operator.

Animation:

blender -b "E:\blender\benchmark scenes\classroom\Classroom.blend" --python-expr "import bpy; bpy.ops.threedi.render_animation()"

Still image:

blender -b "E:\blender\benchmark scenes\classroom\Classroom.blend" --python-expr "import bpy; bpy.ops.threedi.render_still()"

If you want to set other things such as output directory, frame range, etc, then refer to:

https://docs.blender.org/manual/en/latest/advanced/command_line/render.html#single-image
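For example, assuming the Turbo Tools operators pick up the scene settings the same way a normal render does, you could set the output path and frame range with the standard flags before the python expression (the output path here is just an example, and Blender processes arguments in order, so the flags need to come before the expression):

blender -b "E:\blender\benchmark scenes\classroom\Classroom.blend" -o "E:\renders\classroom\frame_####" -s 1 -e 250 --python-expr "import bpy; bpy.ops.threedi.render_animation()"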

The temporal phase is part of the publishing operation, so after rendering, the farm would need to call the publishing operation in the same way as above. The result will then be saved out to whatever you have set in Blender's output options (jpg, mpg, etc.).

The farm would just need to set the cache directory prior to rendering.