Which renderer do you use

I still use Cycles. Which renderer do you use?

Cycles works well for me. With my machine I can render clean (denoised) 1080p frames in about 3 minutes.

2 Likes

I wish I could render faster with Cycles.

1 Like

There are always the E-Cycles and K-Cycles forks; I haven’t tried them, but they have had some success.

Cycles, but I will probably use EEVEE more now.

Most of these renderer comparisons just do not help me at all… almost every user calls one or the other render engine the best ever or s…t – but… they totally forget to add the context in which they are using it.

And as I always say (and in general):

It’s up to you to choose the better tool or product for your needs, knowledge, and budget.

And I will never listen to anyone who does not explain in detail why something is better in a given use case, and for what reason the others are not, if I cannot see that this user actually knows both tools.

For example, in the video: Blender can’t produce different output paths… or it’s cumbersome to do so…???
Of course, you should not set this up every time you start a new project but simply save it for later use!! (That’s also what App Templates are for… and yes, you may have to fine-tune this… once…)

And yes, a several-hundred-dollars-per-month dedicated renderer does some things better than a multitool like Blender. :yawning_face:

Also:
The video does not answer the question… it does not even list the alternative renderers in text (you have to recognise the icons, understand the speaker, or turn on subtitles), nor does it say much about their pros and cons; it is more a long list of (partly affiliate) links. That may be nice, but it is a different topic.

4 Likes

I use Cycles and Eevee for my personal projects and Twinmotion professionally.

The video is missing one option that makes a difference for me, and that would be support for spectral path tracing. This is why I still use Octane and am waiting for spectral Cycles.

2 Likes

I didn’t know Twinmotion, but I found this for faster rendering: https://youtu.be/QBlIUtU7t1c?si=jaHK97xZ1ujoOihC

I actually have a workflow these days where I pre-bake my textures in Cycles as a “Combined” diffuse map, then render everything much faster in Eevee. Until Eevee NEXT becomes more usable and stable and makes optimization techniques like this unnecessary, I find this is a great way to push past Eevee’s current limitations (especially when it comes to indirect lighting and contact shadows) while using Eevee’s much faster rendering. Obviously I wouldn’t use this workflow for more photoreal rendering; I can only get away with it because the art style of my animations is becoming increasingly simple and cartoony, and I realized I rarely, if ever, need things like accurate reflections to get a good, simple cartoon render.

I’ve tried Godot Engine, which of course is TRUE real-time rendering compared to Eevee, and in some ways it gets a result closer to Cycles in real-time due to its superior built-in, real-time Global Illumination options, but its lack of built-in Motion Blur (which Eevee actually has) is a deal-breaker for now. Unreal Engine pretty much doesn’t run properly on my laptop, if at all, and I don’t want to buy a bajillion expensive plugins for Unity just to make it easier to render animated videos from that engine.

As for just doing everything in Cycles: while it is undeniably faster now, I still think waiting 3-5 minutes for each individual frame of animation to render and denoise is unbearably slow compared to 3-10 seconds per frame in Eevee. If I use it again, it will probably just be for rendering still images. I also really enjoy the challenge of strategically learning how to use post-processing effects like Ambient Occlusion and Screen Space Reflections to make my Eevee renders look almost indistinguishable from Cycles renders, though of course I’m looking forward to the day when Eevee NEXT becomes more stable and I no longer need to spend significant time tweaking those effects just to get fairly convincing lighting by today’s standards.
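
To put the per-frame numbers above in perspective, here is a back-of-the-envelope comparison of total render times for an animation. The per-frame figures come from this post (3-5 min/frame in Cycles vs. 3-10 s/frame in Eevee); the frame rate and clip length are hypothetical, purely for illustration.

```python
# Rough total-render-time comparison using the per-frame figures from
# the post above. FPS and clip length are assumed, not from the post.

FPS = 24            # assumed frame rate
SECONDS = 60        # assumed one-minute animation
frames = FPS * SECONDS

cycles_per_frame = 4 * 60   # midpoint of 3-5 minutes, in seconds
eevee_per_frame = 6         # midpoint of 3-10 seconds

cycles_total_h = frames * cycles_per_frame / 3600
eevee_total_h = frames * eevee_per_frame / 3600

print(f"{frames} frames")
print(f"Cycles: ~{cycles_total_h:.0f} h total")
print(f"Eevee:  ~{eevee_total_h:.1f} h total")
```

With these assumed numbers, a one-minute clip is roughly 96 hours in Cycles versus about 2.4 hours in Eevee, which is why a 40x-80x per-frame gap dominates every other consideration for animation work.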

3 Likes

Goo Engine.

1 Like

Yep – I’m almost exclusively using Cycles (outside of viewport previews), but I rarely do animation.

I typically use Cycles, but subscribe to and use CrowdRender (https://www.crowd-render.com/) to make use of the three computers I have, particularly for multiple-frame sequences. It works a charm, and James, one of the authors on the team, has responded to questions I have had during the development of this fine tool.

If you already have made combined bakes like this for the main objects of a scene (ideally as 32-bit images containing the full lighting), there is a way to use that bake to massively accelerate some Cycles renders with almost no visual difference from what it would have been. It helps especially in enclosed interior scenes.

You just have to go into the material and set up the bake so it’s only visible to diffuse rays. Then you set the material’s “emission sampling” to “None”, as we don’t want the bake to actually behave like a light source.

The bake won’t be visible to the camera but will instead affect the indirect lighting, allowing it to be much cleaner. An advantage of this is that there can be a little bit of change in the scene (like a character moving around) and the shadows will still work correctly. Another advantage of not seeing the bake directly is that you can use vertex colors for the bake if the mesh is dense enough to support it, and you don’t even need that much quality in the bake for it to work.
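
For reference, here is a minimal sketch of the node wiring described above, written with Blender’s Python API. It is an assumption-laden sketch, not the poster’s actual setup: the function name is mine, the same wiring can of course be done by hand in the shader editor, and the `emission_sampling` property assumes a recent Blender version (3.5+). It only does real work inside Blender, so the import is guarded.

```python
# Sketch of the setup described above: baked lighting is exposed only
# to diffuse (indirect) rays via a Light Path -> Mix Shader rig, and
# emission sampling is disabled so Cycles never treats the bake as a
# light source. Guarded so the module imports outside Blender too.
try:
    import bpy
except ImportError:
    bpy = None  # not running inside Blender


def make_bake_visible_to_diffuse_only(mat, baked_image):
    """Wire an existing material so only diffuse rays see the bake."""
    if bpy is None:
        return None
    mat.use_nodes = True
    nt = mat.node_tree
    out = next(n for n in nt.nodes if n.type == 'OUTPUT_MATERIAL')
    original = out.inputs['Surface'].links[0].from_node  # existing BSDF

    tex = nt.nodes.new('ShaderNodeTexImage')
    tex.image = baked_image                   # the combined lighting bake
    emit = nt.nodes.new('ShaderNodeEmission')
    path = nt.nodes.new('ShaderNodeLightPath')
    mix = nt.nodes.new('ShaderNodeMixShader')

    nt.links.new(tex.outputs['Color'], emit.inputs['Color'])
    nt.links.new(path.outputs['Is Diffuse Ray'], mix.inputs['Fac'])
    nt.links.new(original.outputs[0], mix.inputs[1])       # camera/glossy rays
    nt.links.new(emit.outputs['Emission'], mix.inputs[2])  # diffuse rays
    nt.links.new(mix.outputs['Shader'], out.inputs['Surface'])

    # Equivalent of setting "Emission Sampling" to None in the UI.
    mat.cycles.emission_sampling = 'NONE'
    return mix
```

The key piece is the Light Path node’s “Is Diffuse Ray” output driving the Mix Shader factor: camera and glossy rays get the original BSDF, while diffuse rays get the baked emission.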

Here are the results. The second image uses the bake trick. It’s not only cleaner but also took less time to render. It won’t match Eevee’s speed, but it looks almost identical to a regular Cycles render and can make almost any scene doable in Cycles as long as the lighting can be baked.


3 Likes

Eevee Next is missing the bloom option.
It also depends on what materials and lighting you use for your cartoony animation; otherwise the render will slow down in Eevee as well.

That video is missing quite a lot of different use cases. It felt clickbaity, made for engagement farming and sprinkled with product placement for whoever pays the most to be mentioned by that channel.

For a long while now, if I use Cycles, it is mainly for baking textures.
As of now, it is in this order:
Godot, which I make my games in.
Eevee/Eevee next
Cycles

I won’t use external renderers anymore. For my purposes, external renderers fall into the category of legacy software, but without the charm legacy software can bring.
If it isn’t inside Godot or Blender and tightly knitted into the system, it is not worth my time.

Dealing with license servers, import/export, quirky implementations, bugs, lack of support, plugin handling, the limited ability to cost-efficiently and quickly expand and render on more PCs, having to regularly send financial statements to some of the software developers, changes to EULAs and ToS, etc. is not worth the hassle.

Many years ago I tried to mainly use Keyshot and Octane, purely because of marketing.
Lures like the one in this video (“it is an InDuStRy StAnDarD”) were sadly convincing to my past self, thanks to the effective use of fear, uncertainty, and doubt,
totally negating that the industry is in constant turmoil and constant change.
What is today’s InDuStRy StAnDarD can quickly become tomorrow’s legacyware, as well as a time capsule.

8 Likes

Argh… now I get PTSD from the days when radiosity was effectively the only GI option Blender(’s internal renderer) supported natively.
Are we really willing to go that far backwards for a bit of render-time savings? :sweat_smile:

greetings, Kologe

1 Like

Don’t get me wrong: if Cycles had a better way to do irradiance caching (like LuxCore’s photon cache), I would definitely use it over this wonky trick.

But I have actually been able to complete renders in Cycles that my computer would have really struggled through otherwise. In some interior scenes, you can slash the render time by half or get clean renders in cases where you would otherwise need path guiding to finish the render at all.

So if I can trick Cycles into doing radiosity, I personally will, though I am very familiar with Blender’s baking system, so I will admit it’s not for everyone.

2 Likes

Just curious: Cycles is a path tracer, so it does diffuse reflections (GI) out of the box. You don’t need to trick it into doing radiosity, as diffuse reflections are much better and far more accurate. Radiosity is also much more difficult and time-consuming to set up, so I’m not sure you would gain much time going that route? Also, the only way you could add some kind of radiosity to Cycles would be an OSL script (maybe?) or modifying the code.

Well, what I’m doing might not exactly be radiosity; it’s more like some renderers’ secondary solver or irradiance cache. For example, this is something LuxCore can do by default with its photon cache system.

By pre-baking all the lighting and making that bake visible only for indirect bounces, you are:

  • Combining all the indirect light bounces into a single one, greatly reducing the number of rays that need to be traced. As soon as a diffuse ray hits a baked object, it is ended and resolved immediately, while a regular ray would need to keep bouncing around and testing its visibility against light sources.

  • Pre-processing millions of light paths through the scene. As soon as a light ray hits a baked object, it hits a baked pixel that already has hundreds of light paths calculated, so you have basically made that one diffuse ray count as many at no extra cost.

  • Guaranteeing that every light ray succeeds and contributes to the image. Usually, when rendering in Cycles, some light rays get stuck in corners and fail to find a light source at all. You can see this easily in a 1-sample render: some pixels will be black, as those rays failed to light the scene. By pre-baking the light, every area of the scene gets usable light values, especially if the bake is denoised. This means any diffuse ray that hits the baked data is certain to add at least some light to the image and won’t be a failed sample. You can see this easily in the two images I posted earlier: the one without the trick has many black pixels, while the one with the trick has continuous lighting everywhere.
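
The noise-reduction argument in the points above can be illustrated with a toy experiment. This is not a renderer, just a sketch of the statistics: a “bake” pre-averages many noisy light-path samples per cache entry, so a ray that reads the cache carries far less variance than a freshly traced path. All distributions and sample counts are made up for illustration.

```python
# Toy illustration (not a renderer): a pre-baked cache averages many
# light paths per texel, so each diffuse ray that reads the bake is
# far less noisy than one freshly traced path.
import random
import statistics

random.seed(42)

def traced_sample():
    """One 'live' light path: a noisy estimate of the true radiance 1.0."""
    return random.expovariate(1.0)  # mean 1.0, high variance

# "Bake": pre-average 256 traced samples into each cache entry,
# cutting the per-lookup standard deviation by a factor of ~16.
BAKE_SAMPLES = 256
cache = [statistics.fmean(traced_sample() for _ in range(BAKE_SAMPLES))
         for _ in range(128)]

def baked_sample():
    """One ray hitting the bake: reads an already-averaged value."""
    return random.choice(cache)

N = 2000
traced = [traced_sample() for _ in range(N)]
baked = [baked_sample() for _ in range(N)]

print("per-sample stddev, traced:", statistics.stdev(traced))
print("per-sample stddev, baked: ", statistics.stdev(baked))
```

Both estimators converge to the same mean, but the baked lookups have a far smaller per-sample spread, which is exactly why the baked render cleans up with fewer samples and shows no isolated black pixels.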

You would be surprised how little difference there is if you do the trick properly (bake the lighting in full Combined mode so you capture all the light, and use 32-bit images or vertex colors). The main difference once the render is complete is that it looks like it was rendered with more light bounces and is a little bit brighter.

2 Likes