Why is something like a Quadro better than, say, an RTX 4090?

You can, allowing for power requirements, room in the case, and so on.
Keep in mind that you are then limited to the VRAM of the 3080, and it’s half the speed of the 4090 or less, so it may not be worth it.
You could get fancy: use the 3080 for the main display and general system use, and keep the 4090 as a pure render device. That way, one instance of Blender could render away on the 4090 while you work at pretty much full speed in another instance of Blender on a different scene.
Of course, if you then want to game, you need to replug or switch displays, etc. Frankly, it’s likely not worth it.

Other renderers may have different performance characteristics if they are designed for this (give or take the same limitations). I have a feeling that Cycles is mainly made for one card or the other, and any attempt to cross or mix ends up not being worth it, while sticking to a single pool of VRAM presumably performs better.
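
If you do try that split setup, Cycles exposes its device toggles through Python, so you can point one Blender instance at the 4090 only. Here is a minimal sketch, assuming NVIDIA cards; the "4090" name filter is a hypothetical example, so print the device names first and match whatever your cards actually report:

```python
import bpy

# Enable only the render card for Cycles, leaving the other GPU
# free to drive the display.
cprefs = bpy.context.preferences.addons["cycles"].preferences
cprefs.compute_device_type = "OPTIX"  # "CUDA" also works on NVIDIA cards
cprefs.get_devices()                  # refresh/populate the device list

for dev in cprefs.devices:
    # Hypothetical name match -- adjust to your own hardware.
    dev.use = dev.type != "CPU" and "4090" in dev.name
    print(dev.name, "->", "enabled" if dev.use else "disabled")

bpy.context.scene.cycles.device = "GPU"
```

Run that in the instance doing the rendering, and the inverse filter (or GPU compute off entirely) in the instance you are working in.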

It’s just another render engine that is integrated into Blender. It’s not made by Nvidia; it’s currently developed by Otoy. https://home.otoy.com/render/octane-render/

I don’t use Cycles, so I’m not familiar with it in the same way I know Octane (I have used it for 14 years now). They are both path tracers and are broadly similar.

Yeah, and I was thinking that your hardware setup would probably matter too. Running your GPU over an x4 PCIe link vs. an x16 link would make a difference, and some people are using x1 connections, which is definitely going to slow things down.

I have two x16 slots available and a 44-lane CPU, so my assumption is I’d get full speed, especially since neither card uses all 16 lanes, but they’d have them available.
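
If you want to verify what link width the cards actually negotiated, nvidia-smi can report it directly; here’s a small Python wrapper as a sketch (note that GPUs often report a narrower link at idle due to power management, so check while rendering):

```python
import subprocess

# Query each GPU's current and maximum PCIe link width via nvidia-smi.
result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,pcie.link.width.current,pcie.link.width.max",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # e.g. "NVIDIA GeForce RTX 4090, 16, 16" (illustrative)
```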

He makes a good point on raw speed, 3080 vs 4090. I guess I’ll just stick with my 4090 for now and use SheepIt for the big renders.

It really depends on your expectations and application. If you want to create precise, super-high-density CAD meshes, then Quadros are probably the way to go. If you want to create game art or render single images in the style of the Blender Artists top row, as well as game on your machine, then your money is probably better spent on a good RTX and a lot of RAM.

Or: if you have to ask, then an RTX in the same price class as the Quadro equivalent (if there even is one) will probably serve you well enough, if not better. In most cases you really need to need a Quadro in order to justify it. But that’s just a rule of thumb and should be considered on an actual case-by-case basis, of course.

Sounds like I need to start a GoFundMe, haha!

If you could already drop the money for a 4090 you’re probably doing fine…

If you are curious about performance differences between GPUs, there’s always the Blender benchmark Open Data database: https://opendata.blender.org/

Don’t worry, you already have the best GPU available.

For a game, the only concern is “speed.” You must be able to render “X frames per second,” no matter what, and the player, while desperately fighting off hordes of attacking aliens (or whatever…) will never notice any “errors.” Really, the only thing that matters is sustaining the “frame rate.”

Per contra, suppose you are creating an animated movie that will be watched on enormous screens. Legions of future internet bloggers will pore over your slightest “error,” watching it frame by frame.

But also, “first, you’ve got to get the damned thing finished!” 🙂

In a multi-million-dollar project, a $40,000 “part” is just a capital expense. “If(!) it works.” If you make deadline.

Well said, thanks!

I would not begin to claim any kind of technical expertise on these things, but I have worked with them for years and know what I have been told and seen.

It was always my understanding that it is about longer-term stability. Studio machines will commonly be used during the day by creatives and then logged in at night to serve as a render farm. They have to be able to handle a lot without overheating or blowing a fuse. During intense production phases they are never off.

I have been around gaming-spec machines, bought for production to save money, that failed catastrophically in this regard. I have also seen so many older purpose-built studio workstations just run and run and run until they are gently retired simply for becoming outdated.

Gaming machines are all about maximum bang for the buck and shorter-term usage. It’s speed and power over longer-term stability and longevity, so overclocking is common. If they fail, it’s no big deal, since no time, money, or critical work is on the line.

The priorities are just different. It’s a bit like using the very latest cutting-edge Blender build as opposed to the last stable LTS release. If you are working in production or on anything serious, you will mainly want to be on the last official stable LTS release, with as few glitches, nasty surprises, or catastrophic failures as possible.

Anyway, this is what my understanding of it was. I suspect it is more complex than this, though.

🤣 My analogy is usually “Fishing in the Everglades with an aircraft carrier.”

Well said. It’s all about “the viewer”. Enormous screen vs. big TV across the room vs. phone in your hand. And… Pixar vs. Simpsons vs. Peanuts.

When The Hobbit was being produced, Peter Jackson made a bet on having the movie play at 48 FPS with less film grain for more immersion. What happened instead was people arguing that the VFX and prop technology of the time was not ready to handle such a change, because flaws and things that look ‘off’ (which would otherwise be obscured) were suddenly visible.

When resolutions go up and noise levels go down, the standards that CG must meet to be convincing rise dramatically, and as a result the hardware requirements begin to blow up as well (to the point where companies are trying to use AI to work around resolution standards advancing faster than hardware).

Very interesting…nice example!

When I was a kid, we had a black-and-white television. Whereas my (maternal) grandfather had … ==gasp!== color!

(“Oooh! Ahhh!” Uh huh, and what “colors” they were!) 🙂 Probably three.

Even today, when I watch “HD” television – at a bar somewhere, since there is actually no television (!) in my house – it really does amaze me the level of detail that we can now routinely see on-screen. “Theatrical movies” are even more detailed: “size matters” when the image is about 65 feet wide.

“So, yeah.” $40,000 for an industrial-grade device that I can utterly depend on to grind away through endless hours of rendering – faster than its competition – without melting into a heap of slag? “I can meet deadline in three months instead of six?” Bring it!

Once again, Quadros aren’t fast. If you spent $40,000 on a GeForce-based render farm, you would render 10x faster.

I really want to dispel any magical thinking people have about these cards.

expensive != good.

Of course it goes without saying that rendering is done with “farms.” The more, the merrier.

I think the point being made wasn’t render farm vs. single card, but 4090 vs. Quadro.

I haven’t looked at the power specs and heat output of a Quadro, so I sort of don’t know what I’m talking about here. But with regard to gaming rigs and stability, it’s worth noting that overclocking the video card or CPU and causing problems is absolutely something that is voluntarily done by the user.

So a more accurate stability comparison would be an overclocked system vs. one that is not, with the same hardware.

I do get the feeling that the Quadro is held to a higher manufacturing standard, though, and you pay for the intended pro-level reliability.

Yeah, the original question was really more about why you’d use something like a Quadro (could even be an RTX 6000 or any other pro render card) vs. something like a 4090.

I have an RTX 4090; which driver is suitable for Blender, the Studio driver or the Game Ready driver?

I always use Studio Drivers, as the stated target user is people who prefer “solid” to “faster, but take your chances.”

I don’t give two flips about a driver update that addresses a problem with shading in Fortnite or shadow glitches in Warcraft, and most of the Game Ready driver updates look like that kind of situation.
