Best Graphics card for Viewport?

Hi Guys,

Have been out of the Blender Community for a while, lurking around and seeing what is new…

I’ve also had the fortune/misfortune to be an Nvidia 560 Ti owner, and I find the viewport performance abysmal: it starts to slow down after about 400K faces, rather than the millions I would expect.

I was wondering if there was a benchmark or any information about what the best graphics card series was for Blender Viewport performance alone. Ideally it may also have CUDA, but this is not essential.

Blender certainly has made a lot of progress in the last 12 years! A shame the graphics cards went backwards a bit there.

Cheers,

Well, the best GPUs are part of the Quadro (Nvidia) and FirePro (AMD) line-ups. I have owned both, and still own a FirePro… but you know what, for my CG work I don’t notice much of a difference from the GPU in my primary station, which coincidentally uses an Nvidia 560 Ti.

Unless you are doing real-time graphics rendering or something CUDA-based, the GPU won’t really affect how many polygons you can handle. Why? Because the viewport’s mesh handling is CPU-bound, not GPU-bound.

This means that if you want the best performance, you need a good CPU with multi-threading; Intel is currently in the lead there. I went the cheap route and use an AMD 6-core CPU, and I have no problem displaying polygons in most applications.

For Blender itself, the viewport needs optimization, so it’s not necessarily going to run well compared to something like ZBrush or Maya when it comes to polycount.
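
If you want a rough idea of where your own machine starts to struggle, a quick sketch like this (the face count is arbitrary, and I’m assuming the current 2.6x-era Python API) drops a dense test grid into the scene so you can orbit the view and watch where it starts to chug:

```python
import bpy

# Viewport stress test: add a dense grid (~700 * 700 subdivisions,
# roughly 490K faces, about where a 560 Ti reportedly slows down).
# Orbit the viewport afterwards and watch the responsiveness.
bpy.ops.mesh.primitive_grid_add(x_subdivisions=700, y_subdivisions=700)
```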

What CPU do you use?

Moved from “General Forum > Blender and CG Discussions” to “Support > Technical Support”

I’ve heard conflicting information that the Nvidia GeForce 280 series would still beat a 560 Ti in viewport modes, so I am quite interested to know whether this is the case.

I’ve done some research on Quadros, but the ones available in my country are either $3000 or a number of generations old, so equivalent in chip generation to the 280s of about four years ago. I’m just not convinced I need a Quadro, but it is a consideration.

I’m just not convinced I need a Quadro, but it is a consideration.
It will be a waste of money for use with Blender.

I’ve also had the fortune/misfortune to be an Nvidia 560 Ti owner, and I find the viewport performance abysmal: it starts to slow down after about 400K faces, rather than the millions I would expect.
What is the spec of the rest of your system? You don’t just rely on the graphics card.

I still don’t understand why the Blender developers don’t provide an option to make all objects single-sided by default for Nvidia users. This is a well-known problem and quite a few users are affected by it. I know it’s not a problem in Blender but in Nvidia’s drivers, but that doesn’t mean we have to suffer for it, especially when we consider that Blender works better with Nvidia cards for Cycles yet we can’t make objects single-sided by default. I heard a while ago that they didn’t want to provide this feature because it’s a kind of hack to get performance working properly, but what the hell… Blender is full of hacks and provisional options and tools, so we should definitely have this one. Now that everyone is complaining about Blender’s performance compared with other software, they should let us use this method by default if we need it.

P.S.: I know there’s a script to do this, but you have to run it manually and for each new object you add; what I’m talking about is a default option for all newly created objects.
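
For what it’s worth, here’s the kind of thing that script probably does, extended with a handler so new objects get caught automatically. This is just a sketch against the 2.6x/2.7x Python API (the Mesh.show_double_sided flag), not an official feature:

```python
import bpy
from bpy.app.handlers import persistent

# Turn off the "Double Sided" draw flag on every mesh, including meshes
# added after this runs, by hooking the scene update handler.
@persistent
def make_single_sided(scene):
    for mesh in bpy.data.meshes:
        # Only write when needed, so we don't keep re-triggering updates.
        if mesh.show_double_sided:
            mesh.show_double_sided = False

bpy.app.handlers.scene_update_post.append(make_single_sided)
```

Registering it as a handler is what avoids running it by hand for every new object, but it’s obviously still a hack rather than the default option being asked for.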

If anything, there should be an option in the preferences that disables double-sided OpenGL drawing, and I do believe that is indeed warranted, since a lot of users suffer from that problem.

The problem might also go away on its own if the viewport code is rewritten to no longer use GL_TWO_SIDED, which might actually happen as a result of the Viewport FX effort.

However, the object setting is not just an OpenGL setting. It’s supposed to mean that the mesh is double-sided, which might affect other renderers, exporters and so on. Changing the default behaviour here just because some subset of GPUs chokes unless that setting is disabled would be the tail wagging the dog. Also, with old scenes you’d still need to disable it manually anyway.
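
To illustrate why it’s more than a draw toggle: an exporter is free to read that same flag and bake it into its output, so flipping the default silently changes what gets written out. A hypothetical sketch (the exporter and its output format are made up; only Mesh.show_double_sided is real API):

```python
import bpy

# Hypothetical mini-exporter: writes one line per mesh, including the
# double-sided flag, to show that the setting leaks beyond OpenGL drawing.
def export_sidedness(filepath):
    with open(filepath, "w") as f:
        for mesh in bpy.data.meshes:
            f.write("%s double_sided=%s\n" % (mesh.name, mesh.show_double_sided))

export_sidedness("/tmp/sidedness.txt")
```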

Ah cool, thanks everyone. I might just grit my teeth and bear it with the current card and wait until the code gets updated.

Anything that I need to spend too much time working around is potentially not worth it in my workflow, so I tend not to download many additional plug-ins, and I don’t often change default settings.