Best GFX card for Rendering?

Hi guys

Might splash out soon on a new gfx card. Is there a specific type of card best suited to 3D rendering, or will an 8800 be just as good?

Thanks for any help!

Since Blender's renderer is software based, it uses the CPU and not the GPU, unlike the 3D viewport, which uses hardware rendering for its display.

Meaning that no matter what GFX card you get, the only improvement you will see will be in the 3D viewport and possibly the 2D interface.
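If you want to see this for yourself, you can time a render from a script; the number will be the same whether you are on an 8800 or onboard graphics. Rough sketch using the 2.4x Python API (the exact call names are from memory, so treat them as assumptions):

```python
# Rough sketch, Blender 2.4x Python API (exact names assumed from memory).
# Renders the current scene with the internal renderer and times it.
# The GPU sits idle the whole time, so the result won't change with a new card.
import time
import Blender
from Blender import Scene

scn = Scene.GetCurrent()           # the scene you have open
ctx = scn.getRenderingContext()    # its render settings

start = time.time()
ctx.render()                       # CPU-only internal render
print "Render took %.2f seconds" % (time.time() - start)
```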

If you want to speed up rendering, I suggest upgrading to a daul or qaud core CPU with at least 2-3GB of DDR2 or faster RAM, plus a fast HDD and motherboard (FSB etc.).

I would suggest the 8800 for a new graphics card though, since they are top quality!

About that 8800, wait for the new generation that can tackle DirectX 10.1.
About the rest, daniel has it spot on.

It's worth noting that the 8800 is a nice card, but Blender's limitations are design-driven, not hardware-driven. The display pipeline is very inefficient for UI rendering, so throwing hardware at it will only increase performance a small amount compared to the true potential of the hardware.

Blender Internal does not use the GPU for rendering at this time, but maybe in the future it will.

extreme,
you could look at it this way. A high-end gfx card is a must if you work with large polygon scenes. Say you have a street corner with a few high-detail buildings including street lamps, wires, street lights … the works, plus a few high-poly vehicles. Only a very good gfx card can rotate the scene without hiccups, and even then there are limits.
A way to work around this is to use layers and hide/unhide what you’re working on to keep the workflow at a decent speed (a quick script for this is below).
As others have pointed out, the CPU and RAM take care of the rendering.
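For the layer trick, a tiny script can do the shuffling for you. Rough sketch with the 2.4x Python API (the “heavy_” naming convention is just an example, and the exact attribute names are assumptions on my part):

```python
# Rough sketch, Blender 2.4x Python API (attribute names assumed).
# Parks every object whose name starts with "heavy_" on layer 2, so the
# 3D view only has to draw the lighter stuff left on layer 1.
import Blender
from Blender import Object

for ob in Object.Get():                # every object in the .blend
    if ob.name.startswith("heavy_"):   # hypothetical naming convention
        ob.layers = [2]                # move it to layer 2

Blender.Redraw()                       # refresh the 3D view
```

Then just keep layer 2 switched off in the 3D view header while you work, and turn it back on for the render.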

If you watch the Ratatouille trailer (there may even be a link to it on here somewhere), you can see that mighty Pixar used a lot of matte painting for the background scenes, especially in the kitchen. I’m getting carried away here … I’m sure you get the idea.

Have fun with your new card!!

Simply put:
Graphics card = For viewing your models and scenes in Blender’s 3D View.
CPU (processor) = Rendering your models and scenes into a picture.

The ONLY situation where you would need a better graphics card for rendering is if you use a renderer that runs on the GPU, such as NVIDIA’s Gelato. But Blender’s renderer does NOT use your graphics card, so you can just forget it and go to sleep. Do it. Now.

And you guys… I can understand people making a typo with quad > qaud (QUAD is the right way), but with dual…? DAUL?
Darth MAUL, Darth MULE? No… it is unacceptable. Daul Precosser and Qaud Cero. A DEADLY COMBINATION!

I'm sorry, DUAL.

But at the end of the day, does it really matter? It's an internet forum…

2extreme: if you wouldn't mind, what's your current spec?

Well, actually you're correct that typos like “daul” or “qaud” don't matter, but I think it's best for everyone if we try our best to write good English so that nobody learns to write incorrectly.

I think you mean “write well english.” :stuck_out_tongue:

yes, we should all speal proper english like wot i do.

I guess all those harping on about the CPU dealing with the rendering haven't considered Gelato, which uses the GPU instead. I believe it now works with Blender, or can be made to work.

The 8800 cards will be able to handle DirectX 10.1, though some features will be unavailable to the current generation of DX10 cards.
It's not like the current cards will be obsolete as soon as 10.1 is out.

There has been a lot of speculation about this, so a Microsoft representative issued a statement the other day saying that the 10.1 update will be just like the old DX9 updates in that it will be backwards compatible, though some content will require a future hardware update to be used fully.

It's not like AMD and NVIDIA are tripping over themselves to push out new cards for 10.1, as they can't get the drivers right for DX10 yet :confused: plus the backlash from all the people who have already bought “next gen” GPUs would be massive. They will bleed the current cards dry before releasing something else. My money is on mid-2008.

DX10 games are thin on the ground at the moment, as for most dev companies the return on a DX10-only game is too small in comparison to the outlay needed to create it.

It will be at least a year or two before we start seeing games that require DX10 at the same level that we see DX9 required at the moment.

So I would buy an 8800 GTS 320MB if you are not sure what to do; you can get them now for about £170, and the performance is not shabby compared to the GTX, considering the price :slight_smile:

Like everyone else said, rendering in Blender is purely CPU-based, so getting a new GPU won't have any effect on rendering time. HOWEVER, if you get an NVIDIA card, NVIDIA offers its Gelato renderer (which renders images using the GPU) for free on their site. It isn't integrated into Blender, though, so you would have to render externally.
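To give an idea of what “rendering externally” means in practice: you export the scene to Gelato's .pyg format with the exporter script (see the thread mentioned further down), then feed that file to the gelato executable. A rough sketch of a little wrapper (the file name is just an example, and it assumes gelato is on your PATH):

```python
# Rough sketch of the external Gelato step (file name is an example and
# this assumes the gelato executable is on your PATH). The .pyg file is
# what the Blender->Gelato exporter writes out; gelato then renders it
# on the GPU instead of the CPU.
import subprocess

scene_file = "streetcorner.pyg"   # written out by the exporter

# Same as typing "gelato streetcorner.pyg" in a terminal.
ret = subprocess.call(["gelato", scene_file])
if ret != 0:
    print "Gelato returned an error, check the export"
```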

SamAdam

No, I don't think I do. “Well” is how you do it, but “good” is the result of doing it well. I would say “I speak English well”, not “I speak English good”. But I think I can say I speak good or proper English.

Ladies, ahem, I mean gentlemen, let's get back on topic.

2extreme: have you tried using Gelato? It is a hardware-based renderer, meaning it will use GPU resources to render out frames, and it can be many times faster than software renderers like Blender Internal. I'm not sure if all of Blender's features are supported, though. I know there is a thread on here that has walkthroughs for exporting and an exporter that is regularly updated.

Has anyone noticed that 2extreme has not replied for a while? This is the kind of post that only really needs one reply.