I'm upgrading my 9800 GT to a Radeon HD 6950 1GB.

Since I’ve been using the Blender Game Engine quite a lot lately, I figured I might as well upgrade my GPU. I’m going for a Radeon HD 6950 1GB. Will the Blender Game Engine benefit from this upgrade? My current GPU is an Nvidia 9800 GT 512MB, and I’m planning to replace it with the ATI HD 6950 1GB.

Also, my monitor is only 19-20 inches with a resolution of around 1600x900. Does it matter what monitor I’m using when it comes to the Blender Game Engine? I mean, if I were to use the HD 6950 1GB with a 1600x900 monitor for gaming, that would be total overkill, right?

Yes, I am planning to upgrade my monitor, but I don’t have the money now; I’ll buy a new monitor around the time The Elder Scrolls: Skyrim releases.

Will the Blender Game Engine actually benefit from the upgrade? And do I have to buy a higher-resolution monitor to accommodate the Radeon HD 6950 1GB graphics card I’m planning to purchase? For now I’m buying the HD 6950 1GB mainly for the Blender Game Engine, not for PC gaming.

It doesn’t matter what resolution monitor you have, provided it’s at least 800x600, which would be pretty poor anyway.
Just a note:
A few features, such as display lists and GLSL filters, don’t work as well on ATI cards. You can get some very nice graphics cards such as the GTX 460 for £100 or less, which is not bad at all.

Is Blender going to have problems running on Nvidia 4xx and 5xx series cards?

Any examples?
Any card from 2007 onwards will be fine.

I, uh… is it against the rules if I bump my own thread? Ha…

Could you link me to an article or thread about this problem with ATI cards? Now that you mention ATI cards having problems with GLSL, I’m becoming more and more reluctant to buy an AMD graphics card. And considering the problems with Nvidia’s newer cards, what on earth should I do? Buy an old GTX 200 series card? But that isn’t going to do me any good with newer games coming out for PC…

I guess it all comes down to the same old question: which GPU suits Blender best?

I can recommend the GTX 460 1GB; it’s a really good card for the price, and it doesn’t suck up a load of juice either. I used to have a pair of 8800 Ultras, and my PC was drawing nearly 400W. Now it’s around 200W!

As far as problems go, Blender has minor issues with GLSL (from what I have read, especially filters), but it’s nothing major.

ATI cards do not have problems with GLSL; on the contrary, ATI cards should, in theory, perform better with OpenGL and the GL shading language. It’s just that ATI’s compiler is very strict about GLSL syntax. That isn’t a problem in itself; they enforce it because they want the code to execute faster. So if you learn ATI’s restrictions for GLSL, it shouldn’t be a problem to make a great GLSL game on an ATI card that runs perfectly on a GeForce too.

Do you have any experience using Blender with ATI cards?

I have never owned an ATI card; I’m only really telling you what has happened before with shader authors like Martinsh, who sometimes had trouble getting the shaders he wrote to run on ATI cards.

hadime is correct that ATI tends to be better with OpenGL, while Nvidia performs better with DirectX. But both brands will have problems; the hardware is sound, it’s just the drivers that sometimes cause trouble. To be honest, I wouldn’t worry too much about it.

Hey, thanks man, I appreciate the help :slight_smile: I’m a bit paranoid when it comes to purchasing stuff, because I wouldn’t want to have problems after buying an expensive card :stuck_out_tongue: Anyway, thanks for the advice.

I think I worry too much.

Have you ever used an ATI GPU for heavy Blender usage? Animating, game engine, modeling, sculpting, etc.?

Yes, 90% of my game was done on an ATI card.

What do you mean by “learn the ATI restrictions for GLSL”?

The ATI GLSL compiler has some code syntax restrictions, such as requiring all integer literals used in float contexts to be written as floats (1 becomes 1.0) and so on. If your GLSL shader does not work on ATI, it’s most likely because of that, and a small modification to the GLSL code will make it work.
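To illustrate the kind of strictness hadime is describing, here is a minimal, hypothetical BGE 2D-filter fragment shader sketch (not a shader from this thread); the commented-out lines show implicit int-to-float conversions that lenient Nvidia compilers often accept but strict ATI compilers may reject:

```glsl
// Minimal 2D-filter fragment shader; bgl_RenderedTexture is the
// uniform the Blender Game Engine binds the rendered frame to.
uniform sampler2D bgl_RenderedTexture;

void main()
{
    // Likely to fail on strict ATI GLSL compilers (implicit int -> float):
    //   float brightness = 2;
    //   vec3 tint = vec3(1, 0, 0);

    // Portable version: every literal written explicitly as a float.
    float brightness = 2.0;
    vec3 tint = vec3(1.0, 0.5, 0.5);

    vec4 base = texture2D(bgl_RenderedTexture, gl_TexCoord[0].st);
    gl_FragColor = vec4(base.rgb * tint * brightness, base.a);
}
```

The same shader then compiles on both vendors, since explicit float literals are valid everywhere.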

Yes! An experienced ATI user! Thank you! Your game, KRUM, does look promising…

thanks again. :slight_smile: