Question about graphics card color depth (GeForce 5500)

I have just bought a new computer with a GeForce 5500, running at 1280x1024 with 32-bit color depth.

OK, when I look at a gradient (for example, the gradient in the shadows of the default image in Blender), it doesn’t render smoothly; it looks banded rather than continuous, like when you are in 16-bit color…

When I switch to the old computer (Nvidia Riva 128) at the same resolution and 24-bit color, it looks better.
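For reference, here is a small sketch of why gradients band so much worse in 16-bit mode. It assumes the common RGB565 layout for 16-bit desktops (5 bits red, 6 green, 5 blue); the exact format depends on the driver, so treat the numbers as illustrative.

```python
# Levels per channel in a typical 16-bit (RGB565) vs 24-bit desktop mode.
levels_16 = {"red": 2 ** 5, "green": 2 ** 6, "blue": 2 ** 5}   # 32/64/32 levels
levels_24 = {"red": 2 ** 8, "green": 2 ** 8, "blue": 2 ** 8}   # 256 levels each

# A full-width gray ramp on a 1280-pixel-wide screen:
print(1280 // levels_24["red"])  # 5  -> ~5 px per step, hard to see
print(1280 // levels_16["red"])  # 40 -> ~40 px per step, obvious bands
```

So even on a perfect monitor, a 16-bit ramp has steps roughly eight times wider than a 24-bit one.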

Any ideas? Is there something wrong with the GeForce? It gets even worse when playing compressed video, for example.

The monitor is the same for both tests (HP 19’’ TFT); maybe the monitor cannot handle 32 bits…

Does anyone know if I can set the color depth to 24-bit?

Thanks in advance

If I recall correctly, there is no difference on the display between 24-bit and 32-bit color.

The extra 8 bits go to the alpha channel [which you can’t see].
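A quick sketch of that point, assuming the common ARGB8888 layout for 32-bit pixels (the example pixel value is made up): the extra byte is alpha/padding, and the visible color is still just three 8-bit channels, exactly as in 24-bit mode.

```python
pixel32 = 0x80FF8040  # hypothetical ARGB pixel: A=0x80, R=0xFF, G=0x80, B=0x40

alpha = (pixel32 >> 24) & 0xFF  # the "extra" 8 bits; never sent to the screen
red   = (pixel32 >> 16) & 0xFF
green = (pixel32 >> 8)  & 0xFF
blue  =  pixel32        & 0xFF

# Only these three channels reach the display -> 24 bits of color either way.
print(red, green, blue)  # 255 128 64
print(2 ** 8)            # 256 levels per channel in both 24- and 32-bit modes
```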

Also, 24 bits isn’t enough for perfectly smooth gradients either [though it is much smoother than 16-bit]. A patch [perhaps put into 2.35?] was planned to dither renders so that this banding would be nearly invisible.
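The dithering idea mentioned above can be sketched like this: adding a little sub-level noise before quantizing a ramp to 8 bits breaks up the hard step edges that the eye reads as bands. This is just an illustrative toy, not the actual patch; all names here are made up.

```python
import random

WIDTH = 2048  # more gradient samples than 8-bit levels -> visible steps

def quantize(value, dither=False):
    """Map a float in [0, 1] to an 8-bit level, optionally adding
    up to half a level of noise before rounding."""
    if dither:
        value += (random.random() - 0.5) / 255.0
    return max(0, min(255, round(value * 255)))

plain    = [quantize(x / (WIDTH - 1)) for x in range(WIDTH)]
dithered = [quantize(x / (WIDTH - 1), dither=True) for x in range(WIDTH)]

# Without dither, each 8-bit level covers a solid run of ~8 pixels;
# those solid runs are the visible "bands". Dithering scatters the
# transitions so no clean band edge survives.
steps = sum(1 for i in range(1, WIDTH) if plain[i] != plain[i - 1])
print(steps)  # 255 clean steps in the plain ramp
```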

/EDIT: not sure anymore/

I thought 24-bit was 6 bits R, 6 bits G, 6 bits B, and 6 bits alpha.

So you should be able to tell the difference.

I guess the monitor doesn’t support it, but you’ll have to try and see.


I don’t mean to insult your intelligence, but have you gotten the latest drivers?

I surfed the web a little and found that the older drivers were a bit better at smoothing gradients :o

Yeah, I have the latest drivers… DVDs look like hell, and DivX etc. look even worse…

I don’t think it’s a monitor problem, because I used the same one with the old computer…

Anyway, thanks for the answers