I just installed a GeForce FX 5500 video card (with the latest drivers) in my Celeron 2 GHz machine, hoping to get better performance than the built-in video adapter. However, when I render I get stray pixels on objects and the background… any clue as to what is wrong?
My last video card was a GeForce FX 5500, and I also had render problems with Blender. I tried a lot of different tests (as did other people) and never found a solution. Though, try not to use the world background as much as you can (use photo backgrounds instead); sometimes that helps. I don't know if it's just that model of Nvidia card. I'm breaking in a 7300 GT right now and I haven't noticed any problems, but I haven't done much rendering as of yet, to be honest. I think Blender may have problems with Nvidia cards, though, as I have seen other posts about problems with Nvidia cards.
Have you tried rendering the object using a command-line call to Blender and then looking at the final product? That bypasses your video card altogether until you look at the end result.
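For reference, a typical background render looks something like this (the .blend filename and output path here are just placeholders for your own):

```shell
# Render frame 1 of myfile.blend without opening the Blender UI.
# -b  runs Blender in background mode (no 3D viewport is drawn)
# -o  sets the output path ("//" means relative to the .blend file)
# -F  sets the image format
# -f  renders the given frame number
blender -b myfile.blend -o //render_ -F PNG -f 1
```

The finished image is written straight to disk, so nothing touches the card's display output until you open the file in an image viewer.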
Thanks for the reply… that helps. Besides the card you are using now, which video chipset/video card would be the model of choice for Blender? Something from ATI or Matrox?
Unimatrix, I’m not familiar with rendering from the command line…could you explain?
Rendering uses the video card? When you save your image, does it still have the errors?