Hello, I’m considering buying a different graphics card, and my question is: are there any that are not Nvidia or ATI related? Would Real3D be one of them? One that can run GL, Mesa, and X.
Al, if you wait a month or so, I will be getting my hands on a new 3Dlabs Wildcat Realizm; I’ll be able to tell you if it’s a good card or not. 3Dlabs is a little pricey, but they’re not Nvidia or ATI. The Wildcat VP was a piece of crap, but Realizm looks pretty nice. Apparently, 3Dlabs will be writing Linux drivers for this card, but I don’t know if that’s just BS at this point.
Overall, 3Dlabs is an excellent company – great support, great hardware – but I can’t outright recommend them to you yet. They’re on probation because of crappy drivers, though the Wildcat VP series looks like it was a fluke.
Ok, please keep me informed. Anyone else here have different cards? I would like to know.
I have used machines with graphics cards that have the oddest names, but IMO I would stick with Nvidia or ATI. Developers make their software compatible with those two, so if things go wrong on other cards, support is not easy to get. Is there a reason why you’re thinking of other cards?
Compatibility, and the cost of mass production. I think a 3rd-party maker would make their cards more compatible because they don’t cater to the masses. May I remind you there are high-end cards out there for programs like Wavefront, SGI, etc. A look through a CAD or computer artist magazine might show a few.
Yeah, I’ve seen quite a few, but it’s still usually ATI or Nvidia cards. Also, the claim that 3rd-party makers offer better compatibility than the popular cards is rarely true.
The fact is that there may well be cards that outperform even the high-end ones in some cases, but ATI and Nvidia have the best price/performance. Gaming benchmarks will normally only compare the two.
I don’t know, it seems these are gaming cards and not designed for 3D designers. I’ll do some searches and get back to you on this.
3Dlabs shipped my card today, so I will be able to evaluate it quite soon.
Cool, I’m looking forward to it. Anyone else with third-party cards?
I now have a 3Dlabs Wildcat Realizm 200 in my hands, and it looks like it’s ready to destroy anything that Nvidia or ATI put up against it. WWF Smackdown style. Seriously, though: I am trying to review this card as objectively as I can performance-wise and image-quality-wise; if anyone has any suggestions, I will take them into consideration.
Does it come with source code? How well does it work with Linux? Is it easy to set up? How well does it work with video? How much detail can be seen in the image?
My initial testing came out with the following results, which I placed in an email to 3Dlabs:
I have installed the Realizm and the latest Windows 2000 driver, and it does very well for itself. I like the following:
- The availability of a precise z-buffer, even in Entertainment mode.
- The extremely insane number of vertices that can be handled. I crashed my computer by adding vertices before I was able to make the interactivity in Blender completely unusable. I was able to texture and light more than 1.8 million vertices and faces with some measure of responsiveness. As a torture test, I textured and lit 2.8 million vertices at 4x antialiasing, and the scene was surprisingly responsive.
- Videos seem to play correctly. I have yet to try playing a DVD, but I will do so and see what happens.
- For good measure, I checked that text would scroll at a blazingly fast rate (a recursive ls command in Cygwin ran very well).
- International fonts in Blender (my primary 3D modeling application) render perfectly – they appeared white, instead of black.
- Non-selected buttons in Blender dim correctly; they didn’t on my Wildcat VP 760.
- WGL_ARB_pbuffer is implemented, as well as a slew of other WGL extensions. This is excellent.
- DirectX 9 is supported, which is very important in today’s environment.
- Texture compression is implemented, which adds versatility.
However, it does have a number of issues. I do not wish to emphasize these more than the card’s strengths, but I will need to elaborate on them so that you will be able to either duplicate the problem or understand exactly what the problem is.
Only one color depth – 32 bits per pixel – is supported. Thus, certain tests in dxdiag cannot be completed successfully. I don’t know of any professional CAD or animation packages that use 256 colors or 16 bit color, but (Heaven forbid!) game emulators do, and some random old application might. Implementing low color depths is a lot easier than making sure that a 16-sided concave polygon with noncoplanar vertices works, so these might as well be supported.
AGP texture acceleration is not supported. The card can hold a lot of textures on board, but getting 512 MB across any bus, no matter how fast, takes time. It would be best to minimize that amount of time by any means necessary. Grand Theft Auto III was very jerky on my Wildcat VP 760 because of the lack of AGP texture acceleration, so large visualizations with different textures in different areas might also lag when the textures have to change. However, I have yet to find out what exactly AGP texture acceleration does – if it does something useless or something that 3Dlabs thinks is pointless, then I’m OK with this not being implemented.
3Dl2Svc.exe – the 3Dlabs LMM service – uses 27 MB of memory on a regular basis, with a maximum (so far) of 33 MB. After Doom 3 crashed, its peak memory usage showed up as >350 MB. Its CPU utilization on a single processor system would often reach 80% – it reaches 40% on my dual Opteron. What is it doing, and why does it consume huge amounts of resources? The CPU spikes often happen when I’m Alt+Tab-ing between windows.
Mozilla Firefox and Mozilla Thunderbird behave strangely; they both lag a lot when rendering and switching windows. They are the main applications in which I have noticed the CPU spikes from the LMM service issue above. However, Internet Explorer is not affected. After some more testing, it looks like they only lag when the whole window has to be redrawn. Putting the task manager or some other small window over them, and switching back, does not seem to slow them down.
In MS Excel, adding a red image works, but after switching away from the window and switching back, the image shows up with an orange tinge. Moving the image around brings it back to red, but the problem repeats if I switch away and back again. This might be a Microsoft issue, in which case it’s a lost cause, but it might be an issue with your driver. I will try to investigate this further.
Hardware gamma is behaving strangely. I am trying to play Quake III and Doom 3 – OpenGL games written by John Carmack, the OpenGL god.
In Quake III, the game appears very dim, and brightness controls do nothing. Setting the cvar r_ignorehwgamma to 1 (which ignores hardware gamma) solves the problem – but this involves offloading work from hardware to software, thereby shifting the burden from the graphics card to the CPU.
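For anyone who wants to try the same workaround, it looks like this from the Quake III console (a sketch of the cvar change described above; `vid_restart` restarts the renderer so the latched cvar takes effect):

```
r_ignorehwgamma 1
vid_restart
```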
In Doom 3, videos look fine, as do the reticle and other 2D objects, but in 3D mode, only lights render. The game appears completely black except for lights, rendering it completely unplayable. I was able to temporarily fix the problem by setting the rendering mode to ARB instead of ARB2 or best, but the ARB rendering path does not support specular rendering and many other features included in ARB2. The game also seems to crash frequently, and I have experienced two crashes at the same point in the game. Since it’s not a spoiler (I don’t know if you play Doom 3 or not), I’ll tell you that point: depressurizing the airlock to walk outside Mars City to find the lost scientist.
Overall, the Wildcat Realizm excels at what it is designed to do, and I have only seen driver-related crashes in Doom 3. I don’t expect this card to trounce an ATI Radeon X800 in Doom 3 or in other games, but making games playable on the card would be a huge plus. I will perform more extensive testing, but at first glance, the card looks quite good.
Not much good news. How does it run under Linux?
I have run 7 million polys in a scene in Blender on my Nvidia GF4 128MB AGP 4x without problem.
It’s a little slow but still usable.
2.8 million is nothing LOL.
Holy shit (not bullshit I hope) Alltaken :o. 7 million polys!!! And that’s just a 4x AGP you say. Man, I seriously need an upgrade. I still have a crappy Radeon 7500 16MB. I thought it was sailing ok but it won’t do more than 65000 polys real time :(.
I was getting interested in the Wildcat cards until meetstaplu’s results. I wouldn’t touch 'em now. From looking around, it’s close between the ATI Radeon X800 and the Nvidia GeForce 6800 Ultra.
Like Alltaken points out, even those GF4 cards still pack a punch – well, the Ti ones anyway. I’m not entirely convinced about his 7 million poly test, though. I read someone managed to do about 1.5 million textured polys in real time on an older GF4 Ti.
From the specs page: http://www.nvidia.com/page/geforce4ti.html
the top GF4 Ti 4800 8x can do 136 million verts/second. Now, I don’t know if this is the right way of doing it, but to redraw in real time, that works out to 136 million / 25 fps ≈ 5.4 million polys. Ok, you might get away with 15 fps in a draw window, which gives about 9 million polys, but that’s the top card, so 7 million seems a bit dubious (screenshot plz?). Were they textured polys?
Anyways, this site gives some advice on cards based on price:
The spec of the GeForce 6800 Ultra is 600 million verts/sec, which by the same math works out to 24 million polys per frame at 25 fps. The 6800 and X800 are on the scale of post-production quality in real time.
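The back-of-the-envelope math from the last couple of posts can be sketched in a few lines of Python (the function name is mine, and the big assumption – that quoted vertex throughput divides straight into a per-frame polygon budget – is the same one made above):

```python
def realtime_poly_budget(verts_per_sec, fps):
    """Rough upper bound on polys drawable per frame, assuming the
    spec-sheet vertex throughput is the only bottleneck (it never is:
    fill rate, texturing, and driver overhead all eat into this)."""
    return verts_per_sec / fps

# GF4 Ti 4800 spec-sheet figure: 136 million verts/sec
print(realtime_poly_budget(136e6, 25))   # ~5.4 million polys at 25 fps
print(realtime_poly_budget(136e6, 15))   # ~9.1 million polys at 15 fps
# GeForce 6800 Ultra: 600 million verts/sec
print(realtime_poly_budget(600e6, 25))   # 24 million polys at 25 fps
```

So even under these generous assumptions, 7 million textured polys on a GF4 would be right at the theoretical ceiling.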
Here’s some graphics cards that are third party – far fewer than I expected to find. Any of them good?
I wouldn’t know, although I hear Matrox is ok.
What I do know is this: if you want a good video card for Blender and a few games, then a Geforce 4 440MX will be fine for you. If you are more of a gamer, then I’d consider the budget option to be the ATI Radeon 9200. I now have both. My GeforceMX card handled OpenGL like a pro, but my ATI card is a bit slower with OpenGL – even though it has a faster core. But, at the moment, until I can afford to purchase a top-of-the-line Geforce FX card, I’ll stick with the ATI 9200. It has DirectX 9 support, and it can use a TV as a monitor, allowing me to play games on a TV and another treat… recording my 3D animations to tape!
I just ran 8.85 million polygons. This card eats vertices for breakfast. There were tons of intersecting polygons, nonplanar quads, and smooth shading.
I then tried to duplicate my horrible shapes, and Blender said:
Blender crashed before I reached a usability limit.
Now that’s impressive.
Which card are you referring to?
Bottom line: NVIDIA and ATI are the best graphics card makers out there. Period. You may be able to find some obscure 3rd-party one, but it wouldn’t have all the features of NVIDIA’s or ATI’s cards. Personally I like NVIDIA better; they have better driver support (and I think they have Linux drivers) and way more features. They’re also geared towards OpenGL, which works perfectly with Blender. The Geforce 4 Ti cards are as little as 60 bucks now, and I can run any game with medium to full graphics, and Blender runs perfectly on my Geforce 4 Ti 4200 128MB. Why would you not want one?