Nvidia, stop being a DICK (YouTube video). What do you think?

Well, that might have been true in the past, but you might not remember the stir it created when NVIDIA sold barely updated old designs, slightly rebranded, as new cards.

NVIDIA and AMD are like AT&T and Verizon: both have rigged the system. Fortunately, with mobile plans there are now providers that offer cheaper and better deals, and AT&T and Verizon adapted to them. Consider that T-Mobile, as an underdog, affected them so much that they changed their plans.

But with 3D cards there is no third party. I am still madly upset that NVIDIA shipped the 970 with incorrect technical specifications. That was not an honest mistake but a blatant lie, because besides the VRAM, the render output units (ROPs) were also different from what was advertised.

My problem with tessellation is that I believe it should be angle-based: the sharper a corner is, the more vertices should be added, while flat areas should not be subdivided pointlessly.
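Something like this rough Python sketch is what I have in mind; the `tess_factor_for_edge` name and the cap of 16 are made up for illustration, not any real API:

```python
import math

def tess_factor_for_edge(n0, n1, max_factor=16):
    """Toy angle-based tessellation: the larger the angle between the unit
    normals of the two faces sharing an edge, the more that edge is split."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(n0, n1))))
    angle = math.acos(dot)                    # 0 = flat join, pi = folded back
    return 1 + round(angle / math.pi * (max_factor - 1))

# Flat areas stay untouched, sharp corners get the extra vertices:
print(tess_factor_for_edge((0, 0, 1), (0, 0, 1)))   # 1  (coplanar faces)
print(tess_factor_for_edge((0, 0, 1), (1, 0, 0)))   # 9  (90-degree corner)
print(tess_factor_for_edge((0, 0, 1), (0, 0, -1)))  # 16 (knife edge)
```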

Ideally, it would be awesome if there were a video card that could render surfaces as curves rather than triangles altogether, so smoothness wouldn’t take any special effort to process at all. Sadly, that’s not how any current rendering technology was designed to work, and it’s probably impossible with OpenGL and with any existing hardware.

AMD rebrands cards too; look at the progression from the 7000 series to the 300 series. They all use the same GCN architecture, albeit slightly updated. And don’t forget how difficult it is to compare mobile GPUs to desktop GPUs from both companies.

I agree that Nvidia messed up when they shipped the 970, but the card was and still is a great buy considering the performance it gives for the price.

Yes yes I’ve seen the video

Personally, I think it’s Nvidia’s decision whether or not they share their tech.

It’s kind of like what we always say around here to people demanding that V-Ray should be free…

Just because the soup kitchen is giving out free food, that doesn’t mean McDonald’s is a dick for not giving out free burgers.

Or in this case:
Just because AMD is making everything open source, that doesn’t mean Nvidia is a dick for not following suit.

I’m not making excuses for anything and I stand by what I originally said.

The amount of tessellation in Crysis 2 was overkill, and everyone acknowledges it. It wasn’t only the insane amount of tessellation on objects that didn’t need it; it was the frigging tessellated ocean below the ground too. Crytek either dropped the ball hardcore there, or they partnered with Nvidia to showcase a totally pointless strength.

No, not everyone acknowledges it. Again, just because it looks wasteful to you doesn’t mean it is inefficient. There was an uninformed news article that showed supposedly “wasteful” rendering, and equally uninformed gamers sprang up in outrage. This has been discussed by a developer on the CryEngine forums:

  • Wireframes make surfaces look more tessellated than they really are
  • Without backface culling in wireframe mode, triangles show up from both sides (see the sketch below)
  • The ocean under the ground is not necessarily rendered, despite showing up in wireframe

Also, sometimes it’s cheaper to just draw something than to figure out whether it needs to be drawn.
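To make the backface-culling bullet concrete, here’s a toy Python sketch (just counting, not real engine code): a closed mesh has roughly half its triangles facing away from any camera, so a shaded pass with culling draws about half of what an everything-drawn wireframe capture shows.

```python
import random

def visible_triangles(face_normals, view_dir, cull_backfaces):
    """Count triangles that would be drawn for a given view direction."""
    if not cull_backfaces:
        return len(face_normals)          # wireframe-style capture: draw it all
    # shaded-style pass: keep only faces whose normal points towards the camera
    return sum(1 for n in face_normals
               if sum(a * b for a, b in zip(n, view_dir)) < 0)

# Random unit normals stand in for a closed mesh; about half face away.
normals = []
for _ in range(10000):
    v = [random.gauss(0, 1) for _ in range(3)]
    length = sum(c * c for c in v) ** 0.5
    normals.append(tuple(c / length for c in v))

view = (0.0, 0.0, -1.0)
print(visible_triangles(normals, view, cull_backfaces=False))  # 10000
print(visible_triangles(normals, view, cull_backfaces=True))   # roughly 5000
```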

Either way, it wasn’t a big deal; the difference in performance between AMD and NVIDIA was reported as only around 10-15% at the higher tessellation levels (with no difference at the lower ones).

Nvidia making sure their customers’ cards run games as well as possible seems like the opposite of a dick move to me. I don’t see why it should be Nvidia’s job to make sure AMD customers’ cards run games well too.

Frankly, I think that even if Nvidia did open-source all their tech, AMD would still have problems. They have an incredibly poor track record when it comes to driver quality. My current system has an AMD card; my next system won’t.

On OS X I used AMD cards a lot and never had an issue, but for a while now I’ve had only NVIDIA for both OS X and Windows. Nicely, NVIDIA also offers their own driver for OS X, so thank you to them; this way you can use CUDA.

I just noticed, after the latest Win10 update, that Blender is incredibly slow on the second display (which has its own NVIDIA card), but I would say that is a Win10 issue?

I am staying with NVIDIA specifically because of CUDA in Cycles.

Huh?!? :eek: The 970 has 4 GB of memory; it’s just the last half gig that is slower than the rest. It doesn’t affect anything until you get into that last half gig, and as far as I have read, the slowdown isn’t very much (10 to 15 percent).
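To illustrate the point, a bit of Python arithmetic (this is not how the driver actually places allocations, just the sizes): nothing touches the slow half gig until the total in use passes 3.5 GB.

```python
FAST_SEGMENT = 3.5   # full-speed portion of the 970's 4 GB, in GB
TOTAL_VRAM = 4.0

def gb_in_slow_segment(allocations_gb):
    """How much of the working set spills into the slow last half gig."""
    used = sum(allocations_gb)
    if used > TOTAL_VRAM:
        raise ValueError("would not fit in VRAM at all")
    return max(0.0, used - FAST_SEGMENT)

print(gb_in_slow_segment([1.0, 1.5, 0.8]))   # 0.0 -> no penalty at 3.3 GB used
print(gb_in_slow_segment([2.0, 1.2, 0.6]))   # ~0.3 GB lands in the slow segment
```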

No, and that’s the whole point. AMD GPUs tanked, and still tank, in unnecessarily tessellated games. Take the recent example of The Witcher 3:

https://forums.geforce.com/default/topic/834905/real-reason-why-witcher-3-hairworks-reduces-fps/

A x64 tessellation factor, just wow. And x8 MSAA on top of that. Let’s remember that the Extreme preset in Unigine is x32.
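Rough scaling only, to put x64 against x32: for a plain quad patch the tessellator emits on the order of factor² small triangles, so doubling the factor roughly quadruples the geometry. HairWorks’ isoline tessellation isn’t exactly this, and the patch count below is made up, but the blow-up is the same idea.

```python
def approx_triangles(tess_factor, patches):
    # ~factor x factor grid cells per quad patch, ~2 triangles per cell
    return 2 * tess_factor * tess_factor * patches

for factor in (8, 16, 32, 64):
    print(f"x{factor}: ~{approx_triangles(factor, patches=10_000):,} triangles")
```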

Nvidia is capable of bringing even their own GPUs to their knees just to annihilate AMD. Go elsewhere with your propaganda.

I don’t see the problem here, really:
1.) nVidia is undoubtedly providing the best experience on every OS, and especially on Linux.
2.) They are benefiting hugely from the massive R&D they have done, plus they essentially kickstarted GPGPU programming as a whole.
3.) They capitalize on this. Seems like a perfect success story to me.

And the market agrees: they can ensure high turnover with great profit. AMD simply hasn’t had it since the GeForce 8800 GTX, and since then it’s basically been all knee-jerk reactions from them. Do they make bad GPUs? By no means, no. But why wouldn’t you just get the best if you are tech-savvy and affluent enough? I never settle for second best, and that is why it’s nVidia for me until AMD pulls off another Radeon 9700 Pro.

Hardware without software to support it is a paperweight, as Palmer Luckey put it recently on Twitter.

Something AMD should have been aware of.

That 9700 Pro comment was a major flashback for me. That was an awesome card, and a total game changer.

I’ve been an Nvidia user since the Riva TNT2. I’ve never once been disappointed. The cards have price-appropriate performance and just work, which is all you can really ask for.

I got burned by Nvidia on the 8800 line. They used some inferior solder that would crack over time. Then, when the chip heated up, it would suddenly lose contact on one of the pads and glitch out and crash. I had a buddy with the same chip and the same problem.

This. NVidia also does this to sell their newer GPUs while trying to make the older GPUs look bad. So it’s not only AMD they are trying to screw.

There’s also the fact that because NVidia’s architecture performs poorly with async compute, developers are not going to use it, yet the inverse is not true. NVidia’s cards are great at tessellation, but AMD’s cards’ strength has always been their shader pipeline, going as far back as the 9800 Pro. So there is room for concern: with Nvidia’s near monopoly in the discrete GPU market, an important aspect of DX12 will probably never get used until Nvidia catches up. Meanwhile, Nvidia gives developers black boxes like GameWorks to actively make their competition look bad.

While Nvidia is working on proprietary garbage like G-Sync, AMD works on FreeSync and HBM.

I had the 9800 Pro, but yeah, that range of cards was awesome at the time.

Yeah, I burned out an NVIDIA 8800 GT card in a 2008 iMac, and my 2010 MacBook Pro has an NVIDIA GT 330M GPU that breaks down over time. The MacBook Pro GPU I could have had replaced for free, but I only learned about that three years after buying the laptop.

So that’s two cards where Nvidia did crap work.

The crazy thing is that stores want around $300 for a 2008 Nvidia 8800 GT card, lol…

You burned a Nvidia AND a Mac? You truly are a hero :smiley:

(I know, I know… just had to make this bad pun)

Those two GPU cards are known for having manufacturing defects.

So overuse the computers and the cards will break down.