EVGA ends partnership with NVIDIA

Official info from EVGA Management:


There is extensive news coverage of it, so I don’t think it’s necessary to link it here.

This has the potential to ripple through the hardware industry.


Did not see that coming.

That’s gonna hurt a lot of people.

Wow, Nvidia must have really p*ssed in that pot. And EVGA is just gonna quit graphics cards altogether? Hm. There’s something more than conflict with Nvidia here.


I have never been their customer, and I don’t have any sentiment, but this is bad for the market.

In a media interview, EVGA's CEO mentioned a lack of respect from NVIDIA, and I'm inclined to believe that, but as always there are two sides to the story. I think they had been planning this for a long time: they quit the EU market and laid off some personnel in Taiwan a while ago. Add to that the crypto crash and the stock oversupply.


The GPU market has been crashing for a long time, and EVGA was never at the top of the list of “really good GPUs”, so I'm not surprised at all. I've read that crypto-mining is now obsolete - something about a merge? - and that's also going to tank GPU sales. EVGA was always more focused on the flash and trash anyway, in my opinion. I personally won't miss them.


Hmm, I heard exactly the opposite. They covered 40% of the US market and supposedly had very good customer support. Trashiness-wise, they were also the last to adopt RGB lighting.

The most profitable GPU-mineable cryptocurrency (Ethereum) switched to a different method of transaction validation that requires far less computing power (~99% less).
Ether is the second-largest crypto in the world, so everyone will feel it.

On the darker side, there are still a lot of other, smaller (less profitable) cryptocurrencies that can be mined on a GPU, so this is not the end of mining as we know it, but it is a very big reduction.


Ditto. I’ve always thought of EVGA as being among the top tier of GPU manufacturers. I’ve never heard of anyone having a problem with one of their cards.


I just don’t like the way they look :sweat_smile: Either way, I see this as a good thing - if Nvidia wants to keep its partners, they’re going to have to drop prices, and if the market is cooling down, the partners will also have to drop prices.


On prices, AnandTech has a very interesting chart:


Well, yeah. 95% of all that hardcore gamer stuff looks goofy anyway. EVGA isn’t as bad as some in that regard, but they’re still offenders.

Hell, I still wonder why my RAM has the word VENGEANCE on it in big, bold letters. What’s it mad about? Why does it need revenge?


I feel like most of these things were named by twelve-year-old boys who don’t really know what the words mean but they heard them on Call of Duty- why does my GPU have afterburners? Why does my MOBO have “battle ready steel”? This is why I really like be quiet! as a brand- no fuss, just a distinctly German practicality. I, like you, have RAM with a bloodlust, but other than that, my computer is subdued, practical, silent, and upwards of 80 pounds, just how I want it :slight_smile:

Anyway, back to the topic at hand… MSI makes really good GPUs, in my experience


I guess it’s for peace of mind. For those moments when someone has a go at your computer with a sword, you can rest assured that the battle ready steel can take a few licks before breaking.

That’s mostly me, except I want a computer that’s lightweight, and doesn’t take up a lot of space.

I actually came across a machine recently that’s basically high-end laptop hardware in a desktop chassis. It’s a variation of one of Intel’s NUCs, with an i7 and a mobile 3070. It was surprisingly inexpensive as well.

The only problem is that, since it’s a gamer NUC, it’s got that dorky ass skull logo on it.


Well, gaming-oriented hardware is obviously targeting younger people; they need all those attention-grabbing titles and graphics in place to sell to a wider population that isn’t necessarily tech-savvy.

…just in case someone missed all those graphics on the packaging, they can still be reminded of them no matter where they look on the actual hardware…

Compare that to professional hardware, which has none of that and is usually a simple boxy shape that gets the job done.


Yeah. I like that kind of design a lot. AMD did a great job with it in the Vega series too.

And since this is an EVGA thread:



I had nothing but good experiences with EVGA after purchasing three of their cards over the years. Held them in high regard myself. Sad to hear that news.


Well, you see it was expecting to end up in a high performance street car, having the wind blowing through its nano-printed circuits as it screamed down the highway with the beat of ‘We Will Rock You’ blasting out on the in-car surround sound system…

Instead, COVID hit, all travel ended, no one bought cars, and everyone was stuck at home, needing a new PC. So now that RAM is sitting in a black box on the floor, half covered in dust, as cat fur starts to clog up the air filters and the ‘driver’ yells at it for being a slow piece of junk as he dies again from a headshot in CoD.

So cut the RAM a break, you’d be pretty mad and want some VENGEANCE too.

As for EVGA, it will be interesting to see where the company is in two years’ time, or if anyone even remembers who they are.

In the case of Nvidia, I very much expect they couldn’t care less, and it will be business as usual.


There’s plenty to be vengeful about! Pick your favourite offense and scream VENGEANCE while setting the GPU overclock to 2000%.


No wonder it’s so angry. Who drag races to Queen? They’re great, sure, but they’re not really a band you can Tokyo drift to.

…well, maybe Bicycle Race could work. I could imagine tearing up the streets in a Lambo to that.

The one thing that concerns me the most is, if my RAM does decide to take its vengeance, can I be charged as an accessory to whatever it ends up doing?

This is something I should’ve asked myself BEFORE buying it, but hey, hindsight’s 20/20, right?


The only issue is that manufacturers often assume professionals and creators have a big pile of money just begging to be spent, so the hardware gets some additional features and is then sold at an enormous markup.

Otherwise, the gamer LED-laced bling is your only option unless you stick to the bargain basement (nowhere near as fancy-looking either, as the audience there is soccer moms and work-at-home dads who only need the power to run social media, email, and office apps).


Let’s break it down piece by piece and see if it’s really that “enormous” of a price difference (taking the RTX A5000 vs the RTX 3080 as an example).

  • Performance-wise, they are pretty much equal.

(benchmark screenshot)

  • Price-wise, the 3080 “looks” significantly cheaper (using an EVGA 3080 to stay on topic ^_^).

  • On the power consumption front we have:

    • RTX A5000 TDP = 230W
    • RTX 3080 TDP = 320W
  • Now for some basic math:

Let’s say we run them 24/7 for some 5 years (skipping one generation, unless you like upgrading every other year…). The yearly power consumption (in kWh) then looks like this:

RTX A5000: 0.230 kW × 24 h × 365 days = 2014.8 kWh/year

RTX 3080: 0.320 kW × 24 h × 365 days = 2803.2 kWh/year

And let’s say you are living in one of the pricier countries (the UK, for example, at 2021 prices of about $0.325 per kWh):

RTX A5000: 2014.8 × 0.325 = $654.81/year
RTX 3080: 2803.2 × 0.325 = $911.04/year
911.04 − 654.81 = $256.23/year
256.23 × 5 = $1,281.15 (more using the 3080)

Then add the price of the extra VRAM, at some $12 per GB (24 GB vs 10 GB, that’s 14 GB more):

12 × 14 = $168

Now when we reflect these two factors in the original prices, we get:

  • RTX A5000 = $2,274
  • RTX 3080 = 749 + 1281 + 168 = $2,198

And suddenly the price no longer looks that “enormous”.
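The arithmetic above can be sketched in a few lines of Python. The prices, TDPs, VRAM figures, the flat $0.325/kWh UK rate, and the 24/7 duty cycle are all the post's own assumptions, not real-world constants:

```python
# 5-year cost-of-ownership sketch for the A5000 vs 3080 comparison above.
# All inputs are the figures assumed in the post.
HOURS_PER_YEAR = 24 * 365
RATE_USD_PER_KWH = 0.325   # UK, 2021 prices
YEARS = 5
VRAM_USD_PER_GB = 12

def five_year_cost(price_usd, tdp_watts):
    """Purchase price plus electricity for 5 years of 24/7 full-TDP load."""
    kwh_per_year = tdp_watts / 1000 * HOURS_PER_YEAR
    return price_usd + kwh_per_year * RATE_USD_PER_KWH * YEARS

a5000 = five_year_cost(2274, 230)     # 24 GB card
rtx3080 = five_year_cost(749, 320)    # 10 GB card
# Charge the 3080 for the 14 GB of VRAM it lacks, as in the post:
rtx3080 += (24 - 10) * VRAM_USD_PER_GB

print(f"RTX A5000: ${a5000:,.2f}")
print(f"RTX 3080:  ${rtx3080:,.2f}")
print(f"Gap:       ${a5000 - rtx3080:,.2f}")
```

Unlike the post, which adds only the electricity *difference* to the 3080, this computes the absolute cost of each card; the gap between the two totals comes out the same (about $76 in the A5000's favor, before the 3080's purchase-price advantage is even considered).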

This is without factoring in all the extra bells and whistles that come with professional-class GPUs; just to name a few:

  • ECC memory to avoid those “failed/full of artifacts” renders, saving you time and money.
  • NVLINK support across the board.
  • vGPU.
  • Solid drivers.
  • The peace of mind that these GPUs have gone through a lot of harsh testing to make sure they won’t die on you prematurely (this video sums it all up).

Now, with the current energy crisis and prices in some EU countries almost doubling, the balance is only going to tilt more and more in favor of less power-hungry GPUs.



If you can actually afford to double or triple your spending on a new machine (betting that the cost/benefit ratio will work out), then yes, go ahead and buy the professional-grade hardware without the LED lights.

However, upfront cost is still a major deal for a lot of people, and I at least will never resort to taking a loan or signing a payment plan no matter how much math is thrown out there in an attempt to convince otherwise. Then there’s the fact the BF is not even close to tapping out optimization possibilities to make Eevee and Cycles faster and more memory efficient on the gaming cards (with the story being the same for Godot as well if you also want to create games).