First details on Nvidia Lovelace; Dead on arrival?

By the looks of it, all AMD has to do to win the next GPU face-off is ship drivers that do not suck. While they are moving to a flexible MCM design with RDNA3, Nvidia will rely on brute force.
Twice as powerful but double the energy usage - Nvidia GeForce RTX 40 graphics cards

Double the performance, but also double the watts and a potential fire hazard. If this is the case, it does not look like any notable architectural improvements are coming; just heavier engineering to help the cards survive a much higher level of voltage and heat.

Of course, that is providing you can even run such a machine in your home. A higher-end two-GPU setup may not even be possible given the limits of what a wall outlet can provide (so you could be looking at thousands of dollars in home wiring upgrades before purchasing).
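To put rough numbers on the outlet concern, here is a quick back-of-the-envelope check. The 120 V / 15 A circuit is a standard US outlet; the per-card and rest-of-system wattages are illustrative assumptions, not figures from the leak:

```python
# Rough outlet-budget check (illustrative numbers, not from the leak).
OUTLET_VOLTS = 120          # standard US household circuit
BREAKER_AMPS = 15
CONTINUOUS_FACTOR = 0.8     # NEC guideline: continuous loads at 80% of breaker rating

usable_watts = OUTLET_VOLTS * BREAKER_AMPS * CONTINUOUS_FACTOR  # 1440 W

gpu_watts = 600             # assumed per-card draw for a next-gen flagship
rest_of_system = 400        # assumed CPU, board, drives, PSU losses

two_gpu_system = 2 * gpu_watts + rest_of_system  # 1600 W
print(f"outlet budget: {usable_watts:.0f} W, system draw: {two_gpu_system} W")
print("over budget:", two_gpu_system > usable_watts)
```

Under those assumptions a dual-GPU rig would already exceed what a single standard circuit can safely supply continuously, which is the point being made above.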

The only good news is that oftentimes, previous generation cards from the same vendor drop in price significantly after the new models are released. The RTX 3060/3070 cards might actually be at MSRP by then.

To conclude, the ball is in AMD’s court now; whether they run with it remains to be seen.

That’s going to be tough since they’ve never accomplished this. :smiley:


Nvidia housefire vs AMD drivers.
The old meme strikes again.


You could always make a headless machine, and plug it into the dryer socket down in your utility room.

Simple problems have simple solutions, people.


Or put it outside:


You remember the story about that kid who built a nuclear reactor in his backyard?

…we could really use his expertise right now.


What is surprising is that Nvidia is not even attempting to avoid creating their own version of AMD’s Vega disaster (one of the worst GPU architectures of the last decade). Now AMD is piling R&D into cutting-edge chip-building tech, while Nvidia becomes the one brute-forcing everything.

Of course, there’s speculation that AMD will also push their MCM cards into an extremely power-hungry state if the gaming audience is now fine with 1-kilowatt systems (for nothing more than to finally give the Radeon camp their chance to brag about the FRAMMMEZZZZ!!!). What about those of us who just want a solid card for heavy creative work that we know will last for many years?


Well I guess you can always go Quadro / Radeon PRO. If you have the $$$.


Don’t get a Quadro; get a GeForce RTX card and you’ll be much happier with Blender.

If you are doing oil speculation or heavy machine learning, the Quadros might give you a bit of a boost, but they are built for stability at the cost of performance. Paying a lot for low performance isn’t ideal.


If this is true, then cooling the thing will be next to impossible unless you build a custom environment with a high-powered HVAC system (because your room will get hot and the PC will no longer have cool air for its intakes).

Not only that, but these ‘joke’ fans will have to become a real product as well (if you can even find a case that will hold them).
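For a sense of the HVAC load, nearly all of a PC’s electrical draw ends up as heat in the room, and the watt-to-BTU conversion is standard; the 1 kW system draw below is an assumed round number, not a leaked spec:

```python
# A PC dumps essentially all of its electrical draw into the room as heat.
WATT_TO_BTU_PER_HR = 3.412   # standard conversion: 1 W = 3.412 BTU/h

system_watts = 1000          # assumed 1 kW gaming rig
btu_per_hr = system_watts * WATT_TO_BTU_PER_HR
print(round(btu_per_hr))     # ~3412 BTU/h, roughly a small space heater running flat out
```

So a 1 kW rig heats the room about as much as a portable space heater, which is why the air conditioning has to scale up with the PSU.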


I’d be happy with healthier competition in this sector. I’m about to buy a 3090 and it’s already a power hog, so…
Obviously this would have to go hand in hand with raytracing performance for our usage here. And the software side would have to follow suit; let’s see what HIP holds in that regard.

A 5 nm process should bring sizable gains in efficiency and performance.

Lovelace, however, is expected to be a monolithic chip, albeit packing a remarkable 18432 CUDA cores, clocked at up to 2.5 GHz. This could mean performance that’s up to twice as fast as the current flagship GeForce RTX 3090, potentially making high framerate 4K gaming a reality.

I highly doubt it. I got a GTX 1070 at its 2016 MSRP, and I was lucky; it was a great deal. The card is now going for more than that, so I don’t think the 3000 series will get a significant price cut, tbh.

@Ace_Dragon @silex

rtx 4090 benchmark

Not only that, but they’d also need to get on the level of OptiX with their cards.

Just to clear things up, in case you haven’t watched much of it… this is an April Fools’ joke, and a good one.


It is obvious that someone is trying to bring the prices down by badmouthing them.
With a 5nm process this should be a good improvement over the current 3xxx series.

some more rumors:

The NVIDIA AD102 “ADA GPU” appears to have 18432 CUDA Cores based on the preliminary specs (which can change), housed within 144 SM units. This is almost twice the cores present in Ampere which was already a massive step up from Turing. A 2.3-2.5 GHz clock speed would give us up to 85 to 92 TFLOPs of compute performance (FP32). This is more than twice the FP32 performance of the existing RTX 3090 which packs 36 TFLOPs of FP32 compute power.
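The TFLOPs figures in that rumor follow from standard arithmetic: each CUDA core can issue one FMA per clock, which counts as 2 FP32 operations. A quick check of the quoted numbers:

```python
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """FP32 throughput: cores x 2 ops per clock (FMA) x clock rate in GHz, in TFLOPs."""
    return cuda_cores * 2 * clock_ghz / 1000.0

# Rumored AD102 config from the quote above
print(round(fp32_tflops(18432, 2.3), 1))  # ~84.8 TFLOPs
print(round(fp32_tflops(18432, 2.5), 1))  # ~92.2 TFLOPs

# RTX 3090 for comparison: 10496 cores at ~1.7 GHz boost
print(round(fp32_tflops(10496, 1.7), 1))  # ~35.7 TFLOPs
```

That reproduces the quote’s 85-92 TFLOPs range and lines up with the ~36 TFLOPs it cites for the 3090, so the rumor’s math is at least internally consistent.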

I hope they succeed; the prices are insane.
I just bought a new computer, and the GPU cost more than the CPU + cooler + mainboard + RAM + SSD combined.
It’s nice to see that AMD is able to build proper gaming GPUs, but I don’t have much hope that they’ll be able to match Nvidia when it comes to GPUs that are great for 3D work.
I haven’t bought an AMD graphics card in 10+ years, and I fear that won’t change soon.


Yeah, I’ve also been on the Nvidia bandwagon since 2013.
I hope that Intel entering the gaming market with their cards will ease the pressure a bit.

A solution if you don’t want to pay premium prices is to buy a laptop.