Double the performance, but also double the watts and a potential fire hazard. It does not look like any notable architectural improvements will be offered if this is the case, just heavier engineering to help the cards survive much higher power draw and heat.
Of course, that is assuming you can even run such a machine in your home; a higher-end two-GPU setup may not even be possible given the limits of what a wall outlet can provide (so you are talking about thousands of dollars of home wiring upgrades before you even purchase the cards).
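As a rough sanity check on the outlet argument, here is a back-of-the-envelope sketch, assuming a standard North American 15 A / 120 V circuit, the common 80% continuous-load guideline, and purely illustrative per-card and system wattages:

# Rough sketch of why a dual high-wattage-GPU build can crowd a household circuit.
# Assumes a standard North American 15 A / 120 V branch circuit and the common
# 80% continuous-load guideline; the GPU and system wattages are illustrative guesses.

CIRCUIT_VOLTS = 120
CIRCUIT_AMPS = 15
CONTINUOUS_FACTOR = 0.8  # only ~80% of a circuit should be loaded continuously

circuit_watts = CIRCUIT_VOLTS * CIRCUIT_AMPS * CONTINUOUS_FACTOR  # 1440 W usable

gpu_watts = 600              # assumed per-card board power (rumored, not confirmed)
rest_of_system_watts = 350   # CPU, drives, fans, peripherals (assumption)
psu_efficiency = 0.9         # roughly 80 Plus Gold at load (assumption)

wall_draw = (2 * gpu_watts + rest_of_system_watts) / psu_efficiency
print(f"Usable circuit capacity: {circuit_watts:.0f} W")
print(f"Estimated wall draw:     {wall_draw:.0f} W")
print("Over budget!" if wall_draw > circuit_watts else "Fits, barely.")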
The only good news is that oftentimes, previous generation cards from the same vendor drop in price significantly after the new models are released. The RTX 3060/3070 cards might actually be at MSRP by then.
To conclude, the ball is in AMD's court now; whether they run with it remains to be seen.
What is surprising is that Nvidia is not even attempting to avoid creating their own version of AMD's Vega disaster (which was one of the worst GPU architectures of the last decade). Now AMD is piling R&D into cutting-edge chip-building tech while Nvidia becomes the one brute-forcing everything.
Of course, there's speculation that AMD will also push their MCM cards to an extremely power-hungry state if the gaming audience is now fine with 1-kilowatt systems (for nothing more than to finally give the Radeon camp their chance to brag about the FRAMMMEZZZZ!!!). What about those of us who just want a solid card for heavy creative work that we know will last for many years?
Don't get a Quadro; get a GeForce RTX and you'll be much happier with Blender.
If you are doing oil speculation or heavy machine learning, the Quadros might give you a bit of a boost, but they are built for stability at the cost of performance. Paying a lot for lower performance isn't ideal.
If this is true, then cooling the thing will be next to impossible unless you build a custom environment with a high-powered HVAC system (because your room will get hot and the PC will no longer have cool air for its intakes).
Not only that, but these "joke" fans will have to become a real product as well (if you can even find a case that will hold them).
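To put the room-heating point in perspective, here is a rough conversion of sustained system draw into heat load; the 1 kW draw and the window-AC capacity are illustrative assumptions:

# Every watt a PC draws eventually ends up as heat in the room.
# 1 W = 3.412 BTU/h; a small window air conditioner is roughly 5,000 BTU/h.
# The 1 kW system draw is the rumored worst case, not a measured figure.

WATTS_TO_BTU_PER_HOUR = 3.412

system_draw_watts = 1000   # assumed sustained draw of a dual-GPU rig
window_ac_btu = 5000       # typical small window AC unit

heat_load_btu = system_draw_watts * WATTS_TO_BTU_PER_HOUR
print(f"Heat dumped into the room: {heat_load_btu:.0f} BTU/h")
print(f"That is {heat_load_btu / window_ac_btu:.0%} of a small window AC's capacity.")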
I'd be happy with healthier competition in this sector. I'm about to buy a 3090 and it's already a power hog, so…
Obviously this would have to go hand in hand with ray tracing performance for our usage here. And the software side would have to follow suit; let's see what HIP holds in that regard.
A 5nm process should bring sizable gains in efficiency and performance.
Lovelace, however, is expected to be a monolithic chip, albeit packing a remarkable 18432 CUDA cores, clocked at up to 2.5 GHz. This could mean performance that's up to twice as fast as the current flagship GeForce RTX 3090, potentially making high-framerate 4K gaming a reality.
I highly doubt it. I got a GTX 1070 at its 2016 MSRP, and I was lucky; it was a great deal. The card is now going for more than that, so I don't think the 3000 series will get a significant price cut, tbh.
It is obvious that someone is trying to bring the prices down by badmouthing them.
With a 5nm process this should be a good improvement over the current 3xxx series.
The NVIDIA AD102 "ADA GPU" appears to have 18432 CUDA cores based on the preliminary specs (which can change), housed within 144 SM units. That is almost twice the core count of Ampere, which was already a massive step up from Turing. A 2.3-2.5 GHz clock speed would give us 85 to 92 TFLOPs of FP32 compute performance. This is more than twice the FP32 performance of the existing RTX 3090, which packs 36 TFLOPs of FP32 compute power. https://wccftech.com/nvidia-geforce-rtx-40-series-graphics-cards-ada-lovelace-gpu-tsmc-5nm-2022-launch/
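For reference, those TFLOP figures follow from the usual peak-FP32 arithmetic (CUDA cores × 2 FLOPs per clock for a fused multiply-add × clock speed); a minimal sketch, assuming the rumored core counts and clocks are accurate:

# Peak FP32 throughput: cores * 2 FLOPs/clock (one fused multiply-add) * clock.
# The AD102 core count and clocks below are leaked/rumored figures, not confirmed specs.

def peak_fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz / 1000.0  # GFLOPs -> TFLOPs

# Rumored AD102 ("Ada Lovelace") configuration
print(peak_fp32_tflops(18432, 2.3))    # ~84.8 TFLOPs
print(peak_fp32_tflops(18432, 2.5))    # ~92.2 TFLOPs

# Shipping RTX 3090 (GA102) at its official boost clock
print(peak_fp32_tflops(10496, 1.695))  # ~35.6 TFLOPs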
I hope they succeed; the prices are insane.
I just bought a new computer and the GPU alone cost more than the CPU + cooler + mainboard + RAM + SSD combined.
It's nice to see that AMD is able to build proper gaming GPUs, but I don't have much hope that they'll be able to match Nvidia when it comes to GPUs that are great for 3D work.
I haven't bought an AMD graphics card in 10+ years, and I fear that won't change soon.