I want to play a little game of making completely unsubstantiated (wink) claims about next-gen GPUs and see how far off I will be in a year. Please do not take what I’m writing too seriously. Also, post your own prophecies.
The 5090 will easily break a 25000 score in Blender Benchmark. The FE will be 4-slot, have two 16-pin connectors, and come with 24GB of VRAM. MSRP: $1899.
The top RDNA4 model will finally tie the 3090 in Blender. It will tie it in price too, which will make it only about 3 times worse in perf/$ than the 5090.
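(Rough sanity check on that ratio, assuming a 3090 scores somewhere around 6000 in Blender Benchmark and the RDNA4 card matches the 3090’s $1499 launch MSRP: the 5090 above would give 25000 / 1899 ≈ 13.2 points per dollar, the RDNA4 card 6000 / 1499 ≈ 4.0, so a factor of roughly 3.3.)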
Arc Battlemage will outsell RDNA4 2:1. It will also have worse performance per $ in Blender than Alchemist. The top model will have 12GB of VRAM.
It is rumored that this may have been the intended cooler design for the RTX 4090 Ti (which Nvidia canceled, possibly because the technology to build something even more powerful without a prohibitive TDP does not yet exist).
What does that mean for predictions? Well, there is an opening for a product that is efficient on thermals and power while not compromising on VRAM or bus width.
AI craze swallows all chip capacity and raises GPU prices
Nvidia chases it, leaving gaming and every other compute workload sitting on the sidelines
AMD can also raise prices because of the vacuum in gaming and other compute, but their software stack is still very lacklustre for anything outside gaming. It doesn’t matter if they have more GFLOPS on paper if nobody can easily use them.
Inflation will rear its head again, increasing GPU prices some more
At least that’s how I see it. Might be 100% wrong. Who knows. The AI companies at this moment can’t find any H100s (all sold out), so they get 4090s.
Software researchers discover an entirely new rendering paradigm (say “paradigm” a lot and you’ll get a promotion at work) that is 10,000 times faster than anything before. This makes GPUs obsolete, and hardware companies instead compete to provide petabyte memory capacities and terabit bus widths to feed planet-sized CG scenes to the software. Nvidia and AMD use AI to create advertisements claiming use of the new software causes leprosy.
Gaming and CG enthusiasts (high-end PC rigs) can’t even compete with sectors such as finance, medicine, research, industry, the military, and AI. The gaming and CG sector is only a tiny fraction of the market and has minuscule strategic importance, because it is considered leisure and entertainment.
This also has a side effect: since more and more effort and research is being put into AI, it is inevitable that AI-based solutions will thrive from now on.
On one hand, we can definitely say that hardware has already peaked and can’t get any faster.
Then the focus immediately shifts to neural networks, because these systems tend to hold pre-calculated values and offer near-immediate answers without recomputing from scratch (provided there is significant training first).
In some ways this switch to AI seems legitimate, and it really helps. The real question, though, is what happens from here if the switch to AI-oriented graphics passes 70% and becomes permanent. By this logic, that marks the end of WYSIWYG-style graphics representation, and I have no idea what dangers that brings.
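To make that “pre-calculated answers” idea concrete, here is a toy sketch, with a plain lookup table standing in for a trained network (the table, grid step, and function names are all hypothetical): pay the compute cost once up front, then answer queries cheaply from stored values.

```python
import math

# "Training": precompute sin(x) once over a fixed grid (the expensive step).
GRID_STEP = 0.001
TABLE = [math.sin(i * GRID_STEP) for i in range(int(2 * math.pi / GRID_STEP) + 1)]

def sin_lookup(x: float) -> float:
    """'Inference': answer from stored values instead of recomputing."""
    x = x % (2 * math.pi)               # wrap into the precomputed range
    return TABLE[round(x / GRID_STEP)]  # nearest-neighbour lookup

print(sin_lookup(1.0), math.sin(1.0))   # nearly identical answers
```

The accuracy of the lookup is bounded by the grid resolution, much like a trained network’s accuracy is bounded by its training, which is exactly the trade-off being made when rendering leans on pre-trained models instead of direct computation.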