2024 GPU market predictions

I want to play a little game of making completely unsubstantiated (wink) claims about next-gen GPUs and see how far off I will be in a year. Please do not take what I'm writing too seriously :slight_smile: . Also post your own prophecies.

  1. The 5090 will easily break a 25,000 score in Blender Benchmark. The FE will be four slots wide, have two 16-pin connectors, and carry 24 GB of VRAM. MSRP: $1,899.

  2. The top RDNA4 model will finally tie the 3090 in Blender. It will tie it in price too. That will make it only about three times worse than the 5090 in perf/$ (quick arithmetic after the list).

  3. Arc Battlemage will outsell RDNA4 two to one. It will also have worse performance per dollar in Blender than Alchemist. The top model will have 12 GB of VRAM.
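A quick sanity check of the perf/$ claim in prediction 2. The 5090 figures come from prediction 1; the 3090-class Blender score (~5,000) and the $1,499 launch MSRP are my own assumptions, not from the post, so treat this as a back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of the "three times worse perf/$" claim.
# The 5090 figures are from prediction 1; the 3090-class score (~5,000)
# and its $1,499 launch MSRP are assumptions, not from the post.
rtx_5090 = {"score": 25_000, "price": 1_899}
rdna4_top = {"score": 5_000, "price": 1_499}  # "ties the 3090" in score and price

ppd_5090 = rtx_5090["score"] / rtx_5090["price"]     # ~13.2 points per dollar
ppd_rdna4 = rdna4_top["score"] / rdna4_top["price"]  # ~3.3 points per dollar

print(ppd_5090 / ppd_rdna4)  # ~3.9, so roughly the "three times worse" claim
```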

Also take a look at a possible next-generation (industrial-size) cooler from Nvidia:
Unseen Nvidia Quad-Slot GPU Cooler Prototype Revealed with Additional Heatsink Fan (guru3d.com)

It is rumored that this may have been the intended cooler design for the RTX 4090 Ti, which Nvidia canceled, possibly because the technology to build something even more powerful without a prohibitive TDP does not yet exist.

What does that mean for predictions? Well, there is an opening for a product that is efficient on thermals and power while not compromising on VRAM or bus width.

I am not exactly sure about 2024 or 2025, but in general I feel that there is a turn towards AI for all problems related to speed.

I can see, for example, that AI upscaling is now considered the norm; anyone who tries to render natively in 4K might be out of their mind.

The next most reasonable thing is to attack the problem of ray tracing and use AI there as well to optimize speed.

This is more of a wishful comment than a prediction, but I really hope there will be more competition on the GPU market. Like, Intel ARC, come on.

I mean, ~$15k for an RTX 4080. lol, nope.

4 Likes

My guesses for the next 12 months:

  • The AI craze swallows all chip capacity and raises GPU prices
  • Nvidia chases it, leaving gaming and every other compute workload sitting on the sidelines
  • AMD can also raise prices because of the vacuum in gaming and other compute, but their software stack is still very lacklustre for anything outside gaming. It doesn't matter if they have more GFLOPS in theory if nobody can easily use them.
  • Inflation rears its head again, raising GPU prices some more

At least that's how I see it. Might be 100% wrong. Who knows. The AI companies at this moment can't find any H100s (all sold out), so they buy 4090s instead.

2 Likes

We are going to need a bigger boat!

Software researchers discover an entirely new rendering paradigm (say "paradigm" a lot and you'll get a promotion at work) that is 10,000 times faster than anything before. This makes GPUs obsolete, and hardware companies instead compete to provide petabyte memory capacity and terabit bus widths to feed planet-sized CG scenes to the software. Nvidia and AMD use AI to create advertisements claiming that use of the new software causes leprosy.

2 Likes

More or less I get the same vibe in general.

Namely, that gaming and CG enthusiasts (with their high-end PC rigs) can't even compete with sectors such as finance, medical, research, industrial, military, and AI. The gaming and CG sector is only a tiny fraction by proportion, and it has minuscule strategic importance because it is considered leisure and entertainment.

This also has a side effect: since more and more effort and research is being poured into AI, it is inevitable that AI-based solutions will thrive from now on.

  • On one hand, we can definitely say that hardware has already peaked and can’t get any faster.
  • Then the focus immediately shifts to neural networks, because these systems tend to hold pre-calculated values and offer near-instant answers that need very little computation at run time (provided there is significant training first); see the sketch after this list.
  • In some ways this switch to AI seems legit, and it really helps. However, the real question is what happens from now on if the switch to AI-oriented graphics goes beyond 70% and becomes permanent. :thinking: By this logic it marks the end of the WYSIWYG type of graphics representation, and I have no clue what dangers that brings.
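To make the "pre-calculated values" point concrete, here is a minimal sketch of the amortization idea in plain NumPy (the function, network size, and training numbers are all illustrative assumptions): pay a large one-time training cost to fit a tiny network to an expensive function, then answer every later query with a couple of cheap matrix multiplies.

```python
import numpy as np

# Stand-in for an expensive rendering quantity along a 1D slice.
# In a real renderer this could be something costly like a
# path-traced radiance estimate; here it is just a toy function.
def expensive_function(x):
    return np.sin(3 * x) * np.exp(-x**2)

rng = np.random.default_rng(0)

# One-time "training" cost: evaluate the expensive function many times...
X = rng.uniform(-2, 2, size=(4096, 1))
Y = expensive_function(X)

# ...and fit a tiny 1-16-1 MLP to it by gradient descent (the "pre-calculation").
W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(10_000):
    H = np.tanh(X @ W1 + b1)      # hidden layer
    P = H @ W2 + b2               # network prediction
    err = P - Y
    # Backpropagation of the mean-squared error.
    dP = 2 * err / len(X)
    dW2 = H.T @ dP; db2 = dP.sum(0)
    dH = dP @ W2.T * (1 - H**2)
    dW1 = X.T @ dH; db1 = dH.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# At "render time" every query is just two small matrix multiplies,
# no matter how expensive the original function was to evaluate.
def fast_approx(x):
    return np.tanh(x @ W1 + b1) @ W2 + b2

x = np.array([[0.5]])
print(expensive_function(x), fast_approx(x))  # should be roughly close after training
```

Neural denoisers and upscalers like DLSS make the same trade-off at a much larger scale: compute is spent once during training so that inference at render time stays cheap.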