The future of 3D may not be RTX after all – disappointing sales

Nvidia is starting to find out that people are not buying into the RTX hype nearly as much as they expected. The computing community suggests it is partly due to the lack of games using it, and partly due to people not being willing to spend over 1000 bucks for a high-end model and nearly 500 for the mid-range.

However, Nvidia is insisting that the only reason Turing is not selling well is the economic downturn in China. In other words, they believe the technology and the prices are fine the way they are.

From my point of view, it was a good thing I managed to get a Pascal card before they were discontinued, as the 20xx series currently does not seem like a good deal across the board.


Apple is discovering much the same thing concerning its latest (and most expensive) iPhones: there simply comes a point where people are not willing to spend $1,000+ for a telephone, no matter how “wonderful” it is. (And even if it really is.)

Also, people realize just how quickly “wonderful” becomes “the norm.” The iPhone that’s in my pocket right now was “the latest thing” a mere three years ago and it cost four times as much, then, as I paid for it at the phone store – yet, it’s exactly the same piece of equipment, and as “wonderful” as it ever actually was.


With me being able to pick up a brand new Vega 64 Air Boost for $390, compared to an overpriced 1070 Ti (what a joke) at $459 (the lowest I have seen) brand new… I can’t see Nvidia improving sales any time soon.

edit: plus the fact that crypto-mining demand is no longer as strong

It is reasonable to expect that as underdeveloped countries develop, there will be increased demand for technology and economic growth. The problem now, however, is whether they will develop at all, given the rise in global instability.

I always hesitate when it comes to Radeon GPUs, but the Radeon VII, with a rather insane 16GB of memory, is on the horizon, and I think I will go team red for the first time. I wanted to ask – is your Vega 64 reliable? No crashing while rendering or any other Blender-specific problems?

…it takes anything I throw at it… but it’s power hungry…
I am not a real tech-savvy/geeky person… but I can tell you that it does better than my main rig with a plain 1080… though that might be because I can’t get along with CUDA and its antics when dealing with VRAM, especially the errors it throws, which I never see when rendering with an AMD card using OpenCL. edit: And I have not experienced any out-of-memory crashes with AMD either.

Observing how ‘gaming’ PCs are steadily being replaced… I think Sony will again deliver the final blow (as it did with portable radios (vacuum tube vs. transistor), LCD TVs…), but let’s see what this year’s “rise of the game consoles” brings.
Also, with a recession approaching, I highly doubt Nvidia will become an option for the masses – it’s more of an elitist prestige item.


Well, it’s not actually a secret that China is a bubble waiting to burst. The question is only when and how.
I would not pay too much attention to that market if I were you, mainly because there is a new kid on the block, and it is here to eat your CPU and GPU in nanoseconds.

The new kid is called AI, and it needs a ton of computation, adds a ton of value to applications, and is growing very fast. Modern AI relies heavily on GPUs and CPUs, so even if everything else fails – the game market suddenly stops being demanding, Hollywood stops making heavily CGI movies – you can still bet on the rise of expensive GPUs, for the simple fact that the software industry dwarfs Hollywood and the game industry combined.

Economies, markets, technology and life itself have their ups and downs. So it’s business as usual.

When it comes to humans, enough is never enough.

Well, RTX might fail for gaming, unless this year we get something like 20 games with ray tracing and DLSS upscaling that show worthwhile performance.

I can’t agree that RTX is not the future (note I don’t have a single RTX card). It is the next thing, just maybe not in 2018/2019 – more like 2020 :slight_smile:

After all, Quake 2 is an impressive ray-tracing showcase. :slight_smile:

The RTX 2060 came out a week ago and it looks promising. For that price point it’s really good. However, buying a GPU for over €400 just for gaming is madness. The only thing holding it back is the 6GB of memory.


Yeah, I’m sitting this generation out. I’m not buying into an expensive product that will be outdated by the time the software support materializes.

From an Otoy (Octane) tweet:

Experimental OctaneBench 2019 (with RT Core + Vulkan) pushes the RTX GPUs into the 400’s without OC. Pure shadow raytrace speed (10-40% of total OctaneBench score) is now up to 3x faster on the same 2080 GPU with “RTX on”.

At least for GPU rendering, RTX cards will give an excellent ROI. They will likely pay for themselves, and then some, for 3D artists who render a lot. I plan on getting three this year.


Yup, and I’m eager to see if Blender and other rendering engines implement this support.

It seems a lot of companies, when they become market dominant, forget the basics of economics – like the fact that the sweet spot of maximum profit is normally not at the highest price per unit. The higher the price point, the more potential customers you exclude.

As noted by others above, this is really important in tech products, where people are very aware that this year’s awesome is next year’s mid-range.

For instance, the PC I built late last year has a Ryzen 2700x in it. Because I could afford one.


I had been strongly considering getting a Vega card, but changed my mind and got a 2070 for when the free version of Octane is released. So far I’ve been happy with it. Yes, it is expensive (and over here we pay far more than in the US) but it has helped to speed up my rendering which of course means getting work to clients faster.

I think it’s jumping the gun to say that due to struggling sales the technology is a failure. What would probably be more accurate to say is that Nvidia should have waited until everything was ready for a huge RTX launch instead of rushing to the market with an expensive new product no one could take advantage of. If they waited a few more months for some actual RTX games they’d probably be better off.

So basically the fault lies with Nvidia, not with RTX itself.

I just read this:

I’m not exactly happy that I was suckered into buying an “iPhone 1,” which didn’t even have a GPS, just because I thought that Steve Jobs’ (RIP :cry: …) sales-pitch was so damned good.

But I can say this – “I will never spend $1,000+ for a telephone!” (Even though I have an engineer-friend who snaps-up every new model and thinks nothing of the price in his quest for “cool.”)

“Build it, and they will come?” Well, “usually not.”


It’s even possible to get a 2x speedup for all CUDA cards now, for much cheaper and it will make Blender awesome for all in a year. But it’s top secret.


A lot of supply and a shortage of demand means cheaper cards for me.


Because people know they would be paying $1,000+ just to become a beta-tester of a new technology that will only reveal its full potential years later.