Nvidia Ampere: RTX 3000 graphics cards set for August reveal

Rumours suggest the upcoming RTX 3080 Ti will be 40% more powerful than the current RTX 2080 Ti. Considering the latter is currently the most powerful consumer graphics card by a big margin, this is incredibly impressive and could see 4K frame rates reach new levels.

They say that the RTX 3080 will be +40% over the RTX 2080's performance,
so roughly +25% compared to the 2080 Ti.
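
A quick back-of-the-envelope check of that in Python, assuming the 2080 Ti is roughly 13% faster than the plain 2080 (a ballpark figure from typical benchmarks, not from the article):

# Sanity check: does "+40% over the 2080" really mean "~+25% over the 2080 Ti"?
# The 2080 Ti's ~13% lead over the 2080 is an assumed ballpark figure.
rtx_2080 = 1.00           # baseline
rtx_2080_ti = 1.13        # assumed ~13% faster than the 2080
rtx_3080_rumour = 1.40    # rumoured +40% over the 2080

gain_over_ti = rtx_3080_rumour / rtx_2080_ti - 1.0
print(f"Rumoured 3080 vs 2080 Ti: +{gain_over_ti:.0%}")  # ~+24%, so "roughly +25%" checks out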

since both the PS5 and Xbox Series X will support the technology

smells like bs.

This just in!

https://pokde.net/system/pc/gpu/rtx-3080-ti-2x-rtx-2080-ti/

I'm very much looking forward to the RTX and tensor parts even though I already have a 2070S. Having seen Quake 2 RTX, Minecraft RTX and Control (with DLSS 2.0), I can hardly go back to graphics without ray tracing.

I wouldn't quite trust the quadrupling of RT cores to also mean quadrupled RT performance, since, just like CUDA cores, they might not be apples-to-apples between Turing and Ampere. But it looks pretty good on paper. Finally something worthwhile to swap my 1080 Ti for.
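
To make the apples-to-oranges point concrete, here is a toy throughput model; the core counts, clocks and rays-per-clock numbers are invented placeholders, not real Turing or Ampere specs:

# Toy model showing why 4x the RT cores need not mean 4x the RT performance.
# All figures below are made-up placeholders for illustration only.
def rt_throughput(cores, clock_ghz, rays_per_core_per_clock):
    # crude estimate: total rays/s scales with cores * clock * per-core rate
    return cores * clock_ghz * rays_per_core_per_clock

old_gen = rt_throughput(cores=68, clock_ghz=1.55, rays_per_core_per_clock=1.0)
# Quadrupled core count, but assume each new core does half the work per clock:
new_gen = rt_throughput(cores=272, clock_ghz=1.70, rays_per_core_per_clock=0.5)

print(f"Naive core-count ratio:    {272 / 68:.1f}x")           # 4.0x
print(f"Modelled throughput ratio: {new_gen / old_gen:.1f}x")  # ~2.2x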

I’m interested in the RTX 3060. I hope the prices will compete with AMD.

In terms of that article, I anticipate the biggest leap in performance will be in the area of RTX (because the 2xxx series was just gen 1). I would be surprised if the cards have double the performance for traditional scanline graphics.

I was very impressed with the performance increase when I swapped my old 1070 for a 2070. It rendered some scenes nearly twice as fast.
Added a 2080 shortly after and was disappointed, as it is only marginally faster than the 2070 while being a lot more expensive.

Not quite sure what you want to tell me here, but sure, I agree, a 2080 isn’t very good bang-for-buck. I don’t do much rendering on the computer I work on (I use cloud services when needed), so I don’t feel very strongly about having the latest and greatest rig.

That said, the new Ampere cards ought to be quite nice for playing Cyberpunk on the bigass 4k oled I use instead of a monitor, so I’m looking forward to that.

I don't believe that slide that puts the 3080 Ti on the GA100 chip; I just don't see it. xx80 Tis and Titans have historically been on the 102 die, and I'm pretty sure nVidia haven't used the full die for any consumer card, at least not for many years.

If I'm wrong, there is only one reason nVidia would do this, and that's because they fear Big Navi. If Big Navi, with 80+ compute units plus RT, is more than 2x the performance of the 5700 XT, nVidia might need to roll out the 100 die at the enthusiast-card level to maintain the performance crown, but they'll certainly lose out on price/performance.
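
For what it's worth, the ">2x the 5700 XT" hope is just scaling arithmetic. The 5700 XT's 40 CUs are real; the clock and per-CU uplift factors below are pure assumptions:

# Idealised scaling estimate for an 80-CU Big Navi vs the 40-CU 5700 XT.
# Clock and per-CU efficiency uplifts are assumptions, and this ignores
# memory bandwidth, which usually keeps real scaling below the ideal.
navi10_cus = 40            # RX 5700 XT
big_navi_cus = 80          # rumoured
clock_uplift = 1.10        # assumed higher boost clocks
per_cu_uplift = 1.05       # assumed small per-CU efficiency gain

scaling = (big_navi_cus / navi10_cus) * clock_uplift * per_cu_uplift
print(f"Idealised scaling over the 5700 XT: {scaling:.2f}x")  # ~2.3x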

I do hope AMD have found a way to compete and shake up the GPU market like they shook up the CPU market, because the quickest way to $2k-$3k consumer GPUs is to let nVidia's domination continue.

Just surprised that the 20 series wasn't worthwhile for you. But if you don't render on your workstation, that's understandable.
I usually use cloud services as well, but earlier this year I had a project with prohibitively large caches, so I had to render most scenes at home, and there the 20 cards were worth every cent.
