Next-Gen GPU

Hey everyone, this has been running wild on the internet lately:
it seems the RTX 3000 series GPUs are leaking? We've got specs and photos.

Card                  RTX 3080 Ti    RTX 2080 Ti
Architecture          Ampere         Turing
CUDA Cores            8192           4352
RT Cores              256            68
Boost Clock (MHz)     1750           1545

Seems a bit high to me (3.7x faster with RTX rendering, hmmm). Maybe fake; take it with a grain of salt. They have no reason to pull that off, since their main opponent is still 6 ft underground.
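For what it's worth, that "3.7x" figure looks like it is just the RT-core count ratio from the table above. A quick sketch of the ratios (naively assuming performance scales linearly with unit counts, which real GPUs never quite do):

```python
# Leaked specs from the table above (RTX 3080 Ti first, RTX 2080 Ti second).
cuda_cores = (8192, 4352)
rt_cores = (256, 68)
boost_mhz = (1750, 1545)

# Naive generation-over-generation ratios.
cuda_ratio = cuda_cores[0] / cuda_cores[1]    # ~1.88x
rt_ratio = rt_cores[0] / rt_cores[1]          # ~3.76x, the "3.7x" figure
clock_ratio = boost_mhz[0] / boost_mhz[1]     # ~1.13x

print(f"CUDA: {cuda_ratio:.2f}x  RT: {rt_ratio:.2f}x  clock: {clock_ratio:.2f}x")
```

So the claim is internally consistent with the leaked specs, even if counting cores says nothing about real-world rendering throughput.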

Still waiting for Eevee + RTX btw

It may mean that it's fake, but it could also mean that AMD will release very fast cards with very fast ray-tracing cores.
At present it is not very smart to buy RTX cards.

Each year AMD brings massive hype and always fails to deliver (in the high end).
I doubt that this year will change.

It's long been rumoured that the first gen of RTX was a fledgling technology and the next gen would be a BIG step up. It's also well known that AMD is targeting the high end with its RDNA2 architecture, and nVidia is keen not to lose the crown.

So these leaks, if true, are not a surprise at all.

The x80 Tis have used 102 dies in recent generations, so I seriously doubt the 3080 Ti is a 100 die.

If nVidia have been forced to move the 3080 Ti to a 100 die, then they must expect AMD's high-end Navi to be extremely powerful. The 5700 XT was extremely good for a tiny die size, and the noise around the next-gen consoles suggests RDNA2 clock speeds are much higher. Simply extrapolate 2x a 5700 XT, add the increased clock frequency, then a bit extra for IPC gains, and you've got a very competitive GPU.
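That extrapolation is easy to put into numbers. The 5700 XT figures below are its real specs (40 CUs, 1905 MHz rated boost); the doubled CU count, the RDNA2 clock, and the IPC gain are pure assumptions for illustration, not leaked data:

```python
# Back-of-envelope Big Navi estimate: double the 5700 XT's compute units,
# scale by an assumed RDNA2 clock bump, then add an assumed IPC gain.
cus_5700xt = 40               # 5700 XT compute units (known)
boost_5700xt = 1905           # MHz, 5700 XT rated boost (known)
cus_big_navi = 2 * cus_5700xt # ASSUMED: the rumoured "2x 5700 XT" (80 CUs)
boost_rdna2 = 2100            # MHz, ASSUMED from console clock rumours
ipc_gain = 1.10               # ASSUMED +10% per-clock improvement

uplift = (cus_big_navi / cus_5700xt) * (boost_rdna2 / boost_5700xt) * ipc_gain
print(f"Estimated uplift over a 5700 XT: {uplift:.2f}x")
```

Under those assumptions you land around 2.4x a 5700 XT, which is roughly 2080 Ti-beating territory; hence the "very competitive GPU" conclusion.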

I think Navi will be extremely capable, but the 3080 Ti will be a 102 die, and nVidia will factory-overclock the balls off it and break their usual 250W TDP to beat it…just. I think Navi will be significantly cheaper than the 3080 Ti, making it the best bang for buck in gaming.

*The 3080Ti may be known as the 3090 BTW. This seems to be the rumour ATM.

If they can pull off their Threadripper tricks all over again, but for GPUs, then hooray :partying_face:

But I'm still sceptical. I've followed AMD GPU announcements since the 900 series era, and every year it's the same "next-gen AMD cards kick Nvidia's high-end ass" BS…

Isn't the 2000 series the first gen of RTX? I can believe higher gains for the second generation than for the ones that follow.

nVidia will probably beat Big Navi in performance; the tech tubers' consensus is that Navi will be very competitive, but nVidia will probably win by breaking their usual 250W TDP limit.

nVidia know there are a lot of fans prepared to pay $1200 for a card that's 5% faster than a $999 AMD card. nVidia have huge mindshare: they managed to sell the 2080 Tis, which were basically beta hardware, for £1800 in the UK, and they're still selling some brand-new 2080 Tis for £1500, which is what I paid for two watercooled 1080 Tis. Insane pricing, and I don't expect the 3000 series to be cheaper.

I hope AMD comes close, because if nVidia continue to dominate there will be cards well over £2000 before long. I'll be very happy to buy a couple of Big Navi GPUs for my new workstation come October, if AMD delivers this time.

Did they? I mean, from a gaming standpoint, everyone hated RTX.

Curious to see what all this RTX technology will become if AMD also releases a solution. I hope they'll work together for once; having multiple ray-tracing solutions would be a nightmare for devs.

AMD’s drivers are just awful, that’s the only reason why I don’t use them. I don’t care how good the card is if my system isn’t stable.

I've heard the drivers are much better of late, but Nvidia has much better software support in general, with CUDA, RTX in 3D apps like Blender, and DLSS, which should provide a large benefit to games that support it. If AMD doesn't improve its overall software support (not just drivers), then regardless of whether it can match Nvidia's hardware performance, I think Nvidia will still have the advantage.

I don't think AMD decides who supports their cards or not, right? Maybe in the world of Open Source, but beyond that companies will decide what systems they want to support.

I'll be curious to see benchmarks from the render engines that will support both hardware companies equally: Octane RTX and Octane X.
Once Octane X is out of closed beta (the CEO has said Octane X is full-featured, done, and releasing very soon) and the next generation of cards is out (Big Navi and the 3090s), then we can get an idea of the two cards' power.

Trying to compare the two companies with the Cycles render engine seems kind of unfair while the BF seems to be moving away from OpenCL and fully to OptiX.

Still cheap when you think that 18 months ago you needed a $100k rig to run the early RTX tech.

Maybe you know the Coreteks YT channel: https://www.youtube.com/channel/UCX_t3BvnQtS5IHzto_y7tbw/videos
It's not your usual YT tech channel; this guy goes in deep, although, like the others, he's more gaming-oriented when it comes to consumer hardware. He has a couple of videos about Nvidia/AMD GPUs. So I don't think AMD will waste its resources on a small market like 3D creators; they're more interested in the server and gaming parts of the market. It's also worth mentioning that both Nvidia and AMD play games with customers (prices); they've already been in court over under-the-table deals. So we'll see, but I don't believe AMD is any more customer-friendly than Nvidia or Intel.
We will continue to pay high prices for high-end GPUs in the future.

AMD is investing heavily in the Blender Dev Fund, and they are also investing heavily in ProRender for all DCCs; Brian Savery at AMD is developing a Blender USD importer and Hydra render delegate, and AMD is creating Hydra delegates for ProRender for all USD-supporting DCCs. I'm sorry, but if you think AMD is not interested in 3D creators, you're pretty ill-informed. The only reason for ProRender's existence is to sell AMD GPUs to 3D creators.

I think ProRender's hybrid rendering modes make it one of the more interesting renderers currently in development. To go from Eevee-style rasterization, to rasterization + ray tracing, right up to full path tracing, all in the same renderer, is unique and incredibly useful from a production perspective.

ProRender is not that popular though.
I wonder why :face_with_monocle:

The statement to which I replied suggested AMD was not interested in 3D creators which is patently false. AMD is investing heavily.

ProRender 1.0 was indescribably horrible in terms of performance, but it was definitely an aesthetically pleasing renderer, much more so than Redshift, for example. ProRender 2.0 is a significant improvement in performance and functionality; it has come a very long way.

I’m not concerned in the slightest by ProRender’s popularity, I don’t need the validation of others to determine what fits my needs. I’m going to reserve judgement until I see ProRender running on the Navi GPUs.

If the RTX 3090 does come with 24GB of VRAM, I'm getting it day one! Maybe there will be a new Titan, maybe not; that's far off in the future. There's also a rumoured 4-5x ray-tracing performance increase, which will be HUGE if it pans out. You normally only get a 40-60% rasterization increase with new cards. I'm always going to be an Nvidia fan; they make the best (though expensive) GPUs, and CUDA is super fast. Also OptiX :stuck_out_tongue:

AMD needs to give nVidia more competition on the high end so the prices come down to earth. Intel jumping on the discrete GPU market may shake things up too.

ProRender? Who uses it anyway… It has been integrated in C4D from the beginning, and NO ONE uses it for serious work. I tried it in R19 and it's a joke… better results can be achieved in Mitsuba. BTW, AMD "accidentally" made ProRender AMD-GPU exclusive; it runs on Nvidia, but very poorly, even on the best Nvidia cards. Very user-friendly, right :wink: I don't know if they've made progress since (I'm stuck on R19), but as far as I know no one takes ProRender seriously.
Be real: Nvidia has a big budget, and more importantly they are a GPU-only manufacturer; they don't split their focus and resources like AMD does. Do you know how many tools Nvidia has developed over the years?
It looks like you didn't watch the videos I suggested… well, they're slow and full of tech details, maybe too boring for you. AMD is interested in all segments, but they simply don't have enough resources to cover all their products. For now they're more focused on other stuff.
I'm an opportunist: AMD or Nvidia, who cares; I'm only interested in price/performance/applicability. Nvidia has many haters, but people forget that they're all the same… just search on YT for videos where AMD's and Nvidia's shady business practices are debunked. So maybe my next GPU will be AMD, who knows?

Despite what you think, ProRender is one of the more widely deployed GPU renderers. You seem to be exercised over a few percentage points; GPU-based rendering is still eclipsed by CPU-based rendering in the real world.

It's funny you say ProRender runs badly on nVidia, because one of the sticks used to beat AMD is that ProRender runs better on nVidia GPUs, so take your pick.

Yeah, Intel has a budget 10x that of AMD, and look where AMD CPUs are today. You seem to be declaring an nVidia victory based on their budget without seeing either company's new GPUs. I am going to reserve judgement until they're released.

You have absolutely no idea what I watch on YT; the only one who is slow appears to be you. You failed to see the huge investment AMD is making that directly benefits 3D creators, no matter which brand of GPU you use. AMD is investing in OPEN cross-platform tools and APIs, which should resonate with Blender users more than most.

I applaud the work @bsavery and team are doing on ProRender and on the USD importer and render delegate for Blender. This is hugely beneficial for everyone, from high-end pipelines to hobbyists exchanging USD assets on a shoestring, all for FREE.
