Good for RDNA3? RDNA 2 refreshes show AMD topping the charts for the high-end

Technically, the new XT cards are refreshes, but there are significant improvements in areas like power/performance.
Sapphire Radeon RX 6950 XT Nitro+ Pure review - Introduction

While Nvidia's RTX cards are superior in ray tracing, for everything else AMD's RDNA2 flagships provide significantly better power consumption, cooler running temperatures, and quieter cooling, along with better overall performance. Then the kicker: the price (while still high) actually makes it look like a pretty good deal compared to the RTX 3090 Ti.

What could this mean for RDNA3 with its MCM design? I would think that all AMD has to do now is improve the ray-tracing performance and continue their work on making their drivers more robust. Either way, it shows the GPU wars on the high end are back, which should hopefully lead to better prices across the board (as it should lead to a price war too).

The signs pointing to AMD becoming a viable option for Blender again (as far as the GPU goes) could not come soon enough.


How optimistic of you, I hope you’re right, but I remain skeptical until I see it.

Nvidia already has nothing in the ~1000 USD price range that even comes close to this card in rasterization performance (so Jensen Huang will now have to explain why gamers and creators would want to pay such a hefty premium just for OptiX). This fact becomes even more of an issue for Nvidia when you factor in how much users will save on their power bills.

That's good to hear. If gamers and other users in the industry adopt it, it will send the right message. I truly hope AMD can repeat with GPUs what they did with CPUs a couple of years back - nobody really wants a monopoly except the monopolists.
Quite frankly, I can't wait until Nvidia gets humbled by true competition.

In my opinion, just like with any new technology, AMD and its MCM design will drive prices higher rather than lower (new tech means a new opportunity to capitalize on it until the competition catches up) - the same thing Nvidia did when they first launched RTX.

Hello, I want to get an RX 6700 XT for my Blender and Unreal Engine work. I know Nvidia is better, but it's more expensive for me.
So my question is: will the RX 6700 XT perform well?

With a score of 1219, you’ll do just fine in Blender with a 6700XT

The 3060 Ti scores almost triple that… so what does the score mean? What is considered low-end and what is considered high-end - is 3000 an overkill score?


Benchmark Score


The Blender benchmark score refers to the hardware's ability to process samples during Cycles rendering. Specifically, it's the number of samples per minute that a CPU or GPU can compute. The higher this number, the better.
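In other words, the score is just samples rendered divided by minutes of render time. A minimal sketch (the sample counts below are made-up illustrative numbers, not from any real benchmark run):

```python
def blender_benchmark_score(samples_rendered: int, render_seconds: float) -> float:
    """Blender benchmark score: samples per minute -- higher is better."""
    return samples_rendered / (render_seconds / 60.0)

# Hypothetical run: a GPU that renders 12,190 samples in 10 minutes
# scores 1219 samples/minute -- roughly the 6700 XT figure quoted above.
print(blender_benchmark_score(12190, 600.0))  # 1219.0
```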

That was not the case with the original Ryzen; the only reason their prices are a bit higher now is that they finally have enough market share to justify an increase in the profit margin.

AMD's GPU division is still in a pretty bad spot in terms of market share, even though RDNA has allowed a lot of recovery from the release of one of the all-time worst GPUs ever to go on sale (Vega 64).

It is worth taking into consideration that the Nvidia card has OptiX, a ray-tracing acceleration technology, enabled - that is why you see the 3060 Ti so far ahead. OptiX roughly doubles the performance compared to running the same card on CUDA. The same sort of boost is coming to AMD cards; however, there is no specific date for when that may happen. Rumor has it, sometime this summer?

That being said, if the 3060 Ti is a bit expensive, perhaps a 3060 is worth a look. It would be a bit slower (2400 points against the 3060 Ti's 3000), but it boasts 12 GB of RAM. I would gladly trade some speed for extra RAM, as the card would have a longer service life before running into memory limitations.
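The trade-off above can be put in numbers. Using the scores quoted in this thread and the cards' advertised VRAM sizes (12 GB on the 3060 vs 8 GB on the 3060 Ti), a quick back-of-the-envelope check:

```python
# Scores quoted above (samples/min) and advertised VRAM (GB)
score_3060, vram_3060 = 2400, 12
score_3060ti, vram_3060ti = 3000, 8

speed_penalty = 1 - score_3060 / score_3060ti   # fraction slower than the 3060 Ti
vram_gain = vram_3060 / vram_3060ti - 1         # fraction more VRAM than the 3060 Ti

print(f"{speed_penalty:.0%} slower, {vram_gain:.0%} more VRAM")  # 20% slower, 50% more VRAM
```

So the 3060 gives up about a fifth of the speed for half again as much memory headroom.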

As for the 6700XT, I would expect it to match up with the 3060 rather than 3060ti, after the “ray tracing” features are enabled.

If the 6900XT is a direct competitor to the 3090, then even after taking RT out of the equation, the 3090 has more performance - if we look at CUDA in Blender.

Here is the Open Data benchmark, CUDA vs HIP on Windows:

A regular 3090 (non-Ti) scores almost 3500 points, while the 6900XT on HIP scores only ~2170. That's over 1300 points of difference without even mentioning hardware ray tracing. A similar gap exists between the 3080 and 6800XT. By that measure, the AMD lineup should be moved one model down to be competitive. Again - I'm talking raw CUDA vs HIP performance in Blender, not gaming. And I think that's a fair comparison from an artist's point of view.
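Working from the rounded scores quoted above, the raw CUDA-vs-HIP gap comes out like this:

```python
# Rounded Open Data scores quoted above (samples per minute)
rtx_3090_cuda = 3500
rx_6900xt_hip = 2170

gap = rtx_3090_cuda - rx_6900xt_hip      # absolute difference in points
ratio = rtx_3090_cuda / rx_6900xt_hip    # relative speed on raw compute

print(gap)                                # 1330
print(f"{(ratio - 1) * 100:.0f}% faster")  # 61% faster
```

That is, on these numbers the 3090 is roughly 60% faster on CUDA than the 6900XT is on HIP, before OptiX enters the picture at all.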

I'm jaded with hardware launches in general. I'll consider a new AMD GPU only if it has day-one support for HIP/HIP-RT (explicitly stated, verified, tested, and benchmarked) and is not a terrible deal price-performance-wise. I'm not going to buy a GPU that MIGHT get some features in the future.


I think one should look at the price rather than the number nomenclature on the box, and in this regard AMD is quite competitive. The 6900XT matches up quite well with the 3080 price-performance-wise, and the 3090 is at least 50% more expensive. Same for the 6800XT and the likes of the 3070.

What is more, after the introduction of HIP-RT, AMD should manage to diminish the advantage of OptiX while offering quite a bit more memory than their price-matched rivals from Nvidia. That may just be worth the wait.

Otherwise, fair point, buy only when you know what you are buying into.


Ahhhh, I hadn't checked the prices for a long time because of the shenanigans around MSRPs. You might be right about this one. So yeah, price-wise AMD does not look bad, and more VRAM is always very welcome.

It's also interesting that the W6800 is above the 6800XT in benchmarks. If that's also the case with HIP-RT, then there will finally be a reason to buy a Radeon PRO. If I remember correctly, PRO cards used to lag a little behind the gaming versions based on the same die when it comes to performance. Also, on the PRO note, I still can't understand why AMD didn't release a W6900 with 32 GB of VRAM.

And yeah. I’m waiting for raytracing support. Without it - no buy.


Regarding the PRO cards versus the gaming versions, I think there may be some issues going on with the benchmarks. Your link shows the 6800XT being nearly half the speed of the 6900XT. However, if searched individually without specifying the OS, the 6800XT scores ~1900 points, while the 6900XT scores a little over 2000.

Under the same search criteria, the W6800 gets a score of 1476 and a gaming 6800 gets 1416. More or less a toss-up between them; it may even come down to the cooling situation.

It appears that if Windows is specified in the search criteria for the 6800XT, the score goes down by some 600 points. Maybe early HIP test runs are dragging the average score down?


Benchmarks or drivers - that's my biggest concern when it comes to AMD GPUs, and why I'm not very hyped about the next hardware release.

On one hand, HIP in Blender is a very new thing and some optimizations might land in the future. But on the other, ROCm was released in 2016. That is six years ago.


You have a point there; the drivers can be hit or miss. That being said, my personal experience with the 6800XT has been very solid, although I bought it approximately a year after launch. No issues to report yet. That is a significant improvement since the days of Vega. Maybe all that money from Zen is finally making a difference.
