The best news is probably that the rumors of these cards being GCN are false; this is a new architecture known as RDNA, or Radeon DNA.
Why that is good: the redesign means noticeably higher performance per clock and a significant reduction in power consumption per clock. In addition, it has more cache and will come with industry-proven GDDR6 memory rather than HBM.
The card shown in the presentation is designed to go up against the RTX 2070, with AMD claiming a 10 percent performance advantage.
I have to say, the confirmed information means Navi is better than I feared it could’ve been. They could have a winner, if only we see an equivalent bump in their driver quality. At the least, there’s a chance this could drive GPU prices down to more reasonable levels.
Unfortunately, in my case, after being a loyal AMD customer since day 1, I recently switched to the green crook, since they are the only ones that get supported in every 3D application.
And I know for a fact that AMD will never get to this level of support, since I have waited almost 10 years and they never did it.
That would be an issue to dampen Navi’s release for sure, but they’re not crap to the point that the Blender devs can’t make them work for the most part, and they tend to be decent enough for Godot to work too. Plus, most of the money to be made is among the millions of people who primarily use their PC for games.
Now the disaster might come if Intel unveils their dedicated GPU models and their drivers are as broken as the ones they release for their integrated graphics (which I can see as a possibility, since they have not seemed to improve much under Raja).
With industry veterans replacing Raja in AMD’s GPU division, it will depend on whether they order a much-needed shot in the arm for drivers. That said, the situation as of now requires getting an Nvidia card if you need some new graphics or rendering feature to work now instead of later.
Let’s hope they can make it, since we all see here how much Nvidia was able to jack their prices so high that even an entry-level card costs an arm and a leg. And keep in mind the ongoing trade war with China, which will certainly make video cards way more expensive very soon.
Vega 64 ~= GTX 1080, which is slightly slower than the RTX 2070.
Radeon VII >= RTX 2070
So the proposed RX 5700, which is slightly faster than the RTX 2070 (in an AMD-favored game), would end up on average = RTX 2070.
In the end, is the RX 5700 on par with or slightly behind the Radeon VII?
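A back-of-the-envelope sketch of that chain, with made-up numbers normalized to RTX 2070 = 1.00 just to make the reasoning explicit (these values are assumptions read off the claims above, not benchmark data):

```python
# Illustrative normalization only: RTX 2070 = 1.00.
# Values are assumptions from the claims above, not measurements.
relative_perf = {
    "GTX 1080 / Vega 64": 0.90,    # "slightly slower than RTX 2070"
    "Radeon VII": 1.00,            # ">= RTX 2070"
    "RX 5700 (AMD's demo)": 1.10,  # claimed 10% lead in one title
}

# Discount the vendor-picked title to guess an average-case figure.
rx5700_avg = relative_perf["RX 5700 (AMD's demo)"] - 0.10
print(f"RX 5700 average-case guess: ~{rx5700_avg:.2f}x RTX 2070")
# ~1.00x, i.e. roughly on par with Radeon VII
```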
Either way, eager to see the June 10th E3 talk to get more insight into how the RDNA architecture differs from GCN. And hopefully some more performance numbers/estimates…
Wonder also how that translates to rendering power… Though OpenCL in Blender still needs further refinements…
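For anyone who wants to test once cards arrive, here’s a minimal sketch of pointing Cycles at OpenCL devices through Blender 2.8’s Python API (using the bundled cycles add-on preferences; run it in Blender’s Python console and adapt as needed):

```python
import bpy

# Select OpenCL as the Cycles compute backend (Blender 2.8x API).
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPENCL"
prefs.get_devices()  # refresh the detected-device list

# Enable every detected OpenCL device.
for device in prefs.devices:
    if device.type == "OPENCL":
        device.use = True

# Render the current scene on the GPU.
bpy.context.scene.cycles.device = "GPU"
```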
Targeting the mid-range with a new RDNA architecture, this is a significant step up from Polaris, with lower wattage and less beefy cooling for the reference cards. AMD claims it can push noticeably higher frame rates than the RTX 2070.
The pricing, though, shows that GPUs are likely going to remain more expensive moving forward: the flagship is under 500 dollars and the lower model is under 400. Nvidia has nothing seriously undercutting this yet, but mid-range is still pricier than it used to be.
These aren’t the heavy-hitting cards AMD has slated for release next year, but overall they have a better price/performance ratio than Nvidia’s “Super” lineup of cards and can at least beat the 2060 Super and the original 2070.
In a stunning upset, these cards, with the technology and software they ship with, actually need less wattage than Nvidia’s Turing cards at the same performance levels. That has not happened in many years.
In a smaller achievement for AMD, they finally managed to produce powerful GPU models that don’t sound like a jet engine, though they might get quieter yet if non-blower style models come out from AIB vendors.
The bad
Nvidia still holds the performance crown overall, and the RTX 2070 Super beats both cards (it remains to be seen whether drivers improve that over time).
They can still run hot, but that might just be due to the blower design AMD likes to use for the vanilla models.
They don’t have RTX-style hardware raytracing or variable-rate shading, even though they have their own tricks. AMD’s reasoning is that realtime raytracing needs to improve and be in more games first, so buyers can instantly make use of it.
Open questions
It remains to be seen whether AMD will follow up with major improvements to their drivers, so that Nvidia stops being the only real choice for DCC app users.
Uh oh, it looks like AMD’s new cards don’t even work if you try to use them with Blender 2.8 (for now anyway). https://developer.blender.org/T66848
AMD’s proud tradition of rocky GPU launches lives on, so hold on to those Nvidia cards until the dust settles. If anything, AMD might be back in the game with hardware, but it won’t matter if the software is bad.
I don’t know what “compute shaders” are. You sound like a graphics programmer, so I’d like to ask you: for gfx programmers, do you have to pick one or the other (Nvidia or AMD)? I.e., are they substantially different if you want to make games or whatever? If you pick one, do you have to stick with it from then on…?