Next-Gen GPU

How would this card even work with water cooling then? Couldn’t the card get hot enough to boil the water going through the tubes?

Ethylene glycol (common antifreeze) boils at around 200 °C. Usually it’s mixed with water and kept under pressure. Water at 2 atm boils at about 120 °C, and with 66% EG the boiling point is around 130 °C.
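For the pressure part of that, here’s a minimal sketch using the Antoine vapour-pressure equation for water, with constants from one commonly tabulated high-temperature fit (the glycol mixture figures above are taken as quoted):

```python
import math

# Antoine equation for water, log10(P) = A - B / (C + T),
# with P in mmHg and T in degrees C (one commonly tabulated fit for ~99-374 degC).
A, B, C = 8.14019, 1810.94, 244.485

def water_boiling_point_c(pressure_atm):
    """Temperature at which water's vapour pressure equals the given pressure."""
    p_mmhg = pressure_atm * 760.0
    return B / (A - math.log10(p_mmhg)) - C

print(round(water_boiling_point_c(1.0), 1))  # ~100 degC at 1 atm
print(round(water_boiling_point_c(2.0), 1))  # ~120 degC at 2 atm, matching the figure above
```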


Reports of the crashes are starting to make the rounds on technology sites.

Judging from all of the techno jargon, the causes of the CTD (crash-to-desktop) issues are complex enough to ensure that they won’t be fixable with driver updates. At first glance it seems like the issues come from AIB vendors cutting corners and/or overclocking the cards, but then again Nvidia has had many architectures where the factory-overclocked models worked just fine for years on end.

AMD may not even need the best-performing product to make serious headway; they simply need to ensure that early adopters get cards that just work, without major issues.


With GDDR6X memory using quaternary (PAM4) signalling instead of the usual binary scheme, the card can’t be treated or overclocked like previous generations, yet board partners are treating it as same old, same old.
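To illustrate the quaternary point: PAM4 puts one of four voltage levels on the wire per symbol, so each symbol carries two bits instead of the one bit of binary NRZ. A toy sketch (the levels and bit-to-level mapping here are only illustrative, not GDDR6X’s actual electrical spec):

```python
# Toy comparison of NRZ (binary) and PAM4 (quaternary) signalling.
# Real GDDR6X levels and mappings differ; this only shows why one PAM4
# symbol carries two bits, i.e. twice the data per transfer.

def nrz_encode(bits):
    """One bit per symbol: 0 -> low level, 1 -> high level."""
    return [1 if b else 0 for b in bits]

def pam4_encode(bits):
    """Two bits per symbol: each pair maps to one of four levels."""
    assert len(bits) % 2 == 0
    levels = {(0, 0): 0, (0, 1): 1, (1, 1): 2, (1, 0): 3}  # Gray-coded mapping, for illustration
    return [levels[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

data = [1, 0, 1, 1, 0, 0, 1, 0]
print(nrz_encode(data))   # 8 symbols for 8 bits
print(pam4_encode(data))  # 4 symbols for the same 8 bits
```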

AMD crushed Intel, and I don’t see why they couldn’t use the same approach to crush Nvidia. On top of that, all they really need are better drivers, which they may or may not already have. If I can render faster with AMD for fewer watts per frame, I’m going AMD.

One thing is for sure: Nvidia is worried. Nvidia has really good spies, and using their spy data they decided to rush the launch of the 30 series with a last-minute boost to its default clocks, massively reduce the price compared with the last series, and put it on sale before they had reasonable stock, all so they could beat AMD to market. The Big Navi 6000 series, at least from a gaming perspective, is probably going to crush the 30 series in every possible way.


Not so long ago I wouldn’t have believed that AMD could create a GPU at RTX 3080 level, but Nvidia’s nervousness may indicate that something’s in the air.
Do not forget there are two big players coming to the GPU market: Intel and Apple.

With Intel graphics it’s funny, because back in August we got one announcement, then in September we got another saying it was only for the mobile market. Their integrated graphics are impressive, but they might be holding off on the discrete GPU side because they don’t feel they can make something better than AMD or Nvidia. They can’t make a better CPU than AMD, and I don’t see how they could make a better GPU.

As for Apple going all-in on ARM, their next desktop looks like it is powered by a tablet chip. Not much innovation there. If you are making a desktop, make a new chip that takes 120 watts and completely destroys your mobile processor. I could not find numbers on how many watts the new chip draws, or how much previous generations of Apple tablet chips draw, but it seems like it should be in the 3-watt range. Talk about a horrid POS. More interesting is what Nvidia will do with ARM now that they are acquiring it. Maybe the 40 series will be all ARM, with a tenth of the power draw and 2x the performance of the 30 series. Everyone will ask why not keep the same power draw for 20x the performance, and Nvidia will say: because we need more money, so we release only small updates even when we make a huge breakthrough. I kind of doubt ARM is that good, though.

Actually, Intel makes a better CPU (10 nm single-core performance) than AMD, but so far that has nothing to do with the HEDT market.

Apple’s tablet chips will power tablets and laptops and not desktops.

I believe you’re wrongly identifying the Apple GPU with ARM cores. Apple’s GPU cores are Apple’s own design and have nothing to do with ARM or Nvidia.

The Apple A13’s TDP is 6 W. I couldn’t find a TDP figure for just the A13’s GPU section, nor any A14 results.

Octane Bench
RTX 3080: OB 564, TDP 320 W, OB/TDP=1.8
RTX 2080Ti: OB 356, TDP 250 W, OB/TDP=1.4
A13 (iPhone): OB 15, TDP <6 W, OB/TDP >2.5
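The OB/TDP column is just the benchmark score divided by the rated chip power; a quick recomputation of the figures quoted above:

```python
# Recompute the OB/TDP column from the figures quoted in this post.
cards = {
    "RTX 3080":     (564, 320),  # (OctaneBench score, TDP in watts)
    "RTX 2080 Ti":  (356, 250),
    "A13 (iPhone)": (15, 6),     # whole-SoC TDP, so GPU-only efficiency is higher
}

for name, (ob, tdp) in cards.items():
    print(f"{name}: OB/TDP = {ob / tdp:.2f}")
# RTX 3080: 1.76, RTX 2080 Ti: 1.42, A13: 2.50
```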


Apple are going to use custom desktop chips, not tablet chips. There’s a huge amount of innovation going on, and I say that as no friend of Apple. Their tile-based rendering approach, where the screen is split into tiles and shared between the available GPU cores, absolutely screams that they’re not going to rely upon a single SoC in Macs but use multiple. You could see higher-end Macs with 4 or 8 SoCs delivering enormous performance from relatively cheap and tiny processors all linked together. The Metal API is already in place and works across multiple GPUs.
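A rough sketch of the tile-splitting idea being described, with the tile size and GPU count picked purely for illustration (not Apple’s actual parameters):

```python
# Carve a frame into fixed-size tiles and deal them out round-robin
# across the available GPUs/SoCs. Numbers here are illustrative only.

def assign_tiles(width, height, tile=64, num_gpus=4):
    """Return a mapping of GPU index -> list of (x, y) tile origins."""
    assignments = {gpu: [] for gpu in range(num_gpus)}
    tiles = [(x, y) for y in range(0, height, tile) for x in range(0, width, tile)]
    for i, origin in enumerate(tiles):
        assignments[i % num_gpus].append(origin)
    return assignments

work = assign_tiles(3840, 2160)  # a 4K frame
for gpu, tiles in work.items():
    print(f"GPU {gpu}: {len(tiles)} tiles")  # each GPU renders roughly a quarter of the frame
```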

The era of Apple Silicon is all about heterogeneous computing and minimising the inefficiencies caused by the CPU and GPU having separate memory pools. Apple Silicon Macs will remove the huge architectural inefficiency of messaging across the PCIe bus, which means being able to do much more with lower-end hardware. The performance uplift will be impressive.

@anon80315389
That’s a pretty myopic view of Intel vs AMD. AMD chose a different design philosophy to Intel and went with high core counts while Intel was content to give customers 4-core CPUs year after year. Each iteration of Ryzen has come with improvements to IPC and clock speed, and come the release of Zen 3 you’ll see AMD stomp all over Intel’s single-core performance, with the advantage of much higher core counts on top.

An AMD official has said Intel is going to require “pain pills” when Zen 3 is released. They’re obviously confident.

AMD went down the road of chiplets: small and cheap to produce, and when combined they make an impressive CPU. I think Apple is doing a similar thing with Apple Silicon, not with chiplets but by combining their small desktop SoCs to form high counts of CPU and GPU cores in a highly efficient heterogeneous system that will probably change computing as we know it.

The days of absurdly power hungry discrete GPUs are numbered.

This info was ripped from a macOS beta:

And this, if it’s to be believed, suggests the Biggest Navi will compete with the 3090.

It is nice to learn new things.
I had read “Using the same Apple A12Z Bionic chip you’ll find [in an $800 iPad Pro], the company showed that a low-power ARM desktop can already handle a variety of power user apps on Mac, including:” and not the whole article. It seems Apple is dipping its toe into the GPU market. Making a low-power GPU is very different from making a high-power one, and I wonder how it will scale. Nvidia tried tiles and failed. Apple’s software engineers are quite good, though.

The OB/TDP figure is very interesting. At 1.8 OB/TDP versus the iPhone’s 2.5, the iPhone is only about 38.9% more efficient (quick check below). AMD seems to be going for 50% better efficiency per release, so the 7000-series cards should beat the Apple GPU. Quite impressive on the AMD side.
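A sanity check on those two numbers, using only the figures quoted in this thread plus AMD’s stated ~50% perf-per-watt target per generation (whether the next release actually hits it is an assumption):

```python
# Figures quoted earlier in the thread (OctaneBench score per watt).
rtx3080_eff = 1.8
a13_eff = 2.5

# How much more efficient the iPhone chip is than the 3080 by that metric.
print(f"{(a13_eff / rtx3080_eff - 1) * 100:.1f}% more efficient")  # ~38.9%

# If a future AMD card lands ~50% better perf/W than the 3080-class figure,
# it would edge past the quoted A13 number.
print(rtx3080_eff * 1.5)  # 2.7 > 2.5
```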

It will be quite fun seeing them duke it out.

Let’s play this game for fun. You can also interpret it this way (a quick back-of-envelope version in code follows the list):

  1. Nvidia’s jump from the 20x0 generation to the 30x0 is 0.4 OB/TDP, so Apple, with its one-year-old GPU, is almost two generations ahead of Nvidia’s newest chips :slight_smile:
  2. TDP = 6 W is for the A13. Let’s say the 5 nm process gives 30% better efficiency. That means TDP = 4.2 W for the same performance and OB/TDP = 3.6, so Ax at this point is 2x more efficient than Nvidia’s 30x0 :slight_smile: :slight_smile:
  3. The TDP for Ax was given for the whole chip. Let’s assume 70% of it goes to the GPU. That gives OB/TDP of about 5.1 for the Apple Ax, roughly 2.8 times the efficiency of Nvidia’s 30x0, which means a 5 nm Apple GPU could have a TDP of about 110 W and compete with the RTX 3080 :slight_smile: :slight_smile: :slight_smile:
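The same chain in one place; every number here is one of the assumptions from the list above, not a measurement:

```python
# Back-of-envelope chain from the list above; all inputs are assumptions.
a13_ob, a13_tdp = 15, 6.0   # OctaneBench score and whole-chip TDP quoted earlier
rtx3080_eff = 1.8           # rounded OB/TDP figure for the RTX 3080 from above

eff_whole_chip = a13_ob / a13_tdp       # 2.5 OB/W, whole chip as quoted
eff_5nm = a13_ob / (a13_tdp * 0.7)      # assume 5 nm buys ~30% better efficiency -> ~3.6
eff_gpu_only = eff_5nm / 0.7            # assume the GPU is ~70% of that power budget -> ~5.1

print(round(eff_whole_chip, 1), round(eff_5nm, 1), round(eff_gpu_only, 1))  # 2.5 3.6 5.1
print(round(eff_gpu_only / rtx3080_eff, 1))   # ~2.8x the 30x0's efficiency
print(round(564 / eff_gpu_only, 1))           # ~110 W to match an RTX 3080's OctaneBench score
```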

It should be noted that the competition is trying to bury all the enthusiasm around the Nvidia cards of these past weeks under explosive mines …
Does that make sense?
We are in the midst of a no-holds-barred war out there … :wink:

nVidia have managed to kill the enthusiasm all by themselves and have needed precious little help from AMD.

That memory capacity and the TDP of the 6900 XT look really nice, depending entirely on how well AMD can deliver on compute this time. If they have something to compete with CUDA for rendering and deep learning, it could be really compelling.

To be precise: not Nvidia but ‘Nvidia partners’, the assholes who wanted to save a few bucks on capacitors in graphics cards costing hundreds of dollars.

Everything you need to know about the reported 30-series problems with some AIB cards.

That wasn’t what I meant when I said nVidia had killed the enthusiasm.

Precisely then,

nVidia chose a cheap half-node improvement instead of a comparable 7 nm node, which has caused power consumption to rocket and ensured that some of the biggest GPU dies ever made are being produced.

10GB on their ‘flagship GPU’ (Jensen’s words, not mine). 10GB is designed to be too low for future 4K gaming, so it pushes the enthusiast-level gamer towards a GPU that’s over 2x the price and only 5-10% faster.

Ridiculous power requirements for the 3090 mean dual-GPU setups will have to be watercooled, at £2000 each. That’s a long way from the MSRP. Remember when the MSRP was always the higher price and the street price was cheaper? Blower versions of these cards will sound like hair dryers.

The launch that was not a launch: it appears little more than review cards could actually be manufactured, especially where the 3090 numbers are concerned.

None of this has anything to do with board partners; the enthusiasm has been hoovered up by nVidia and nVidia alone. The crashes are just the cherry on top of a rather crusty shit.

Maybe you are right, maybe not; it’s probably half and half. However, and this is just my feeling, I don’t remember such an aggressive campaign about hardware defects, and I think this: 3-4 GPUs not working well are enough to raise a media fuss :grin:

You think it’s only 3-4 GPUs that are crashing?

Is that what you seriously believe?

No, I said that they are enough to raise a media fuss; that’s different.

I don’t think Nvidia is stupid enough to send out GPUs for review that don’t work well.

I don’t even think there are that many of these GPUs in circulation.

The media fuss has been caused by nVidia, with a non-launch, tiny levels of available stock, and no date for restocks. It’s also been caused by nVidia’s own 8K gaming marketing hyperbole, which has been shown to be bullshit. The tech channels have been handed absolute gifts by nVidia to fire back with interest.

Why is it a problem that the media are calling out nVidia? Isn’t that what an independent media is supposed to do? Are you insinuating the crash situation is contrived now? Personally, I’m pleased to see many channels taking nVidia to task over their bullshit marketing claims, because the way YouTube works it’s very often financially beneficial to shut up and make a promo video dressed up as a review.

Bizarrely, AMD haven’t sought to capitalise on nVidia’s screw-up. They have remained silent as far as I can see, only replying directly to nVidia fanboys who have tried to claim AMD will do a similar paper launch.

If I were AMD I’d be piling on the pressure, making sure everyone and their cat knows they’ll be able to buy powerful new AMD GPUs on or around launch day.