Intel unveils Broadwell; this may mean AMD losing the APU crown too

At this point, there is absolutely nothing AMD has that is ahead of either Nvidia or Intel, and now even their ATI GPUs are getting beaten by Intel’s integrated solutions.

Rumors are already swirling that AMD will be out of business within the next year or so as Intel races onward in its quest to someday even beat Nvidia and take the GPU crown as well (they already had to sell property to raise money).

With Skylake coming out in a few months, AMD may end up consigned to the dustbin of history, forever.

Overall though, with the information currently out there, does anyone think that Intel can make a bigger leap with Skylake than they did with Haswell?

AMD is doing fine as far as components go (video game consoles, for example), but yeah, this was to be expected. Intel always has something up their sleeve to stay ahead. I wouldn’t be surprised if this is what pushes AMD to sell itself to Samsung, which has made an offer.

I’m fine as I already have a 5820k, but I’ll probably make another rig for the gf with that hardware. Good times!

I did not understand that. Are you referring to dedicated ATI graphics cards? Where can I find a performance comparison between Intel APUs and dedicated ATI cards?

I think AMD’s CPUs/APUs have their place in the market among all those who can’t afford Intel. Around the world, that is a big market. So I hope that AMD never goes bankrupt; an Intel/Nvidia monopoly would not be good for hardware prices.
That’s regarding CPUs and APUs. Regarding dedicated graphics cards, I’ve seen the results with the new OpenCL kernel split in Blender, and ATI is pretty good.

Here’s what AMD is hoping to counter Intel with

If they can last until their Zen chips are finished, then they might finally have something that can once again compete with Intel’s offerings (which should perform on par with the i7s if they work as advertised).

I think HP may be on its way to bypassing GPU tech entirely… with photonics.

If Nvidia and Intel are still playing with electrons and HP comes out with a fully photonic computer… we will have a new king.

"Machine will push the boundaries of the physics behind IT by leveraging electrons for computation, photons for communication, and ions for storage, and this new computer architecture could potentially be 6x faster while requiring 80x less power, according to Fink.

The company plans to have the first prototype of Machine ready by the end of 2016. More information about Machine is available here."

That’s a pretty interesting read there, BPR. Thanks.

Oh man, I hope Intel steps it up with a GPU that can compete with AMD’s and Nvidia’s. They’re the only manufacturer to provide open-source GPU drivers, and using them on Linux is great.

Nice thread title, William Randolph Hearst

This isn’t buzzfeed, you don’t need to hook in readers with a catchy title. It makes reading the forums worse when you post things like this. This is probably the 10th time someone has said this to you. I’m not an admin, I’m just a citizen of the forums. But could you please consider your tone when posting? It would be much appreciated if you did.

AMD has been on life support for years :wink:

When you look at the facts, AMD has even been beaten in the budget computing and budget GPU departments by Intel’s new Pentiums and Nvidia’s Maxwell chips. This was the last area where AMD was actually ahead, and they now face the unprecedented prospect that there may be no place left for them to exist at all.

If the sales of the various game consoles start slowing down and they can’t pull a massive rabbit out of their hat soon, they’re either finished or they get bought out by someone who may steer them away from supporting hardware for PCs.

I’m currently pondering whether I should replace the AMD card that came with my machine with an Nvidia GPU (preferably a GTX 960, as I’m not a huge gamer). I’m really not sure how much longer it will even be supported, given their troubles.

Those CPUs offer about twice the CPU performance and less than twice the GPU performance compared to the A10-7850K. They also cost at least twice as much. They are so much faster (and more expensive) because of embedded DRAM. Intel has been offering similar chips in BGA (soldered) packages for a while, so it’s not big news. The fact that they waited so long to offer these on the desktop tells me it’s not an important market for them. Whatever (if anything) is going to bankrupt AMD, these chips won’t be it.
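To put those relative numbers in perspective, here is a rough back-of-the-envelope sketch. The 2x multipliers are the claims from the post above; the ~1.8x GPU figure ("less than twice") and the baseline street price for the A10-7850K are assumptions purely for illustration:

```python
# Rough value comparison based on the relative figures above.
# Baseline: AMD A10-7850K, normalized to 1.0 for CPU and GPU performance.
# The 2x CPU/price multipliers come from the post; the 1.8x GPU figure
# and the $130 baseline price are assumptions for illustration only.

baseline_price = 130.0  # assumed USD street price of the A10-7850K

chips = {
    "A10-7850K":         {"cpu_perf": 1.0, "gpu_perf": 1.0, "price": baseline_price},
    "Broadwell + eDRAM": {"cpu_perf": 2.0, "gpu_perf": 1.8, "price": baseline_price * 2},
}

for name, c in chips.items():
    # Normalized performance per dollar, relative to the AMD baseline.
    cpu_per_dollar = c["cpu_perf"] / c["price"]
    gpu_per_dollar = c["gpu_perf"] / c["price"]
    print(f"{name}: CPU perf/$ = {cpu_per_dollar:.4f}, GPU perf/$ = {gpu_per_dollar:.4f}")
```

Under those assumed numbers the Intel part wins on raw speed, but its GPU performance per dollar actually comes out slightly worse, which is the point: these chips aren’t aimed at AMD’s budget market.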

What a lot of hyperbole… I was expecting something surprising, but in the end it is all about APUs, which aren’t that powerful to begin with. Wake me up when Intel’s GPUs are as fast as even the bottom range of AMD/Nvidia dedicated cards… .

It is also comparing new technology with technology that is reaching its EOL. AMD Carrizo architecture is coming out at the end of this month or in early July.

And I say that as an Intel enthusiast (five Intel devices at home: one Atom tablet, two NUCs, an i5 laptop and an i7 desktop) who doesn’t have any AMD GPUs in service at the moment.

And in what parallel universe will sales of consoles slow down soon? The popular one is still waiting for its first real price drop… .

edit: did you edit the title? Because the first one was very hyperbolic, imho.


That happened in 2013 with the first Iris Pro graphics, they just weren’t available for desktops:

Sure, beating a GT 640 isn’t that exciting :wink: but it’s the low-end Nvidia dedicated card.

The current iGPUs are now faster than all current APUs, and I don’t see how any refresh AMD makes can really change that: to get faster they need to add more GPU cores, but they already have power issues.

No problem.
If you look in the off-topic chat, I was thinking about the same thing 3 or 4 years ago.

Here is a functioning optical processor… NASA technology readiness level 4
(working prototype).

Just found this: they are going ahead without the memristor tech,

which makes the whole thing a power hog. However, some of this could be negated by
powering the CPU/ARM and DRAM with beamed power via a scanning maser, plus graphene nanocapacitors
(dropping the resistance of the wires and improving scalability).

This futuristic machine is not going to be the type of thing that you’re going to use as a home PC. The article specifically states that its main use is for massively parallel applications (which disqualifies 3D software) and data centers. I don’t see it being possible for you to use the BGE on that thing, sorry.

That alone is a major difference compared to Intel’s new chip developments.

You mean like real-time path tracing?

We did not start with cell phones…

If you use it only for raytracing, perhaps, and only if the individual cores work similarly enough to silicon-based ones to not require a total rewrite of the whole application.

Using Blender on it, though, rests on the same assumption that the devs will not have to rewrite most of Blender, and also assumes that single-core performance is enough for the intensive functions that can’t be threaded.

Memristors, meanwhile, are probably not going to give you a real big computing advantage until application developers start writing code that takes advantage of their unique approach to problem solving. Unless, that is, someone finds a way to compile existing code so that it makes good use of them, but that might be a ways off yet.

Not to mention that it will likely be extremely expensive for the first 5-10 years or so; it took more than 20 years after the development of the transistor for home computing to finally become affordable.

Well, the prototype release is next year;
we will see what Moore’s law has to say about it after that.

From what I read on that second link, it looks like an “average” computer will have the equivalent speed of the 5th-fastest supercomputer of today. Photonics for the win!.. until the quantum computer steps in, of course!