Nvidia announced GTX 1080 and GTX 1070

Beerbaron

Both true - just the lower power consumption alone is nice for people who don’t want to have a heater next to their feet :wink:

That sounds a bit overly dramatic - supporting Pascal GPUs would be a matter of ~5 minutes of changing settings, installing a new CUDA toolkit and doing a recompile - it’s not even certain yet whether we need to recompile at all or can just share a new .cubin file.
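For context, the “new toolkit + recompile” part mostly boils down to the compute capability: Pascal cards report 6.x (the 1070/1080 should be 6.1), and the CUDA kernels have to be built for that sm_6x target, either as a matching .cubin or as PTX the driver can JIT. Here’s a minimal sketch of how you could check what your cards report - just an illustration of the idea, not anything from Blender’s build system:

```cpp
// Sketch only: enumerate CUDA devices and print their compute capability,
// which determines the sm_XX target a kernel/.cubin has to be built for.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable devices found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s (compute capability %d.%d)\n",
                    i, prop.name, prop.major, prop.minor);
        if (prop.major >= 6) {
            // Pascal or newer: a .cubin built for an older architecture won't load here,
            // so either a matching sm_6x .cubin or JIT-able PTX is needed.
            std::printf("  -> needs a kernel built for sm_%d%d\n", prop.major, prop.minor);
        }
    }
    return 0;
}
```

Build it with `nvcc check_cc.cu -o check_cc`; compiling kernels for sm_6x itself needs CUDA 8.0 or newer.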

Here is a PC build under 900 euros including a GTX 1070 (taken from hardware.fr, configured on LDLC: http://www.ldlc.com/fiche/PB00132336.html):

  • Intel Core i5-6500 - Quad Core (3.2 GHz, Turbo 3.6 GHz)
  • Motherboard Gigabyte GA-B150M-D3H (Intel B150 Express)
  • GeForce GTX 1070
  • Kingston 8 GB (2x 4 GB kit) DDR4-2133
  • Seagate 7200.14 - 1 TB, 7200 RPM, 64 MB cache, Serial ATA III
  • Zalman R1
  • Corsair VS450

When the 980 Ti issue popped up on developer.blender.org, Admiral Potato kindly donated a GTX 980 Ti to the Blender Institute so they could test it. So Nvidia not helping at all isn’t a wild guess.

https://developer.blender.org/T45093
On topic, this new GTX 1070 may finally put my old and trusty GTX 580 3 GB out of business.

I don’t know anything about these APIs but does Vulkan change anything? I.e., can it be used with or instead of CUDA?

Vulkan would focus on OpenCL and not work with CUDA.

There is limited interop for CUDA/Vulkan, just like there is interop for OpenGL and CUDA. It’s all possible through vendor extensions.
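To illustrate, the OpenGL flavour of that interop goes through the CUDA runtime’s graphics interop API: you register a GL buffer with CUDA, map it, and a kernel can then write straight into it. A rough sketch (assumes a Linux/freeglut setup, arbitrary buffer size, and is not something Cycles actually does):

```cpp
// Sketch of CUDA/OpenGL buffer interop: a CUDA kernel fills an OpenGL buffer object.
// Assumes Linux with freeglut; build roughly as: nvcc interop.cu -o interop -lglut -lGL
#define GL_GLEXT_PROTOTYPES
#include <GL/glut.h>
#include <cuda_runtime.h>
#include <cuda_gl_interop.h>
#include <cstdio>

__global__ void fill(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = i * 0.5f;
}

int main(int argc, char** argv) {
    // A current GL context must exist before any GL buffer can be registered with CUDA.
    glutInit(&argc, argv);
    glutCreateWindow("cuda-gl interop sketch");

    const int n = 1024;
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, n * sizeof(float), nullptr, GL_DYNAMIC_DRAW);

    // Register the GL buffer with CUDA, map it, and let a kernel write into it.
    cudaGraphicsResource_t res = nullptr;
    cudaGraphicsGLRegisterBuffer(&res, vbo, cudaGraphicsRegisterFlagsNone);
    cudaGraphicsMapResources(1, &res, 0);

    float* dptr = nullptr;
    size_t bytes = 0;
    cudaGraphicsResourceGetMappedPointer((void**)&dptr, &bytes, res);
    fill<<<(n + 255) / 256, 256>>>(dptr, n);
    cudaDeviceSynchronize();
    cudaGraphicsUnmapResources(1, &res, 0);

    std::printf("CUDA kernel wrote %zu bytes into GL buffer %u.\n", bytes, vbo);

    cudaGraphicsUnregisterResource(res);
    glDeleteBuffers(1, &vbo);
    return 0;
}
```

The Vulkan side of this relies on the vendor extensions mentioned above rather than a core API, and as explained below, none of it really helps Cycles anyway.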

However, Cycles doesn’t benefit from such interop. You could make Cycles run on top of Vulkan, but you’d need a compiler from a C-like language to SPIR-V that also works in a Vulkan (compute shader) context. Right now, only GLSL has a proper reference implementation targeting Vulkan, and GLSL is too different from what the Cycles codebase is written in.

To give an actual answer, Vulkan doesn’t matter for Cycles. On AMD hardware, there will be OpenCL and on NVIDIA hardware there will be CUDA.

Rip AMD. Nvidia just rekt you.

Nvidia fans have been saying that for years, but AMD is still doing pretty decent in the GPU department.

Before writing them off completely, keep in mind that they have not yet announced this year’s chip lineup (which is supposedly going to be in June). Also, Nvidia’s boastful claims may end up being muted a bit if the rumors are true about the declining quality of their drivers (which have allegedly lost their rock-solid reputation). AMD is also now on the same 14nm fab process as Nvidia is (so transistor densities will be more or less equal).

Also, advances like this are a major reason why I think SLI is, in most cases, a complete waste of money: you spend twice as much, and soon enough there’s a new single chip with the same amount of processing power (that costs less).

This is a reason for writing them off. They have said they’re not competing against the high-end 1080 at all, and they haven’t announced anything yet even though most people thought Polaris would release first. If AMD are releasing a mid-range card, they are likely competing with the 1070 and the 1060 Ti (whenever that’s released), but they may lose some of that market to people who buy the 1070 instead, since it’s pretty cheap for the performance.

That being said, the markets are still pretty different, which is one reason why AMD are doing alright. For Nvidia it’s all about the single GPU in most cases, although I did hear about a machine with three 1080s for VR: one for each eye and the last one for PhysX stuff (Flex and so on). That seems a bit overkill. :stuck_out_tongue: With AMD you’ve got CrossFire, which can be more useful, since the cards are cheaper for one thing and you can mix GPUs.

Market share tells exactly the same story, and the balance is leaning towards Nvidia more and more every year.

80% Nvidia - AMD 20%

And AMD is pretty much non-existent in the HPC and pro segments, which are the high-profit ones. So they’re selling low quantities of hardware at cost, while Nvidia is making loads of money with almost the whole consumer market plus both the HPC and pro markets. The gap between them is too big now, and AMD lost sight of the point of no return long ago.

AMD may have a chance with specialized gaming APUs able to power decent Steam Machines, but the high-margin markets are hopelessly closed to them.

Last I read, there was an interview with the CEO earlier this year stating that AMD was working on trimming its product line and becoming a more focused company (with the remaining products being of a higher quality).

It may not be so much that they’re unable to compete as that they have a new goal: creating quality hardware for people who choose not to throw upwards of several hundred (or even thousands of) dollars at their PC rigs every year - either because they don’t have as much disposable cash, or because they only want to spend when they really need to, as opposed to pursuing bragging rights.

Of course, people who didn’t read much of the interview drew a decidedly more negative conclusion from it. I know that when it comes to gaming hardware, people can be pretty nasty towards those who disagree with their preferences (and fortunately we largely don’t see that here).

Anyway, what do the Nvidia fans think might happen if they end up getting the permanent monopoly some of them are wishing for? Will they be basking in a gamer’s/artist’s nirvana, or do you think Nvidia might try something like pulling CUDA support from the GeForce cards altogether?

NVidia is on TSMC’s 16nm process while AMD is using the more advanced 14nm FinFET process. Just wanted to clarify that.

While on the high end this may or may not be true (we have to wait for numbers), in the midrange, if the price point being thrown around ($300) is to be believed, AMD is in a good position to basically take over the market with a cheaper, more power-efficient card that offers the same performance as a 980.

AMD is aiming for the midrange market, and their goal is to bring VR and decent 4K gaming to the masses - so a fairly powerful card at midrange price points. Nvidia knows this, so they announced their card early and did a paper launch (you probably won’t find the 1080 in stock for quite some time, since that card uses the newer GDDR5X and they can only source it from one vendor, so supplies will be constrained). Nvidia can lower the price on their cards (especially the 1070), while it will be hard for AMD to drop the price of their cards below $300.

AMD needs some other advantages. If Nvidia drops the price of the 1070 by 50 bucks, it will be close in price and performance to AMD’s offering (until we get solid numbers, this may or may not be true).

Things are going to get interesting in the midrange.

Right now AMD is more concerned with making money and scaling up than with outperforming Nvidia.

Can AMD keep up with the price, the performance and the energy consumption of the 1070?

I’m so glad I didn’t buy my gpu yet…

Who says GlobalFoundries’ 14nm process is “more advanced” than TSMC’s 16nm process? Both are using FinFETs. In the iPhone 6s, the 16nm version of the A9 actually fared better than the 14nm version.

Also, both AMD and NVIDIA are dependent on the capacity of the fabs, so it doesn’t matter if some 14nm node is “more advanced” if the yields end up worse. It’s quite likely that AMD will use TSMC’s 16nm process, too.

@John

AMD will never die (until microprocessor technology gets overtaken by something superior)

it’s better that way though

No competition is a bad thing

a really really bad thing…

Oh, and I could swear you had an R7 250 a year or so ago

What do you mean by “die”? AMD can go out of business just like any other company and there are signs that it may go bankrupt if business doesn’t improve. If that happens, it’s unclear what will become of its assets and the “AMD” brand. Who is going to continue a business that just isn’t profitable? Microsoft?

Noting how other predictions from ‘investment experts’ have panned out, there’s always a decent chance they may be wrong - unless AMD does nothing about their direction (and last I’ve seen, they have been working on that).

For instance, so-called experts telling you to buy gold because the stock market is about to crash. They even go as far as trying to extrapolate a trend or draw conclusions from one- or two-day events, and they almost always end up with egg on their faces.