NVIDIA announced the GTX 1080 and GTX 1070

Some info:

The GTX 1080 claims to be 20% faster than the Titan X/980 Ti while consuming only 180 W (versus 250 W for the Titan X).
The price should be roughly half.

The GTX 1070 should also be faster than the Titan X, for only $379.



Can we assume that these cards will “just work” with Blender or do we need to wait for some tests?

Well, I wouldn't assume that, especially given the current huge performance drop with Win10/Titan X-980 Ti/Blender 2.77.

These cards are likely sm_60 and they will not just work out of the box. I have just pre-ordered them. I'll check whether we can release a 2.77a kernel that you simply add to Blender, or whether you'll need 2.77b/2.78.
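To illustrate why a new architecture needs a new kernel build: Cycles ships precompiled CUDA kernel binaries per compute architecture, and a card whose architecture isn't among them has nothing to load. A minimal Python sketch of that lookup (the filenames and the supported-architecture list here are hypothetical illustrations, not Blender's actual build targets):

```python
# Hypothetical sketch: pick a precompiled CUDA kernel binary for a device's
# compute capability. Filenames and the supported list are illustrative only.

SUPPORTED_ARCHS = ["sm_20", "sm_30", "sm_35", "sm_50", "sm_52"]  # e.g. a 2.77-era build

def kernel_for_device(major, minor, supported=SUPPORTED_ARCHS):
    """Return the matching precompiled kernel name, or None if this build
    has no binary for the device's architecture (as with a new sm_60 card)."""
    arch = f"sm_{major}{minor}"
    if arch in supported:
        return f"kernel_{arch}.cubin"
    return None  # no prebuilt kernel -> the card won't "just work"

print(kernel_for_device(5, 2))  # sm_52 (e.g. GTX 980 Ti) -> kernel_sm_52.cubin
print(kernel_for_device(6, 0))  # sm_60 (Pascal) -> None until a new build ships
```

So even a working driver isn't enough; the Blender build itself has to include a kernel compiled for the new architecture.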

I’d heard about the Win10/TitanX-980TI performance problem and that’s what I was wondering about. For myself I’m in no rush and can wait. I’m glad that when I built my new system last Fall that I went with a 960; I won’t feel so bad getting rid of it so soon.

Do the Titan X and GTX 980 Ti work correctly on Linux?

These cards look great for sure. I’m probably getting a 1070 and then a 1080 Ti whenever that’s released. Whether these cards will work better with Blender remains to be seen but I’m doing game stuff anyway.

Yes linko, for me the Titan X is working without any speed issue on Linux :slight_smile:

I can hardly wait 3 more weeks, but at the same time I'm also a bit skeptical about the speed claims. Twice as fast at half the price… too good to be true? I smell marketing spin. Rumor has it that the speed claim refers to VR performance. I'll wait for all the bench tests from indies.

I tried to find any reliable information about the CUDA cores.

I am more interested in that performance aspect than gaming scores.

It seems that the 1070 has the same number of CUDA cores as a 980. We'll have to wait for benchmarks.

8GB VRAM for $700, with performance comparable to a Titan X? Hmm, if there aren't some serious issues waiting to pop up, sign me up; I can try to be parsimonious and keep my scenes under 8GB… Now, I am also more interested in CUDA performance, but this seems like a good deal on the surface. Supposedly this architecture has very good FP32 performance.

If I have to install Linux to use it with Blender, that’s fine by me. I’ve been meaning to learn how to use Linux anyway, lol

Anyone know when the 1080 Ti is meant to come out?

No one knows. Probably sometime between late this year and early next year. Or maybe even later, if it's really using HBM2 memory.

The 1080 Founders Edition is an overclocked version; its price should sit in between.

It’s not a rumor, it’s literally what they said at the presentation: Twice the performance (of a Titan X) for VR, because of the new “simultaneous multi-projection” feature.

Otherwise, NVIDIA claimed twice the performance of a GTX 980 (~$450 right now) for the GTX 1080 ($599 MSRP). Both the GTX 1080 and the GTX 1070 ($380 MSRP) are supposedly faster (but not by 2x) than the 980 Ti (~$550) and the Titan X (~$1100).

Of course, those are vendor-selected benchmarks and it’ll always depend on the workload. Memory bandwidth in particular hasn’t really improved over the 980Ti/Titan X, since the memory interface is narrower (at a higher clock).
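The bandwidth point checks out with simple arithmetic: bytes/s = (bus width in bits ÷ 8) × data rate. Using the published launch specs (256-bit GDDR5X at 10 GT/s on the GTX 1080; 384-bit GDDR5 at 7 GT/s on the 980 Ti/Titan X):

```python
# Back-of-envelope memory bandwidth: (bus_width_bits / 8) bytes per transfer,
# times the data rate in gigatransfers per second, gives GB/s.
def bandwidth_gb_s(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

print(bandwidth_gb_s(256, 10))  # GTX 1080: 256-bit GDDR5X at 10 GT/s -> 320.0 GB/s
print(bandwidth_gb_s(384, 7))   # 980 Ti / Titan X: 384-bit GDDR5 at 7 GT/s -> 336.0 GB/s
```

So despite the much faster GDDR5X, the narrower bus leaves the GTX 1080 slightly behind in raw bandwidth, which matters for memory-bound workloads.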

Full specs are on the NVIDIA site, which lists the GeForce GTX 1080 with 2560 CUDA cores.
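For a rough compute comparison, peak FP32 throughput is cores × 2 FLOPs (one fused multiply-add per clock) × clock speed. A quick sketch using the published boost clocks (1733 MHz for the GTX 1080, 1075 MHz for the Maxwell Titan X):

```python
# Theoretical peak FP32 throughput in TFLOPS.
# Each CUDA core retires one FMA (2 FLOPs) per clock.
def fp32_tflops(cuda_cores, boost_clock_ghz):
    return cuda_cores * 2 * boost_clock_ghz / 1000

print(round(fp32_tflops(2560, 1.733), 1))  # GTX 1080 -> ~8.9 TFLOPS
print(round(fp32_tflops(3072, 1.075), 1))  # Titan X (Maxwell) -> ~6.6 TFLOPS
```

So even with fewer cores than a Titan X, the much higher clocks give the 1080 the edge on paper; real Cycles render times will depend on the rest of the pipeline.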

A quick search on sm_60 turns up this apparent Pascal Cycles failure reported by the NVIDIA team earlier this year: it fails completely rather than just performing badly.

What has been the average time gap between NVIDIA hardware/software releases and support for the new GPU in our beloved Blender?

I wonder if NVIDIA supports the Blender Foundation at all by donating cards for them to test. I imagine it's in NVIDIA's interest for developers of all popular software to be able to test thoroughly and make sure their programs work well with NVIDIA's cards.

Anyway, I’m pretty pumped for this card, even though I only upgraded to a 970 a few months ago :confused: Maybe I can sell it and get at least half my money back to put towards a new card. I’m mostly interested because of the greater power efficiency and VR performance - I don’t really do any offline rendering.