Nvidia announced GTX 1080 and GTX 1070

Sorry it’s not super exciting learning, but if you had paid attention to anything but the last paragraph, you would have learned something that enables you to make better decisions.

But if you want to continue your ignorance in assuming that bigger number = faster card, go right ahead. Then come back here and pout about “omg, why my card suck, it has soooo many cuda cores!?!?”

Netroxen was trying to correct your incorrect assumptions. If you want to fight back against what he said, you are defending your own ignorance. Do you want to continue to be as ignorant as possible, or do you wanna grow up and learn something?

The 1070 is supposed to be faster than the Titan X. The 1080 is supposed to beat 2x 980 SLI by a notable margin, but not quite 2x 980 Ti SLI.
We won’t know for sure until the benchmarks are released, but the architecture is superior, and though there may be fewer CUDA cores than a Titan X, you’re still getting the stronger architecture at insane clock speeds while using less power.

Less VRAM than a Titan is hardly an argument; it’s rare for people to really need THAT much, and when the RAM is much, much faster it helps with how much effective RAM you need, though the capacity will still ultimately be the hard limit.

Precisely how does my decision making affect you in any way that entitles you to stick your nose into my business? When it’s you buying my GPUs for me, then you have a right to complain. Until then, piss off.

Maybe you want to revise your choice of language in this forum.

Keep on fighting for your right to be the worst person you can possibly be. Don’t ever change.

I believe that’s only in VR, where each card renders the frame for one eye separately. The 1080 should be about 15-20% faster than the 980 Ti on release in all non-VR scenarios.

www.kappit.com/img/pics/201503_1020_iceia_sm.jpg

Literally just came on and checked this thread, absolutely hilarious. You made a post which was factually wrong, so I corrected it. In return you start posting about how it’s none of our business how much you paid (completely irrelevant) and how we should keep our noses out of things.

So as I said, seems like you’re a little butt-hurt - amirite?

So moving on…

Moderation - Folks, you’re going to have to do a better job of maintaining civility. I’ve removed a few posts from the end of this thread that were off-topic and direct attacks. You’re welcome to continue berating one another via PM… but this thread is not the place for it.

I’ve only tested VR on my smartphone with Google Cardboard and it wasn’t that hot to behold: 1) after about 5 minutes I felt sick, and 2) the screen resolution was too low to be enjoyable. I wonder if anyone here has tried the latest headsets and found them satisfactory? I’m thinking it would be better to wait until the resolution is higher, i.e. perhaps skip this new generation of cards which, as fast as they are, may not have enough performance in the future. VR seems expensive enough to get into without the hardware becoming outdated quickly…

Check it out http://videocardz.com/59871/nvidia-geforce-gtx-1080-3dmark-firestrike-and-3dmark11-performance

Hi, these are gaming and DirectX benchmarks, not really helpful for CUDA performance.
We have to wait for a Cycles or other CUDA engine test.
There is also a lot of fine tuning needed to get the best performance out of the latest cards.
Jucyfruit wrote he had pre-ordered a 1080, so Cycles will be prepared for the new generation. :slight_smile:

Cheers, mib

Still, it gives an idea of how powerful it may be compared to the Maxwell cards.
Any news on the possibility of implementing sm_6.0 in the current Blender revision?
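
For anyone wondering what sm_6.0 means in practice: it’s just the Pascal compute capability, so the CUDA kernels would need to be built for that target. Below is a minimal sketch, assuming a standalone CUDA runtime query of my own rather than anything in Blender’s actual code, showing how an application can detect a Pascal-class board at runtime.

```
// Hypothetical standalone check, not Blender code: list CUDA devices and
// report their compute capability so a Pascal (sm_6.x) board can be detected.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA devices found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // prop.major / prop.minor give the SM version, e.g. 6.0 for GP100, 6.1 for GP104.
        printf("Device %d: %s (sm_%d%d)\n", i, prop.name, prop.major, prop.minor);
        if (prop.major >= 6) {
            printf("  Pascal-class GPU: needs kernels built for sm_60+ "
                   "(or PTX that the driver can JIT-compile).\n");
        }
    }
    return 0;
}
```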

As if it’s at all realistic to run a phone at full throttle with unrealistic workloads until the battery is depleted. A variety of factors could have made the Samsung chip perform worse. And that’s assuming we accept three frigging tests as a valid sample, which I don’t, because it’s nowhere near enough.

I bet it is really hard keeping track of each battle while fighting against everyone here.

Context. AMD isn’t selling the console, it is a parts supplier and doesn’t care a single bit if Sony or MS are losing money per unit.

Do you realize that those graphs include their R&D and the beyond awful Tegra losses? That’s why I looked for specific graphs. Intel is famous for having over 65% margins and those don’t reflect on their overall results according to your accounting.

You’re the one nitpicking to prove your argument. Please get real.

The point still stands. Their Tesla line is a high margin one even taking whatever deal they make for bulk sales. Something that AMD can’t even dream of.

And after this, welcome to my ignore list. No wonder half the population of these forums already placed you in theirs.

Looks like you embraced winning every single argument as some kind of sport and that’s not even remotely healthy.

Wow, a liquid-cooled GTX 1080 @ 2.5 GHz is in the pipeline http://wccftech.com/geforce-gtx-1080-blow-p100-gpu-single-precision-performance-variants-including-25-ghz-liquid-cooled-edition/

Guys come on! Beerbaron and I also had an argument in the past - so what. We are cool now and he is not on my ignore list.

Eh eh… sorry, I have to be cereal.
Actually, it’s healthy. Everyone needs to balance a frustrating life in some way. That’s how we keep cases of psychopathy at a safe level. Internet fights and BDSM.

Except this is a GPU thread not a CPU thread, and AMD do decently in the laptop space, and fantastically in the consoles space, so they’re not in any jeopardy…

…not to mention these cards are cheaper because of the competition brought by AMD with their latest line; the 380X and Furys beat out nVidia in gaming benches (especially in Vulkan/DX12) and gave them a scare…

…platform loyalty is always so sad to see. nVidia don’t care about you lad.

A bit of common sense from the Redshift dev.

Hello everyone,

There are already a few threads discussing NVidia’s soon-to-be-launched GPUs, so we wanted to offer a few thoughts and recommendations on the subject.

First of all, you should be aware that Redshift hasn’t yet been ported to run on these GPUs. This is due to two reasons: First, the CUDA SDK is not ready for them yet so we can’t even compile Redshift for these GPUs. Secondly, we haven’t received dev boards from NVidia and it’s quite likely we won’t, as we typically get sent Teslas/Quadros. This means we’ll have to wait and purchase them from the shop just like everybody else. This is a bit unfortunate but not too surprising: a similar thing happened with the launch of the GTX970/980 a couple of years ago.

For this reason, we strongly recommend fighting the urge to purchase these new GPUs at this stage. If you ignore this and decide to buy them anyway, not only will you have to wait several days until Redshift gets developed/tested on these boards but you’ll also very likely face potential performance and stability issues due to all sorts of factors such as the CUDA compiler and drivers not being 100% ready. A similar thing happened around the launch of the GTX970/980 some time ago. As most of you know by now, GPU rendering is not high on NVidia’s priority list as far as gaming-grade GPUs are concerned. Videogames are. So that’s where they’ll be spending the driver/stability/CUDA resources initially.

Having said all of the above, the performance of these new GPUs looks great (at least on paper) and, combined with their low price, means they could be excellent for GPU rendering.

But let’s first wait and see! https://www.redshift3d.com/images/smileys/smile.gif

-Panos

- source -

BTW - last I checked, the GTX 980 Ti (and likewise the Titan X) is still a ‘wonder’ for Cycles on W10

And first results are coming…

Super cool :slight_smile:

Can someone explain what Unified Memory in these cards will bring for CGI 3D rendering? Will this bring some benefits for us, ideally? Out-of-core technology? No more VRAM limit at the cost of some slowdown?
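
For context, the “unified memory” being talked about is CUDA managed memory: a single allocation addressable from both CPU and GPU, and on Pascal the pages can migrate on demand, which is the mechanism that would allow out-of-core style oversubscription of VRAM, at the cost of a slowdown whenever pages have to move over PCIe. A minimal sketch, assuming an illustrative kernel and buffer size rather than anything a real renderer like Cycles actually does:

```
// Illustrative sketch only: a managed allocation used from both CPU and GPU.
// On Pascal, pages migrate on demand, so the allocation may even exceed VRAM.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, size_t n, float factor) {
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const size_t n = 1ull << 28;  // ~1 GiB of floats, purely illustrative
    float *data = nullptr;

    // One pointer, visible to host and device; no explicit cudaMemcpy needed.
    if (cudaMallocManaged(&data, n * sizeof(float)) != cudaSuccess) {
        printf("cudaMallocManaged failed\n");
        return 1;
    }

    for (size_t i = 0; i < n; ++i) data[i] = 1.0f;              // touched on the CPU first

    scale<<<(unsigned)((n + 255) / 256), 256>>>(data, n, 2.0f); // pages fault over to the GPU
    cudaDeviceSynchronize();

    printf("data[0] = %f\n", data[0]);                          // pages fault back to the CPU
    cudaFree(data);
    return 0;
}
```

Whether and how Cycles would take advantage of this is a separate question, so that part is speculation on my side.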

BeerBaron I need you