Now this is a GeForce Titan!

Check this out. :slight_smile:

http://www.maximumpc.com/gpu_doubletake_nvidia_launches_dual-gpu_geforce_titan_z_2999

Don't know why it's so expensive…
IMHO it should cost only 2x a normal Titan (why the extra $1000?).
It uses a single PCB and a shared heat sink, and they even save on packaging material
compared to two separate cards… so it really should be a little cheaper than two Titans, IMHO.

Apparently it uses some new technology like ‘dynamic power balancing’ to reduce performance bottlenecks as well as shared memory (so a little more sophisticated than simply two cards in SLI).

Look at that price, though: it's three times the amount I paid for my current PC, so I don't think we'll be seeing a lot of people on this forum buying it (except maybe the Ray Pump guy, since he offers a Cycles rendering service).

But for that cost you could get 3x Titan Blacks: 8640 cores vs. 5760 cores.
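Quick back-of-the-envelope math on that comparison (the ~$999 Titan Black price is an assumption; $2999 is the announced Titan Z price):

```python
# Rough cores-per-dollar comparison, Titan Z vs. three Titan Blacks.
titan_black_cores = 2880   # CUDA cores per Titan Black
titan_black_price = 999    # assumed USD launch price
titan_z_cores = 5760       # 2 x 2880 CUDA cores on the Titan Z
titan_z_price = 2999       # announced USD price

three_blacks_cores = 3 * titan_black_cores   # 8640 cores
three_blacks_price = 3 * titan_black_price   # $2997

print(three_blacks_cores / three_blacks_price)  # ~2.88 cores per dollar
print(titan_z_cores / titan_z_price)            # ~1.92 cores per dollar
```

So for essentially the same money you get roughly 50% more CUDA cores going the triple-Black route.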

Eh, as a person who builds his rigs to be cost-effective, this price seems like an early-adopter bragging-rights trap.

I'd love to see what kind of performance that thing would put out on a single 24" 1080p LCD monitor. Probably amazing FPS and decent 1080p render times if you optimize right.

For rendering purposes none of them make any sense. Nvidia finally allowed its partners to mount 6GB on the GTX 780 and the GTX 780 Ti. Any of these cards with an aftermarket cooler will tear this card apart both in performance and price/perf ratio.

I agree…
A 6GB 780 Ti makes the most sense for Blender right now.
(They say the 6GB model will be about $50 more than the 3GB model.)

I would just say wait until the 8xx series comes out (unless you really need GPU power now).

Well, when it comes to 5K at least, you’re pretty much going beyond the limits of what the human eye can see unless you’re using a 60-80 inch TV screen as a monitor that you’re a foot away from.

The eye in itself is thought to be able to resolve something as small as a tenth of a millimeter, but the effective limit is larger when looking at a consumer PC monitor, because you're not positioning your head less than an inch away. So why do you need the pixel size to be that much smaller if you're not going to notice any additional fidelity on the display (much how the color race for games largely stopped at 32 bit because it displays far more color values than what your retinas can register)?

1.) There are no 6GB Tis, for obvious reasons. Only regular 6GB 780s were announced.
2.) This card clocks at around 700 MHz, which makes it only about 50% faster than a regular Black.
3.) LOL @ nVidia's prices.

Wait for the GTX 880, it will be worth it… trust me.

Does anyone on the forum have a dual or quad GPU setup? I'm thinking ahead to when I next rebuild my computer "if I have a job". At the moment I have a single 670 GPU; it's slow and freezes badly while rendering "I have implemented the reg hack to stop the GPU timing out".

I was thinking of getting 4 GPUs for my next rig, and was wondering what the best price-to-power build would be. Am I better off getting two top-spec GPUs or four medium-spec GPUs? This is further complicated by the fact that I sometimes want to keep using the PC while it's rendering, so it might be worth getting four medium GPUs; that way I only lose 1/4 of the power instead of 1/2 while working.

I know I shouldn't work while rendering, but reality.

780s and 780 Tis with 6GB

You’re wrong.

http://www.overclock.net/t/1475993/evga-step-up-your-gtx-780-to-6gb-update-6gb-gtx-780-ti-incoming-aswell#post_21987381

The Kingpin will probably cost more than a Titan and not be an ordinary (read: priced decently) card.
Last time I checked, nVidia allowed no partner apart from EVGA to sell 6GB Tis.

That’s quite an exaggeration. Even a 30" monitor at 4k from a foot away has clearly discernible pixels.
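The geometry here is easy to sketch. Assuming the commonly cited ~1 arcminute visual acuity figure, the pixel pitch of a 30" 16:9 4K panel sits well above what the eye can resolve from a foot away:

```python
import math

# Pixel pitch of a 30-inch 16:9 4K (3840-wide) panel, compared with the
# smallest detail a ~1 arcminute eye can resolve from 1 foot away.
# (The 1 arcminute acuity figure is an assumption, not from the thread.)
diag_in = 30.0
aspect_w, aspect_h = 16, 9
width_in = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)
pixel_pitch_mm = width_in * 25.4 / 3840          # ~0.17 mm per pixel

view_dist_mm = 12 * 25.4                         # 1 foot in mm
arcmin = math.radians(1 / 60)
resolvable_mm = view_dist_mm * math.tan(arcmin)  # ~0.09 mm at that distance

print(pixel_pitch_mm > resolvable_mm)  # True: pixels remain discernible
```

The pixels are roughly twice the size of the resolvable detail at that distance, which backs up the "clearly discernible" claim.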

(much how the color race for games largely stopped at 32 bit because it displays far more color values than what your retinas can register)?

It didn't actually stop; nowadays the math is done in at least 16 bits/channel, often in 32 bits (single-precision float). Even if the eye couldn't discern 16 million individual colors, mapping an additive color system uniformly into 8 bits/channel is far from enough. It's quite easy to discern banding on today's 24-bit displays, especially in grey values (of which there are only 256). Unfortunately, 48-bit display technology is exclusive to the professional segment.
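The banding point can be made concrete. On a gamma-2.2 display, the relative luminance jump between two adjacent 8-bit grey codes near mid-grey exceeds the ~1% contrast difference the eye can typically detect (both the gamma model and the 1% threshold are simplifying assumptions here):

```python
# Relative luminance step between adjacent grey codes on a gamma-2.2
# display, vs. an assumed ~1% just-noticeable contrast difference.
GAMMA = 2.2

def luminance(code, bits=8):
    levels = 2 ** bits - 1
    return (code / levels) ** GAMMA

step_8 = luminance(129) / luminance(128) - 1                      # ~1.7%
step_10 = luminance(513, bits=10) / luminance(512, bits=10) - 1   # ~0.4%

print(step_8 > 0.01)   # True: the 8-bit step is visible as banding
print(step_10 > 0.01)  # False: 10-bit steps fall below the threshold
```

That is why gradients band on 24-bit displays while 30-bit (10 bits/channel) panels look smooth.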

The human eye has a resolution of approximately 8 megapixels in total. But this is massively complicated by the fact that the human eye is not a digital camera with a constant resolution.

If you took a digital photo with a human eye, it would look like this but at about 10 times the resolution, with the center bit being 2072x2304:


Averaged out across the entire retina you get about 8 megapixels, but if you look at a person's face from about 2 meters, that area will have 7 of the total 8 megapixels of density. This is because inside the eye there is an area called the fovea centralis, a massively dense region of the retina placed centrally to capture fine detail at a distance. This is further complicated by the fact that the eyes "dart" to capture different bits of the scene with the high-resolution fovea within a few milliseconds; the brain then composites these into the one image that you see. This allows you to see at resolutions far beyond the eye's average pixel density.

So if you were sitting about 1 meter away from a 21" screen, a single icon would have to be over 7 MP before it exceeded the eye's density, which a monitor with a fixed 5K resolution obviously could not do.

You may want to check out this video for some more information and explanations; it's good:

Made me laugh…

Is your mind blown yet? If not, we can take care of that – picture two of these cards running in tandem. BAM.

:smiley:

Well, price-to-power would be the 750 Ti: you can get three of them for less than one 780 Ti. A 750 Ti is as fast as a GTX 670, if not faster, it only uses 60 watts, and it costs only $160.
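The numbers in that suggestion add up like this ($160 and 60 W are from the post; the ~$700 780 Ti street price is an assumption):

```python
# Three 750 Tis vs. one 780 Ti on price and power draw.
ti750_price = 160   # USD, from the post
ti750_watts = 60    # TDP in watts, from the post
ti780_price = 700   # assumed 780 Ti street price in USD

triple_price = 3 * ti750_price   # $480 for three cards
triple_watts = 3 * ti750_watts   # 180 W combined

print(triple_price < ti780_price)  # True: three 750 Tis cost less
```

That also leaves one card free to drive the display while the other two render, which addresses the "keep using the PC while rendering" concern above.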

I have a dual setup (in fact two 780s for rendering and a 580 just for the monitors) and it works nicely.
At some point I had two 780s and two 580s in the same machine. While the render times were indeed shorter, the power draw and heat were a real issue. (I had to do some heavy rendering in summer, and the air conditioning bill came with a vengeance…) So I ended up borrowing a computer and rendering on two machines separately…

If you can afford two high-performing GPUs, go for it.