The new Nvidia GTX 780 Ti is a beastly GPU, yet it still can’t run the latest Crysis at 60 FPS

http://www.gamespot.com/articles/nvidia-gtx-780-ti-review-a-powerful-gpu-with-a-price-to-match/1100-6416004/

So in short, you might be able to get extremely fast Cycles renders with this card, but it almost looks as if it confirms that some of this next-generation graphics technology has been designed to be ahead of its time.

Take a look at some of the gaming benchmarks. Apparently, in the attempt to cram the latest advances in realtime graphics into games, the long-standing rule that 60 FPS is the gaming standard is being replaced by the idea that 30 FPS might be ‘good enough’. Even this beast of a chip can’t run the latest games at the highest available settings at 60 FPS; at best it musters 30, and on top of that the frame rate stutters on occasion (though it would obviously run like a champ at somewhat lower settings, considering these results were at the new 4K resolution).

So why, again, has there been talk among industry spokesmen about lowering the minimum standard for how fluid gameplay should be? By any measure, 30 FPS isn’t very fluid if your game has fast-moving objects and you don’t want to hide it behind a mountain of motion blur. Perhaps they should rethink the strategy of pushing resolutions this high, because the hardware doesn’t seem ready for it yet (unless they want to dial back the level of realism in the graphics, though some of the larger studios might not like that idea). And I won’t even start on the idea of realtime path-traced games at this resolution, which will require quite a bit more computing power yet.

So what do you think about all this?

Maybe you should mention that the Crysis benchmark was not only done at the highest graphics settings but also at a resolution of 4K.
I think 4K at 30 FPS at the highest settings is quite impressive for one card, so it should be able to play any game at 1080p without any problem, at 60 or more frames per second at the highest settings.
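To put rough numbers on that claim, here is a quick back-of-the-envelope sketch (my assumption, not from the review: GPU cost scales roughly linearly with pixel count, which real games only approximate):

```python
# Back-of-the-envelope: how much more work is a 4K frame than a 1080p frame?
# Assumption (mine, not from the benchmark): cost scales roughly linearly with pixel count.
pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

scale = pixels_4k / pixels_1080p        # 4.0x the pixels
fps_at_4k = 30
fps_at_1080p_est = fps_at_4k * scale    # ~120 FPS if scaling were perfectly linear

print(f"4K has {scale:.1f}x the pixels of 1080p")
print(f"30 FPS at 4K is roughly {fps_at_1080p_est:.0f} FPS worth of pixel throughput at 1080p")
```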

It depends. I think 30 FPS is fine for games like Rome: Total War, Diablo 2, etc., but for an FPS it’s important to have 50 FPS or more, especially if you are playing against humans.

That said, Crysis is awesome even with a low-quality configuration.

If 25 fps is good enough for cinema, why all the insistence on needing 60 fps for a game?

The monitor displaying the game has a refresh rate of 60 Hz.
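To spell out the arithmetic behind that: at a given frame rate, the game has to finish each frame within a fixed time budget, and at 60 Hz that budget is only about 16.7 ms per refresh.

```python
# Frame-time budget: how many milliseconds the game has to render one frame.
for fps in (24, 30, 60, 144):
    budget_ms = 1000.0 / fps
    print(f"{fps:>3} FPS -> {budget_ms:4.1f} ms per frame")
# 24 FPS -> 41.7 ms, 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms, 144 FPS -> 6.9 ms
```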

I think that this is just the tip of a greater iceberg,

look at http://www.itpro.co.uk/strategy/20382/ibm-unveils-new-programming-language-synapse-processors

Things are about to get crazy,

I give it 5 years before we ditch the current CPU/GPU separation and they merge into an SPU (synaptic processing unit),
and then another 5 before you can design your own custom-tailored solutions (i.e. gaming vs. real-time mechanical, etc.).

The only reason cinema is 24fps is that we’re used to it. The blurriness is expected. In games, we don’t want blur because we want to be able to actually see what we’re shooting at when jumping. We want everything to be as sharp as real life.

Don’t take my word for it, try playing an action game alternately at 24fps and 60fps yourself and see what you like better.

I play Counter-Strike quite a lot, and as a result I have bought a 144 Hz monitor. The difference between 24 fps and 60 fps is night and day, and the difference between 60 and 144 fps is just as big. At 144 fps everything is smooth; no need for motion blur to smooth the frames out.

Our eyes, I believe, can perceive up to 200 or 300 fps, but as you go higher and higher the difference becomes less and less noticeable.

24 fps was only chosen because it was the smallest number of frames needed to make motion look smooth, but the stuttering is definitely noticeable in large camera moves.

If anyone gets the 780 Ti, please run the Mike Pan benchmark scene.
The 780 Ti has more CUDA cores and higher GPU and memory clocks than the regular 780 or the TITAN,

so it would be cool to see how it does.
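Not from the thread, but if anyone wants to sanity-check what the card actually reports before benchmarking, a minimal PyCUDA sketch like this should do it (assumes PyCUDA is installed and that the card is Kepler, where each SMX has 192 CUDA cores; the driver reports SMX count, not a core count):

```python
# Minimal sketch: ask the driver what the card reports (PyCUDA assumed installed).
import pycuda.driver as cuda

cuda.init()
dev = cuda.Device(0)                 # first CUDA device
attrs = dev.get_attributes()

smx = attrs[cuda.device_attribute.MULTIPROCESSOR_COUNT]
core_mhz = attrs[cuda.device_attribute.CLOCK_RATE] / 1000.0        # reported in kHz
mem_mhz = attrs[cuda.device_attribute.MEMORY_CLOCK_RATE] / 1000.0  # reported in kHz

print(f"Device:       {dev.name()}")
print(f"SMX count:    {smx}")
print(f"CUDA cores:   {smx * 192}   # assumes Kepler (192 cores per SMX)")
print(f"Core clock:   {core_mhz:.0f} MHz")
print(f"Memory clock: {mem_mhz:.0f} MHz")
```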

That’s what they said when graphene transistors were introduced, but getting them from concept to proof to mass production and then into an actual computer has proven to be a lot more complicated, which is one reason we’re still on silicon (we might still get there, but it may be some years yet).

It’s the same thing with quantum computers: the optimistic types who thought we were on the verge have been wrong due to unforeseen complications. (According to Bao2, the government had been using them in secret and they should have been revealed and on store shelves by now, but everyone knows his track record of predictions.)

Yeah, but the http://en.wikipedia.org/wiki/Von_Neumann_architecture is not applicable to the real world,
in the real world things happen when they happen,

hence - infinitely parallel asynchronous computing is the future,

Can’t wait for this to get up and running. Means a lot of our technology will become obsolete, and perhaps end up in museums.

No.

Also, while we’re at it: Quantum Computers won’t solve everyone’s problems, either.

I am not speaking of solving one problem, or even one type of problem; I am talking about solving MANY problems at once. Basically, anything that does not depend on another result can be processed at the same time.

So basically, instead of tackling a single problem from many angles (http://en.wikipedia.org/wiki/Amdahl’s_law),
you attack many problems at once so there is no “queue”.

Sequential operations are still dependent on A’s solution,
whereas with non-sequential work you can solve A through Z at once. Like solving every object’s Bullet physics data where the objects don’t depend on each other (Bullet 3!).
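For reference, Amdahl’s law is exactly that ceiling on speeding up a single task, while a pile of fully independent tasks scales almost linearly with core count. A small sketch of the contrast, assuming a purely illustrative 10% serial fraction:

```python
# Amdahl's law: speedup of ONE task that has a serial fraction s, on n cores.
def amdahl_speedup(n_cores, serial_fraction):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

serial = 0.10   # assume 10% of the task can't be parallelised (illustrative only)
for n in (2, 8, 64, 1024):
    print(f"{n:>4} cores: single-task speedup {amdahl_speedup(n, serial):5.2f}x, "
          f"throughput on independent tasks ~{n}x")
# The single-task speedup flattens out near 1/serial = 10x no matter how many
# cores you add, while fully independent jobs keep scaling with the core count.
```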

Btw - Be happy or afraid, times they are a changin…

That last Crysis game sucked so bad. The only thing it had going for it was the graphics. I’ve been playing games on PC so long I’ve gotten into the habit of disabling certain features right off. I have no idea if my system will render high shadow quality or not, or if it will handle volumetric fog. I view those things as totally unnecessary to my gaming, so I never enable them. Stuff like that I just don’t need. I’m currently playing the new Assassin’s Creed and heard a lot of crap about how intensive it was. I am running it on max with only the volumetric fog turned off. I mean, that stuff will slow down anyone’s rig.

I love taking Fallout: New Vegas and adding all the detail mods to it while keeping 60 fps (I miss my GPU :()

But I had it looking almost real with
a lighting enhancements package,
realistic weather,
HD texture packs,
LOD smoothing,
and a few others.

The point is, if you want immersion in a game, go for realism.

We coders, however, “pierce the veil”, because we are thinking “how did they fake it?” rather than “there is a man with a chainsaw”.
So fooling us takes some serious talent.

And imagine playing Crysis on this: http://io9.com/a-sneak-peak-inside-google-and-nasas-new-quantum-ai-la-1444007785

then the enemy would make you wet yourself or teach you calculus…
(I am not sure which is scarier)

If you mean NASA’s D-Wave quantum computer, the whole ‘quantum’ part of it has turned out to be somewhat of a misnomer. It may indeed have an architecture that follows some quantum principles (which lets it solve some problem types much faster), but it’s not yet a full-fledged, general-purpose quantum computer.

So when you look at the facts, it won’t beat your desktop at most kinds of problem solving at the moment, and since it requires many layers of shielding and extremely cold temperatures, it’s going to be a while before you can buy one for your house.

It seems like many people are falling into the trap of assuming that quantum computing research in its current state means such a machine can run every possible type of application vastly faster than today’s desktops, but even the D-Wave can’t do that; it is mainly specialized for certain types of calculations, such as optimization problems.

Game immersion does not necessarily depend on the most realistic visuals at all - our brains are symbol-oriented at a very basic level, and hence a book with just words can convey a world/story in a far more convincing manner than a movie or a 3D game with outstanding visuals.

The problem with realistic graphics is that at some point the uncanny valley effect comes into play when visual elements (especially humans and animals) start to approach that threshold between ‘symbolic’ and ‘real’. With symbols our imagination comes into play, while with (too) realistic graphics no room for imagination is left, and things can begin to look ‘odd’.

A simple-looking 2D game world like “Rogue Legacy” (which I am playing now, and it’s a hoot! Very addictive!) is completely believable because the graphics remain visually symbolic. I take that simplistic player representation at face value - while a 3D character like Lara Croft in her latest game just does not invoke the same emotional connection. (Not for me, at least.)

That’s not to take away from the fact that realistic graphics can indeed greatly assist game immersion. The devil is in the details, though.

Immersion is all about consistency: consistent levels of detail/effects and consistent game behavior. You can become immersed in old Nintendo games; play Chrono Trigger or one of the early Final Fantasy games. Realism is relative, and what you see often enough becomes real to you. It’s when games glitch or have uneven detail, like when a game has great environments but piss-poor character models, that the illusion of realism is broken. ANYTHING that reminds a player that they are playing a game is a sin in video game design.

Some of the games I’ve played lately, like Crysis 3 and Assassin’s Creed III, are beautiful games but have no real depth to them; nothing makes the player want to continue playing past the end. I’d trade all the graphics for a deep, rich storyline and characters you can actually relate to and care about.

Game design is a juggling act performed on a tightrope. You have to balance what can be done with what should be done, and keep all the areas of the project in sync, all the while maintaining steady forward progress.