it’s clear that over the last few years CGI quality has improved a lot! Modelling packages and renderers keep getting more accurate and faster. But as computing horsepower increases, graphics artists start to demand more and more realism: detailed shapes, great textures, very complex materials, higher polycounts. In the end it becomes very hard to be satisfied with render speed, no matter what monster your rig is. GPU-based renderers started to help, but noise, instability, hard-to-implement features, and VRAM limits ended up killing them.
But on the other side, something very strange has started to grow: REALTIME visualisation.
I think a good tl;dr is ray tracing is “doing it right” and real time engines are “making it look right”, largely through precomputations (e.g. lightmaps) mixed with cheating (e.g. cubemaps) and sacrificing things that make little difference anyway (e.g. diffraction).
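To make the “precomputation” point concrete, here’s a toy sketch (all names are mine, not any engine’s API) of why a baked lightmap is so cheap at runtime: the expensive lighting computation happens once offline, and the runtime shader just does a table lookup.

```python
# Toy lightmap baking sketch (illustrative names, not a real engine API).
# Offline, diffuse lighting is computed once per texel and stored; at runtime
# the "shader" just samples the stored value instead of tracing any rays.

def bake_lightmap(width, height, light_dir, surface_normal):
    """Offline step: precompute diffuse lighting for every texel (Lambert's law)."""
    n_dot_l = max(0.0, sum(a * b for a, b in zip(surface_normal, light_dir)))
    return [[n_dot_l for _ in range(width)] for _ in range(height)]

def shade_realtime(lightmap, u, v):
    """Runtime step: a single table lookup, no light computation at all."""
    return lightmap[v][u]

lightmap = bake_lightmap(4, 4, (0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
print(shade_realtime(lightmap, 1, 2))  # 1.0: light hits the surface head-on
```

The catch, of course, is that anything baked this way is frozen: move the light or the object and the stored lighting is simply wrong, which is exactly the kind of shortcut offline ray tracing doesn’t need.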
The best game engines out there (CryEngine or Unreal Engine 4, depending on who’s talking) are good enough for realtime archviz, but I can’t see any scenario where they will replace offline-rendered CGI for high-end uses (like movies). As good as they are (and UE4 is really good), they still lack the realism of ‘old-fashioned’ CG, since they have to use so many shortcuts and simplifications to reach realtime performance.
It doesn’t actually have ray traced shadows
Maybe Brigade does, but still.
As I said, things are changing a lot! But take a look at the Lumion 6 demo video and you can see all the things that make the difference between a realtime engine and our current path tracers:
1/ realistic soft shadows
2/ high-quality indoor GI
3/ realistic materials (frosted glass, chrome)
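For context on the soft-shadows item: realtime engines usually fake them by filtering several hard shadow-map comparisons (percentage-closer filtering) rather than integrating over an area light the way a path tracer does. A minimal 1-D sketch of that averaging idea (illustrative names, not any engine’s actual API):

```python
# Percentage-closer filtering (PCF), a common realtime soft-shadow trick:
# instead of one hard depth comparison, average several comparisons around
# the sample point. 1-D toy version with made-up names for illustration.

def hard_shadow(shadow_map, x, depth):
    """1.0 = lit, 0.0 = in shadow: a single depth comparison (hard edge)."""
    return 1.0 if depth <= shadow_map[x] else 0.0

def pcf_shadow(shadow_map, x, depth, radius=2):
    """Average depth comparisons over a small window -> soft penumbra."""
    taps = list(range(max(0, x - radius), min(len(shadow_map), x + radius + 1)))
    return sum(hard_shadow(shadow_map, t, depth) for t in taps) / len(taps)

# An occluder covers the right half of this tiny shadow map:
shadow_map = [1.0, 1.0, 1.0, 1.0, 0.2, 0.2, 0.2, 0.2]
print(hard_shadow(shadow_map, 3, 0.5))  # 1.0 -> hard binary edge
print(pcf_shadow(shadow_map, 3, 0.5))   # 0.6 -> intermediate penumbra value
```

It looks soft, but the penumbra width is a filter radius, not a function of light size and occluder distance, which is why path-traced area-light shadows still read as more physically plausible.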
I don’t know what tech is behind this achievement, but I can say it is not something like Unreal’s light-cache method, because Lumion is focused on ease of use for people not familiar with computer-imagery software, like old-school architects. If it worked like Unreal, it would mean a second UV unwrap for each object: a lot of lost time, more-than-average knowledge of external packages like Blender or 3ds Max, and a closed door for professional users (Revit users, for example, which is what architects use).
Actually, I make very good use of OpenGL renders as the “base layer” of a shot or scene, using other forms of rendering to provide material to composite with that.
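That workflow boils down to alpha compositing: the fast OpenGL render is the opaque background layer, and the slower passes are layered over it. A minimal sketch of the idea (my own toy pixel representation, premultiplied RGBA floats in 0..1):

```python
# Compositing sketch: a cheap OpenGL render supplies the base layer, and a
# slower pass (say, a ray-traced reflection pass with an alpha channel) is
# composited over it. Toy per-pixel version; not any compositor's real API.

def over(fg, bg):
    """Porter-Duff 'over': fg composited on top of bg (premultiplied alpha)."""
    a = fg[3] + bg[3] * (1.0 - fg[3])
    rgb = tuple(fg[i] + bg[i] * (1.0 - fg[3]) for i in range(3))
    return rgb + (a,)

base = (0.2, 0.2, 0.2, 1.0)    # opaque pixel from the OpenGL base layer
refl = (0.25, 0.25, 0.3, 0.5)  # premultiplied ray-traced reflection pass
print(over(refl, base))
```

The point is economy: only the passes that actually need expensive rendering get it, and everything else rides on the cheap base layer.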
It’s very tempting to lavish more attention upon a particular shot than it deserves … and, unless you work out the “final cut” of the scene in advance based on Preview renders, you don’t know what it deserves. You don’t have to show everything and you don’t have to lavish equal attention upon every shot, or every part of a shot. Cinematography and Editing need to come first, followed by an economy of effort in producing what the Edit has called for.
“2012” was a long time ago in computer-time. :yes:
And, sure … that was Pixar talking about what works for Pixar’s space, which is “feature-length movies that will be viewed on giant screens.” They spend millions of dollars to make them, and recoup their investment by selling millions of dollars worth of toys and licensed products. And, oh yeah, movie tickets.
If you’re doing a project on a smaller machine and don’t have hundreds of 'em, you make very different decisions.
Well I know. Still doesn’t make much of a difference.
Also, I didn’t mean to say every student short should adhere to Pixar’s quality standards.
But take game cinematics:
Obviously they’re not meant to be viewed in 4K in a theatre (I’m not going to take the pseudo-4K monitor marketing stuff seriously here; Ultra-HD just isn’t 4K by any stretch).
Now go look at the StarCraft II: Wings of Liberty intro, from a long, long time ago in computer-time (2010).
Hell, even look at the intro for StarCraft: Ghost, which probably never got much polish given the game was cancelled.
I have yet to come across an in-game cutscene from any game (however recent) that touches this, visually.
Usually the lighting alone gives things away as realtime-stuff.
Look at the heaters in the third UE4 demo posted by the OP. Consider what actual GI could do for them alone.
Not trying to argue here, but I don’t get where people get their “everything is gonna move to realtime soon”-type claims from. Nobody is going to move to realtime. Maybe for archviz.