Perhaps this leads to the development of better compression methods.
The money in gaming has driven quite a lot of technological advances.
DKesserich - thanks to the super resolution feature, scenes run like gold on my old 2060 Super.
Thanks for the link, I totally missed that.
It seems there are no tools in place yet to assist artists in reducing asset sizes. So, in reality, it is still not possible to use the highest-quality assets in production (as advertised); they need to be scaled down manually.
No, read again.
You drop your million-polygon OBJ file into the UE5 editor’s asset browser.
UE5 shows the import window, which contains the new option “Convert to Nanite”.
If you choose “Nanite”, the import process compresses your object data before you can even use it in the editor.
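For anyone who would rather script bulk imports than click through the dialog, the same toggle can also be set from the editor’s Python console. This is a minimal sketch, assuming UE5’s editor Python API (`unreal.AssetImportTask`, `unreal.FbxImportUI`, and a `build_nanite` flag on the static mesh import data — property names taken from the UE5 API, not verified against every release). The `unreal` module only exists inside the Unreal Editor, so the helper simply returns `None` elsewhere.

```python
# Sketch: automate the "Convert to Nanite" import step via UE5's editor Python.
# The `unreal` module is only available inside the Unreal Editor's Python
# environment, so we guard the import to keep the helper importable anywhere.
try:
    import unreal
except ImportError:
    unreal = None  # running outside the Unreal Editor

def make_nanite_import_task(mesh_path, dest_path="/Game/Scans"):
    """Build an automated AssetImportTask that enables Nanite on import.

    Returns None when not running inside the Unreal Editor.
    The `build_nanite` property name is an assumption based on UE5's API.
    """
    if unreal is None:
        return None

    options = unreal.FbxImportUI()
    options.import_mesh = True
    options.import_materials = False
    # Per-mesh Nanite toggle (assumed property name).
    options.static_mesh_import_data.build_nanite = True

    task = unreal.AssetImportTask()
    task.filename = mesh_path        # e.g. a multi-million-polygon scan
    task.destination_path = dest_path
    task.automated = True            # suppress the interactive import dialog
    task.options = options
    return task

# Inside the editor you would then run something like:
# unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```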
Why ask? Why don’t you just try it and see for yourself?
Also, it’s Early Access: vertex color import is not there yet, and Nanite does not work with some shaders. It’s really just for curiosity at this point.
You should stop wasting your time with Unreal 5 unless you plan to really work on a full next-gen game (PS5, XSX, high-end PC), you are doing movie production, or you need real-time rendering of millions of polygons of static content like cars, buildings, or anything else.
My impression is that you need to read again what my actual question was. This was not my question…
Right now, I am curious how far it has come. There is no point in downloading it if it isn’t at the point where I would want to experiment with it.
So getting an understanding about how they are adjusting the workflows for their new technology is not an okay thing to do?
The engine itself isn’t what I would categorize as light anyway. It’s a AAA engine and it requires pretty hefty hardware to run. However, what’s considered high-end hardware now will be midrange in a couple of years, which was my point. That’s how technology works. If you bought a 2080 a few years ago, the 3070 can now get in the same ballpark. Next year’s 70-series will be even more powerful, and the one after that again, all while the 60-series gets more powerful as well.
When UE4 first came out, everyone used to complain about how heavy the engine was (compared to Unity); now it’s not even a factor, because any video card made within the last 4-5 years can handily run it. The engine is still heavy, it didn’t change in terms of requirements (in fact those increased), but technology caught up and now you can use UE4 on a decent laptop.
As for your comment about optimization: Epic’s launcher is not UE. UE is made with developers in mind, and Epic knows that developers want to eke out as much optimization as they can, so it provides plenty of tools to do that. So I don’t really see your point there.
That’s how technology worked up till now, yes. But right now that doesn’t seem to be the case. People want to upgrade, but can’t. Not if they like their kidneys at least. I’m not convinced the next generation won’t have similar supply issues, and I’m not convinced there will even be a sizeable market for a high-end PC game in a couple years time. Nobody will be able to afford it.
That said, the engine seems to work well enough if you skip Lumen. And Nanite can still be used with traditional low-poly, or even mid-poly meshes. So maybe that’s a path worth considering.
I mean, it’s Early Access; many things are missing and the workflow is not fully developed.
It’s a work-in-progress engine. Mainly big companies use it, because they have the source code and teams of engineers to fix or improve things, and their goal is next gen (perhaps using fallback rendering for other consoles).
For indie teams and solo developers it’s a demo, unless you can’t make your game without movie-quality visuals.
But feel free to download it and test it a bit.
Bonus: a workflow example.
When Unreal 4 was released, it was very heavy on hardware.
Thousands of optimizations only appeared much later, version after version.
What helped a lot was Epic working on Paragon and Fortnite, optimizing content and graphics.
Meanwhile hardware and prices caught up, so UE4 could run fine on entry-level cards like the GTX 960.
Perhaps it will be the same by the time UE5 is production-ready:
- Heavy hardware requirements for Nanite and Lumen at release
- Progressive performance optimizations for Nanite and Lumen across version releases
- Hardware prices catch up, with Nanite and Lumen running great on entry-level cards like the RTX 2060 or RTX 3060
I don’t see why it should be considered good practice to make users always upgrade their machine (if they want to get the same performance for producing the same results they got a few years ago). This is using Moore’s law as an excuse to add bloat and ignore deep optimization work, which is why so many big name software solutions are bloated, buggy, prone to crashing and corrupting data, and slow.
As for Godot’s GI solution, the way it works should allow it to be scaled up as hardware gets better (such as increasing the resolution and distance of the cascades as well as increasing the samples). The current implementation is designed to work fast with a GTX 1060 card and above. To note, it is not like Lumen can currently produce film-quality magic at 60 FPS either (see their forums for current issues).
This is why Fortnite and Paragon helped a lot with optimization.
Unlike all the other game engines whose makers don’t ship their own games.
If it were so bad, unoptimized, and buggy, there wouldn’t be so many indies using it, even on Switch and mobile, and successfully publishing their games.
Unreal is about AAA, so you should relax and stop feeling attacked by Unreal 5’s power.
The tools are heavily artist-oriented and the graphics reach movie quality; it’s just another level.
The discussion is not about comparing with Godot, Unity or CryEngine.
But you can make your own thread to compare (and try to run levels of the same complexity in Godot or Unity or whatever you use).
This is alpha software. You don’t actually have any idea whether the engine will be bloated or not in its final form, or even 5-6 versions after it’s released. Unlike Godot, Epic actually uses its engine to make games, and has real AAA studios, indie developers, and other industries using it. I think they know what’s needed, or will know soon enough.
As for bloat, have you looked at UE’s code? Do you know this for a fact, or are you just making stuff up because you don’t like the spec requirements of an alpha?
The PC industry lives on extremely slim margins; if users stop buying stuff, it will respond. The M1 is already making waves because integrating everything into one package is a pretty powerful idea, and I think that’s where things are going. AMD certainly has that in its sights. Intel’s discrete GPU may actually be good and add another competitor to the market, which may bring down prices and improve availability from the other two competitors. We just don’t know. But whether UE5 is going to be useful to indie developers is not something that will be settled this year or the next; it will happen several years from now, just like it did with UE4 and UE3.
The reason people can’t buy a GPU is high demand. Scalping and mining, yes, but even then demand was higher than normal, as people stuck at home wanted to game, or DCC professionals needed to build a rig for home use, etc. Global chip shortages, among other things, have contributed to higher than normal demand. With China banning Bitcoin, the mining craze should start dying down. When that happens, you will see the market flooded with old mining cards.
Don’t lose too much time writing long responses to people who are really not interested in Unreal, aren’t even using or trying it, and are only trying to bash it.
We are not trying to defend Unreal or any other software.
Some people forget some simple advice:
- use the software and tools that suits you
- don’t criticize other people choices
It’s not an issue if people don’t like Unreal or Unity or whatever; everyone has their own preferences about everything.
But they should make their own post to compare or bash other engines.
I am not trying to make the thread about Godot; I just wrote some information about the GI (which Apocalypse brought up in an earlier post). In addition, just because I don’t worship Tim Sweeney (I’ve seen some people on YouTube and other sites treat him as a sort of messiah figure) does not mean I am incapable of doing anything other than bash Unreal.
I do believe Nanite is interesting tech, though the final verdict on Lumen will depend on how robust it can become in cases like indoor scenes, and whether you can run it on a wide range of hardware. No accusations here, so please don’t assume I am a simple hater (or suffer from some sort of phobia) just because I don’t only post praise in product threads (whether that be Unreal, Unity, Maya, etc.).
Bitcoin is mined on ASICs, not on GPUs. It doesn’t mean much anyway; China bans crypto every couple of years. I don’t really know if they then unban it after a while, but the “China bans crypto” news pops up every bull cycle.
But back on topic:
I am currently on my second project in one of these virtual sets similar to the one in that Mandalorian clip posted above. I think the technologies in UE5 will be fantastic for this kind of stuff. Both of these projects are smaller budget productions with me being the only 3D guy on board. Simply dumping a large mesh without optimizing it sounds pretty awesome for use cases like this.
The difference from a game is that it doesn’t have to work perfectly on a whole bunch of different computers. It just has to work well enough once, when shooting. It doesn’t matter that much if a single mesh takes a couple of GB, because chances are that mesh will be the only one you have to show.
Yes, I know, that’s why I asked! What I asked is a very obvious feature that is needed. My question was whether it is already there.
So? How is that related to my question?
How is that related to my question?
I asked the question to get the information whether that feature has already been implemented, understanding whether it makes sense for me to download it and experiment!
Thanks for that bonus, which clearly shows that their marketing went a little too far, as it doesn’t seem possible to work with arbitrarily sized 3D scans (yet)!
I am thoroughly confused by your answers, to be honest. I have the impression you feel the need to defend Unreal, even though it is not my intention to attack them in any way. I am literally asking technical questions I couldn’t find the answers to. Sure, there is some critique from my side regarding the file sizes, but any tech demo is usually far off from the real world. Those are issues that need to be resolved, and I am 100% confident they are working on them! My questions are pretty much about how far along they are!
I’m not defending; it’s a thread about being interested in Unreal.
You could make a thousand posts saying this thing does not work, that thing is not ready; it’s the same for all 3D engines, there are always bugs and unfinished things.
And it’s pointless to call out what is not ready in an Early Access.
You have early access; just try it, and you’ll get updates as things progress.
Do you have a game or movie project that needs Nanite or Lumen?
There is a team working on Nanite and Lumen, and others working on all the other features common to Unreal 4.27 that will be merged into Unreal 5.
I already imported a high-poly mesh with Nanite and it works well; without Nanite, rendering such a scene would be incredibly slow.
But I think the more important feature is realistic lighting quality, i.e. Lumen, which will work with both common rendering and Nanite.
Lumen is incredibly good despite not being fully optimized or fully ready.
Don’t criticize other people’s choices? Criticism is how we help one another; I’m not sure why you’d want to avoid it. Criticism is exactly what we’re doing in this discussion.
I asked a relatively simple question. If anyone had clearly answered with “No”, I would have written “Thanks for the answer” and would not have further posted about it. From my point of view, such a question fits perfectly into this thread.
Unfortunately, all the replies I got were all over the place, but didn’t answer my question. Just like this reply, which once again diverged to something else. Having a discussion like that is pretty difficult, or even impossible.
I’m just testing Unreal 5; I’ve read some docs, but that’s all. I’m not using Nanite or Lumen for what I’m working on with my current hardware specs, I don’t know more, and I’m not an insider.
The best I can suggest is to read the documentation or search for info yourself on more appropriate Unreal sites.
I’ll just pass on your questions.
Back on track.
A game where there are no polygons, only point clouds.
PS4’s Dreams inspired Nanite; it’s clear graphics are evolving and trying new ways around texture limitations, with Unreal 5 attempting a hybrid solution that mixes traditional rendering with a new unlimited-detail one.