Unreal Engine 5 Feature Highlights | Next-Gen Real-Time Demo Running on PlayStation 5


Crazy to think the days when we had to retopo high-poly sculpts might be over. Somehow I can’t quite believe this might be our new reality.


wow wonderful …

I could be wrong but Nanite sounds like some form of adaptive subdivision + displacement.
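If it is something like adaptive subdivision, the core idea is easy to sketch: keep splitting geometry until each triangle projects to roughly one pixel on screen. Here is a toy model of that decision — the uniform edge-splitting scheme and all the numbers are my own assumptions for illustration, not how Nanite actually works:

```python
import math

def subdivision_level(edge_len_world, distance, fov_deg, screen_px, target_px=1.0):
    """Pick how many times to halve an edge so its projection is ~target_px wide.

    edge_len_world: base edge length of the patch (world units)
    distance:       camera-to-patch distance (same units)
    fov_deg:        vertical field of view in degrees
    screen_px:      vertical resolution in pixels
    (Illustrative model only -- not Nanite's actual algorithm.)
    """
    # world-space size covered by one pixel at this distance
    px_world = 2 * distance * math.tan(math.radians(fov_deg) / 2) / screen_px
    # each subdivision halves the edge; solve 2^n >= edge / (target * px_world)
    ratio = edge_len_world / (target_px * px_world)
    return max(0, math.ceil(math.log2(ratio))) if ratio > 1 else 0

# a 1 m edge seen from 10 m needs more splits than the same edge from 100 m
near = subdivision_level(1.0, 10.0, 60.0, 1080)
far = subdivision_level(1.0, 100.0, 60.0, 1080)
```

The point of the model is only that detail becomes view-dependent: the same asset costs fewer triangles the farther away it is.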


I was just going to post this. I’m blown away. I knew UE would be at the forefront of true realtime photoreal gameplay, but I didn’t think it would come next gen. I just don’t understand how the software and hardware are doing this…such a huge leap over what we have now. The new code, and use of RTX architecture (or whatever) is mind boggling. They’ve cheated reality! Screw you Moore’s Law!

I am still not sure we’re far enough from April 1st for this to be true… sounds amazing.


Triangle count is not an issue anymore with UE5, and the same goes for lighting. All the more reason to get some sort of live link between Blender and UE4/UE5; that’s one huge bottleneck in development.

I’ve just seen this too… this could only get better if they say that they can also import Blender objects directly with full material/shader support :rofl::slightly_smiling_face: :neutral_face: :face_with_raised_eyebrow: Can they?

Funny, just yesterday I saw CoD: MW on Reddit, and this is going to make it a lot worse…

Edit: for those too lazy to follow the link, it’s a rant about the game size being 180 GB, almost half the internal storage of the PS4.


Ya, storage is going to be an issue for a bit… but SSD tech is moving so fast, and hard drive space is getting cheaper and physically smaller. Since file sizes aren’t going down, storage capacities will have to go up.

As they said in the demo, they’re using Megascans & photogrammetry because painting software like Mari & Substance can’t handle this number of polys; even small assets take loads of memory. But I am wondering how the ZBrush sculpts (33 million polys) were textured, probably via texture projection.


It’s not only that. Download speed is an issue as well. It will mean the price of the game being higher on average than advertised, since you will have to pay for a faster internet connection (assuming it’s available in your area) and of course pay for extra storage every time your game collection grows…

If right now you can have a game that’s 180 GB, then you may end up with games in the 1 TB range in the near future, especially since not optimising your assets means faster development, simpler pipelines and lower costs. This is going to be abused to hell and back lol
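The arithmetic behind that worry is easy to check. A quick sketch — the connection speeds are example figures, not anyone’s quoted plan:

```python
def download_hours(size_gb, speed_mbit):
    """Hours to download size_gb (decimal gigabytes) at speed_mbit (megabits/s)."""
    size_megabits = size_gb * 1000 * 8   # GB -> megabits
    return size_megabits / speed_mbit / 3600

# a 180 GB game on a 300 Mb/s line vs a 10 Mb/s ADSL line
fast = download_hours(180, 300)   # ~1.3 hours
slow = download_hours(180, 10)    # ~40 hours
```

Scale the game to 1 TB and both numbers grow by a factor of about 5.5, which is where the data-cap and slow-ADSL complaints below come from.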


At those extreme geometry resolutions they could use a vertex color art pipeline with layers and use that for the asset.

SSD storage is okay, but 2TB might seem tiny for next-gen games running UE5. Pablo is already working on vertex painting; I wonder if it’s possible to have blend modes and layers with vertex painting?
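Blend modes and layers are just per-element math, so in principle nothing stops them working on vertex colors exactly as they do on image pixels — you index by vertex instead of by pixel. A minimal sketch, assuming RGB floats in 0–1 (this is not Blender’s actual API):

```python
def blend_layer(base, layer, mode="normal", opacity=1.0):
    """Blend one vertex-color layer over another, per vertex.

    base, layer: lists of (r, g, b) tuples in 0..1, same vertex order.
    Same math as image layers, just indexed by vertex instead of pixel.
    """
    out = []
    for (br, bg, bb), (lr, lg, lb) in zip(base, layer):
        if mode == "multiply":
            top = (br * lr, bg * lg, bb * lb)
        else:  # "normal"
            top = (lr, lg, lb)
        # lerp between base and blended result by the layer's opacity
        out.append(tuple(b + (t - b) * opacity
                         for b, t in zip((br, bg, bb), top)))
    return out

base = [(1.0, 1.0, 1.0), (0.5, 0.5, 0.5)]
dirt = [(0.2, 0.2, 0.2), (1.0, 0.0, 0.0)]
shaded = blend_layer(base, dirt, mode="multiply", opacity=0.5)
```

A real implementation would keep each layer as a separate vertex-color attribute and composite them on export or in the shader, but the per-vertex math is the same.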

all of this while Blender takes an arrow in the knee above 100k polys :clap:


Which naturally brings us to the next area to develop: procedural assets, both textures and models, that simulate high-resolution megascans but are actually generated procedurally. I can see this being used for grain, grass, forests etc. With that, games could become small enough to fit on a USB stick (the smaller ones).
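That size argument works because a procedural asset ships a recipe rather than data: a seed plus generation rules reproduces exactly the same content on every machine. A toy sketch — the function, names and parameters are made up for illustration:

```python
import random

def scatter_forest(seed, count, area=1000.0):
    """Deterministically place trees from a seed. The whole 'asset' is just
    (seed, count, area) -- a few bytes instead of gigabytes of placements."""
    rng = random.Random(seed)
    return [(rng.uniform(0, area),      # x position
             rng.uniform(0, area),      # y position
             rng.uniform(5.0, 30.0))    # tree height
            for _ in range(count)]

# same seed -> identical forest on every machine, nothing stored on disk
a = scatter_forest(42, 10_000)
b = scatter_forest(42, 10_000)
```

The same idea scales from scatter layouts to procedural textures: store the graph and its parameters, evaluate on the player’s machine.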

This is only static mesh assets. Characters will still follow the same pipeline, albeit with more detail and more advanced rigging.

Unreal Engine uses its own material/shader system so this won’t be directly transferred. The best possible scenario for this is if Epic/BF adopt MaterialX workflow. Shaders have to be converted/compiled between different systems. OSL could be the bridge with MaterialX. Objects can already be imported directly, but currently .fbx is the bridge and all its limitations apply.

NVMe drives will be the new standard.

Knowing how Epic builds their demo assets, I’m guessing the sculpts make heavy use of complex UE material editor shader graphs. The UE material editor is as powerful as Substance Designer. High-poly assets can be unwrapped in ZBrush’s UV Master and colour ID maps exported to the UE material editor.
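For reference, a colour ID map is essentially a lookup: each flat colour marks one material zone, and a mask is built by selecting pixels near that colour. A tiny sketch of that selection step — the colours and tolerance are arbitrary examples, not any tool’s defaults:

```python
def id_mask(id_map, target, tol=8):
    """Build a 0/1 mask from a colour-ID map: pixels whose RGB is within
    tol of the target colour select one material zone."""
    return [[1 if all(abs(c - t) <= tol for c, t in zip(px, target)) else 0
             for px in row]
            for row in id_map]

# tiny 2x2 "ID map": red zone = metal, green zone = cloth (colours are examples)
pixels = [[(255, 0, 0), (250, 4, 2)],
          [(0, 255, 0), (255, 0, 0)]]
metal = id_mask(pixels, (255, 0, 0))
```

The small tolerance matters in practice because texture compression and filtering smear the flat ID colours slightly at zone borders.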

Textures (the largest memory footprint in games) may well be streamed directly from the cloud. It will be interesting to see what size this demo is when they release it to the public.
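Streaming makes sense because a distant object never needs the top mip level: you download only the levels the current view requires. A sketch of the saving, under assumed sizes (square uncompressed RGBA texture with a full mip chain):

```python
def mip_bytes(base_res, mip, bytes_per_texel=4):
    """Size of one mip of a square texture; each mip halves the resolution."""
    res = max(1, base_res >> mip)
    return res * res * bytes_per_texel

def streamed_bytes(base_res, needed_mip):
    """Bytes downloaded if we stream only mips from needed_mip down to 1x1."""
    return sum(mip_bytes(base_res, m)
               for m in range(needed_mip, base_res.bit_length()))

full = streamed_bytes(8192, 0)     # whole chain of an 8K texture, ~341 MiB
distant = streamed_bytes(8192, 4)  # far-away object: mip 4 (512px) and smaller
```

Skipping just the top four mips here cuts the transfer by a factor of 256, which is why distance-based streaming is so effective.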


The new Flight Simulator uses a streaming system, and to play the fully fledged version you’ll need to be constantly connected to stream data from Azure. Honestly, it’s also impressive. NVMe cost per GB will go down for sure, but I think the future is streaming from cloud platforms. I don’t even want to imagine the size of a big game created with that pipeline :smiley:

Internet is faster and cheaper (in bigger cities) every year. Last month I tripled my connection (to 300 Mb/s) and lowered its cost at the same time. For the last decade I never paid more than 15 € monthly for the net. Despite this, I agree that internet connection can be a problem in remote locations. And the storage problem is even bigger: hard drives are getting faster, but storage space is not growing as quickly, so we have a bottleneck here.

In a lot of countries it is mostly ADSL with data caps, especially in third-world countries, but even in America some areas don’t even have that…
