Unreal Engine 5 Feature Highlights | Next-Gen Real-Time Demo Running on PlayStation 5

With visual quality like this, the future for offline path tracers and ray tracers doesn’t look very bright. Man, at some point while watching I even thought, “what’s the point in waiting for renders when you can already have it in real time?” I know it’s not that simple, of course, but I don’t think offline rendering (at least for the entertainment industry: games, movies) will last long. This demo is a clear sign of it.

We use V-Ray for product renders at work, and for me it was a toss-up between when Blender would get RTX and when Unreal would get more features into its RTX support, to decide when we would switch…

…it’s not a question anymore. Autodesk’s and Chaos Group’s days are numbered. I’m not sure I’m even going to have to use Blender at work, since we import straight into Unreal with Datasmith these days anyway. :smile:

I live in Portugal, in a populated area, and I have internet at 2 Mbps… so it’s not only third-world countries…

@bsavery I don’t think they use maps at all. For maps you need UV unwrapping, unless they use some automatic unwrapping tool…
I believe that UE5 uses per-vertex paint with live mesh streaming and some advanced GPU occlusion culling. It’s possible that the PS5 has hardware support for this occlusion technique… and probably a hardware decompression algorithm.
Also, modern SSDs are a key component here for fast streaming.
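
To be concrete about what I mean by per-vertex paint, here’s a rough Blender Python sketch (the layer name and colour values are just placeholders) that stores colour directly on the mesh instead of going through a UV-mapped texture:

```python
# Rough sketch, assuming Blender's 2.8x-style Python API with a mesh object active.
# Stores one RGBA value per face corner (loop) instead of sampling a UV-mapped texture.
import bpy

mesh = bpy.context.active_object.data

# Reuse an existing vertex colour layer or create a placeholder one
vcol = mesh.vertex_colors.get("Col") or mesh.vertex_colors.new(name="Col")

for loop in mesh.loops:
    vcol.data[loop.index].color = (0.5, 0.5, 0.5, 1.0)  # flat grey placeholder
```

No UVs or image files involved; the colour just rides along with the geometry.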

Apparently Draco does a great job at compressing geometry. 96% is a lot for a 3D mesh.
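
If anyone wants to see what that ratio looks like on their own meshes, the reference draco_encoder tool can be driven from a quick script; the flags below are the standard ones from the Draco README, and the exact saving obviously depends on the mesh and on how aggressively you quantize:

```python
# Sketch: compress a mesh with Google's draco_encoder and report the size reduction.
# Assumes draco_encoder is built and on PATH; "statue.obj" is just a placeholder file.
import os
import subprocess

src, dst = "statue.obj", "statue.drc"

# -cl = compression level (0-10), -qp = position quantization bits
subprocess.run(["draco_encoder", "-i", src, "-o", dst, "-cl", "10", "-qp", "14"],
               check=True)

before, after = os.path.getsize(src), os.path.getsize(dst)
print(f"{before} -> {after} bytes ({100 * (1 - after / before):.1f}% smaller)")
```

Worth remembering that part of the saving comes from quantization, so it’s lossy in the same way baking down to 8-bit maps is.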

I also think procedural materials (or semi-procedural ones using low-res texture patterns or tiles), calculated on the fly, will be used more and more to cut file sizes a lot. Perhaps some AI texture reconstruction from a very low-res version could become a new feature.
Textures are what cost the most in terms of storage.
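
Quick back-of-the-envelope for why I say that (uncompressed sizes, ignoring mips and block compression; the map count is just a typical PBR set):

```python
# Raw storage cost of 8K PBR texture sets (no mips, no BC/ASTC compression)
res = 8192
bytes_per_texel = 4                   # RGBA8
maps_per_material = 4                 # e.g. albedo, normal, roughness, AO

one_map = res * res * bytes_per_texel
one_set = one_map * maps_per_material

print(f"one 8K map: {one_map / 1e6:.0f} MB")       # ~268 MB
print(f"one PBR set: {one_set / 1e9:.2f} GB")      # ~1.07 GB per asset
```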

It is cool, though, seeing real-time games being able to reach quality comparable to some CGI productions.

So it’s all procedural! That makes it even more impressive, then.

@Musashidan IMO, it’s a lot simpler than that. They use only per-vertex paint, without any kind of mapping: no UV unwrap, normal, AO, displacement, etc. Even hard-surface meshes are very high poly.
The mesh with vertex paint is compressed with Draco or similar (https://github.com/google/draco) and then streamed and decompressed in real time.
The new GI is taking care of shading, AO, etc. They must also use GPU-based occlusion culling, and for that to work they need the convex hull of the mesh (low poly).
The only things procedural and dynamic in that scene are the GI, the animations and the particle system; everything else is static and instanced.
Actually, this might occupy less space on the SSD than regular games, but it will be more CPU/GPU intensive than usual.
The key component here is the SSD, which allows for fast streaming.
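
On the occlusion culling part: whatever the PS5 does in hardware, the general idea is to test each object’s bounds against a coarse depth buffer and skip it when it’s fully behind what has already been drawn. Toy sketch only; every name here is made up and it has nothing to do with Epic’s actual implementation:

```python
# Toy Hi-Z / software-style occlusion test: an object is culled if even the nearest
# point of its projected bounding box is behind everything already in the depth buffer.
import numpy as np

def is_occluded(depth_buffer, bbox_min_px, bbox_max_px, bbox_nearest_depth):
    x0, y0 = bbox_min_px
    x1, y1 = bbox_max_px
    tile = depth_buffer[y0:y1 + 1, x0:x1 + 1]      # coarse pixels the box covers
    return bbox_nearest_depth > tile.max()         # smaller depth = closer to camera

# 16x16 coarse depth buffer: empty (inf) except for a wall at depth 10 in the middle
depth = np.full((16, 16), np.inf)
depth[4:12, 4:12] = 10.0

print(is_occluded(depth, (5, 5), (10, 10), bbox_nearest_depth=25.0))  # True: hidden by the wall
print(is_occluded(depth, (0, 0), (3, 3), bbox_nearest_depth=25.0))    # False: nothing in front
```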

After the UE5 demo all I can smell is burning: it’s the smell of Maxon’s investment in Redshift going up in smoke.

The PS5 GPU will be comparatively mid-range once AMD and Nvidia release their next-gen lines, so Big Navi and the 3080 Ti will bring massively more processing power to the table and far fewer compromises will have to be made. Combined with the tech in UE5 and other real-time renderers, this could be a watershed moment for freelancers and small shops who are not afraid to adopt new tech to punch above their weight. For a lot of projects, a single high-end GPU will out-render a farm of GPUs at an acceptable quality.

Using Eevee in production for about a year has taught me a lot about the liberties you can take, and how many the client will accept if it means a quicker turnaround or a lower bill.

Once USD provides slick, high-fidelity interchange between DCCs and UE (or other real-time engines), watch the quick death of offline rendering. I’m not saying real-time renderers will completely replace offline renderers for a few years yet, but they will take a massive chunk of the rendering market in a very short space of time.

How would per-vertex paint work, exactly? I can’t see how you could get a proper texture-map level of detail out of vertex data alone without some heavy constraints and assumptions. It’s quite clear that they still use texture mapping: some of the triangles are big enough that they would show as single flat triangles in the rendered view, yet they clearly appear texture mapped.

They said in the demo that all the Quixel assets used 8K texture maps.

Many assets are from Quixel, which means polygons with UV-mapped textures; they say in the video that they use 8K textures.

Multi-million-polygon models with per-vertex data would be too much storage, I think, since you would need to store colour, roughness and other data for every vertex; I’m not sure that would be as efficient as textures.
I don’t think there is any official Epic source saying they use vertex storage.

It’s both ways: low poly with maps and high poly with per-vertex colour. The high poly was generated with photogrammetry or ZBrush.
I don’t know how they did the vertex paint at 30 million polys; probably transferred from the low poly with maps back to the high poly. At 8K there is no difference in memory consumption between low poly and high poly if you have one triangle per pixel…
Also, for low-poly meshes they can use triplanar materials without UVs.
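
For anyone unfamiliar with triplanar: you sample the same tiling texture three times along the world axes and blend the results by the surface normal, so no UVs are needed. Rough numpy sketch of the idea, not any particular engine’s shader (the function names and the sharpening exponent are just my choices):

```python
# Triplanar blending sketch: project world position onto the YZ, XZ and XY planes,
# sample a tiling texture on each, and blend by the (absolute) surface normal.
import numpy as np

def sample(texture, u, v):
    """Nearest-neighbour lookup into a tiling H x W x 3 texture."""
    h, w, _ = texture.shape
    return texture[int(v % 1.0 * h) % h, int(u % 1.0 * w) % w]

def triplanar(texture, world_pos, normal, tile_scale=1.0):
    p = world_pos * tile_scale
    sx = sample(texture, p[1], p[2])    # projection along X
    sy = sample(texture, p[0], p[2])    # projection along Y
    sz = sample(texture, p[0], p[1])    # projection along Z
    w = np.abs(normal) ** 4             # sharpen, then normalise the blend weights
    w = w / w.sum()
    return sx * w[0] + sy * w[1] + sz * w[2]

tex = np.random.rand(64, 64, 3)         # stand-in for a tiling texture
print(triplanar(tex, np.array([1.3, 0.2, 2.7]), np.array([0.0, 1.0, 0.0])))
```

The three fetches plus the blend are also why it costs more than a single UV-mapped lookup.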

Anyway, a 30M-poly mesh in RAM ≈ 580 MB (vertex position: 12 bytes + PBR material: 8 bytes per vertex), so you can load up to ~20 very-high-poly meshes in 10 GB of VRAM :slight_smile:
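
Quick sanity check of those numbers, assuming roughly one vertex per polygon and the per-vertex byte counts above (which are my guess, not Epic’s actual layout):

```python
# Back-of-the-envelope check of the figures above (byte counts are assumptions)
verts = 30_000_000
bytes_per_vert = 12 + 8            # float3 position + packed colour/roughness/etc.

per_mesh = verts * bytes_per_vert
vram_budget = 10 * 1024**3         # 10 GiB

print(f"one mesh: {per_mesh / 1024**2:.0f} MiB")      # ~572 MiB
print(f"meshes that fit: {vram_budget // per_mesh}")  # ~17, so 'about 20' is the right ballpark
```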

So the close-to-camera stuff is texture mapped, but the distant stuff is dynamic LOD with vertex colors. Yeah, that makes sense, if that’s what they’re doing.

Triplanar costs a lot more on the GPU than pre-made UVs (three texture fetches instead of one); I think they used automated UV baking for some models.

Again, they never confirmed using vertex data for material storage; there is nothing official. You are only guessing.

Obviously, I am only guessing. But they said that they use very high-poly meshes, live, so it only makes sense to encode the material at the vertex level. You get rid of some maps in the process, though of course this might not work very well with rigged objects. It could be both ways: high poly with auto UVs and maps…

Also, the hardware needed to run such a demo does not exist on PC:

“The storage architecture on the PS5 is far ahead of anything you can buy on PC”

Sweeney isn’t saying that you can’t get a comparable M.2 drive for your PC, even now if you want to shell out for it. Rather, he’s saying the custom drive Sony created and the way it interacts with the overall PS5 data management system makes it faster and more impressive from a development standpoint than anything a consumer could readily buy today, especially considering PC developers aren’t yet building games that take advantage of such speeds. That may change in the future when both new consoles arrive and, as Sweeney predicts, inspire significant upgrades to PC component design and PC-specific game development.

As I’ve said, the SSD is the critical component there: 5.5 GB of input data per second, uncompressed.
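
To put that into per-frame terms (assuming 30 fps; the bandwidth figure is from the quote above):

```python
# What 5.5 GB/s of raw streaming budget means per frame (assuming 30 fps)
bandwidth = 5.5e9        # bytes per second, uncompressed
fps = 30

per_frame = bandwidth / fps
print(f"~{per_frame / 1e6:.0f} MB of fresh data every frame")   # ~183 MB
```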

no, it’s nanomachines, son

We’re talking about the raw ZBrush sculpts, not the Quixel assets.

I’m just going on my knowledge from studying Epic’s asset material trees. They have insane graphs: heavy use of colour ID mask maps (vertex colour), and materials are always built to be as customisable as possible.

It costs nothing to remesh/unwrap/reproject in ZBrush. It will be interesting to pick these assets apart when this content is released to the community.