Time to bring the UE5 hype to Blender (with real-time path tracing!)

Hey folks,

I’m getting quite sick of all this UE5 hype lately, all thanks to Lumen™ and Nanite™ :yawning_face:
It seems most 3D artists are forgetting their basics: there is a solid alternative to rasterization, and it’s called path tracing!

And this idea is not new: large renderers such as Octane (Brigade) and V-Ray (Lavina) have been working on it for years already. Most recently, this developer has been building his own real-time, GPU-based path-tracing engine

When do you guys think this technology will make a comeback? :slight_smile:
Was there a reason why V-Ray/Octane never released their prototype engines into the wild?

Except for Clarisse iFX, it seems no other DCC has been using the power of path tracing in the viewport… Time to revive that trend IMO; solid path-tracing engines could revolutionize the lives of many 3D artists


Actually, Chaos released Vantage (formerly Project Lavina) in December 2020
Here are the changelogs


Cherry on top, it seems that Nvidia is highly interested in reviving fully path-traced engines
(this is speculation)

I get the sentiment about hyperbolic hype trains, but I also think there is a solid reason for the hype, and that is the quality of the end result.
Lumen is OK, but Nanite is the big fat happy pink elephant dancing nude on the table, and everybody loves what it does.
Yesterday I pulled some Megascans assets into Blender just to have something nice-looking in the scene. I accidentally loaded the 4K textures along with them, and Blender was already slowing down with just 5 such assets in the scene.
Meanwhile, in UE5 I have 20x the amount of polygons and textures loaded, and it doesn’t even flinch.
Even Houdini slows down significantly if you stuff it with enough data; the only software that can keep up with UE5’s ability to handle massive amounts of polygons is Clarisse.

In practical terms, this means that with UE5 I get the power to do 50 more iterations in the same time with only a small decrease in quality (context-dependent, of course). YMMV.
It cuts out TONS of waiting time that the artist can then use for art direction, which (IMHO) always trumps raw technological advantage.
Mind you, I am judging/using UE5 strictly as another offline renderer; IDGAF about real-time.
Lumen is not the future (but the present), an intermediate step before the whole realtime industry gets to the actual future, which is path tracing.
No doubt about it.
You know it, I know it, they know it, everybody with a working brain can see it coming from miles away.
It’s just that Nvidia hyped up their proprietary raytracing tech without having the performance to back it up.
They blew their load too early, and the customers looked at them, frustrated, and said: this is all you’ve got?

Then Epic delivered an actual, usable technical advantage, and the former frustration turned into hype, like a pendulum swinging in the other direction.
Let them have it, I say. The final judgement will be the end result, everything else is just noise.


I think the issue has always been the noise. Even in the demo you posted, you can see a clean-up when he stops moving the camera (the renderer catching up). Nanite is awesome, and Otoy is planning to add Nanite-like functionality to Octane. It’s basically streaming polygons from an SSD with some kind of scaling/culling of polys.

Having denoising helps a lot, maybe we will get there soon?


I see what you did there! :smiley:


It wouldn’t matter to most of these artists switching to Unreal, who do not really need a 100% real-time experience. Even if this small amount of noise took less than 5 s to clear, that would extinguish all these urges to try Unreal Engine :roll_eyes:

It’s not new AFAIK; Clarisse iFX already did that, right? Memory management is for sure another challenge

Having denoising helps a lot, maybe we will get there soon?

I sure hope so. If the DCCs don’t move their asses, game engines will gain too much momentum, I’m afraid. It’s time we fight back with some proper non-rasterization technology :slight_smile:

I was actually seriously thinking of raising some money to implement a path tracer in Blender :thinking: I even contacted this dev last week to see if he would be interested in porting his engine to Blender.


I’m not sure about Clarisse iFX, never used it. I think the key part of the technology is that as the polys stream in from the SSD, they are decimated/scaled so that no matter how close or far away they are, they are never smaller than a pixel. I have no clue how they do this; it’s the magic part of Nanite.

With every poly never smaller than a pixel, you can see the advantage for GPU rendering: even 4K screens never have to process more polys than there are pixels on the screen. Most GPUs can handle that load with no problem at all. How they do this in real time is another question :slight_smile:
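For intuition, here’s a rough sketch of that idea; purely illustrative numbers and names (this is not Epic’s actual algorithm): for a given camera distance, pick the coarsest LOD whose triangles still project to at least one pixel.

```python
import math

def projected_size_px(edge_len_world, distance, fov_y_deg, screen_h_px):
    """Approximate on-screen size (in pixels) of a world-space edge at a given distance."""
    # Height of the view frustum at that distance, for a simple pinhole camera.
    frustum_h = 2.0 * distance * math.tan(math.radians(fov_y_deg) / 2.0)
    return edge_len_world / frustum_h * screen_h_px

def pick_lod(base_edge_len, distance, fov_y_deg=60.0, screen_h_px=2160, levels=16):
    """Pick the coarsest LOD whose triangles still project to >= ~1 pixel.
    Assumes each LOD step halves the triangle count, roughly doubling edge length."""
    for lod in range(levels):
        edge = base_edge_len * (2 ** lod)
        if projected_size_px(edge, distance, fov_y_deg, screen_h_px) >= 1.0:
            return lod
    return levels - 1

# Farther objects get coarser LODs:
print(pick_lod(0.001, 1.0))    # -> 0 (close-up: fine LOD)
print(pick_lod(0.001, 100.0))  # -> 6 (far away: coarse LOD)
```

With that invariant, the visible triangle count is bounded by the pixel count, regardless of how dense the source mesh is, which is the advantage described above.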


If there’s ever a reliable, production ready, realtime path tracer, UE5 will likely be the first software to have it :smiley:

They’ve done much more than any other entity in terms of bridging the gap between offline and realtime renderers. UE5 has a path tracer. It’s not realtime by the definition of getting a final-quality frame in under 33 milliseconds, but it’s interactive. It’s only a matter of time, in terms of hardware and software progress, before it moves from interactive to realtime :slight_smile:

Anyway, the demo linked above is not very impressive. Shiny objects (such as cars) surrounded by a low-contrast, low-frequency HDRI map on a simple plane are one of the easiest scenarios for any path tracer to render.

Even CPU path tracers from 5 years ago could render this nearly in realtime on something like a Threadripper. But when you get to a true production scenario, such as a camera deep inside a lush forest, the realtime aspect falls apart pretty quickly and requires trickery to remain realtime. Epic has mastered this trickery in UE5 with Lumen, but at the cost of Lumen not being a path tracer :wink:


I’ve been re-interested in UE lately, especially since the introduction of 5.1.

What I still despise, however, is the UI and their file structures, which seem unnecessarily complicated.


Then it looks like DCCs for environment art are dying for good :face_with_thermometer: and we’ll all need to join this Fortnite hysteria. Ugh.

If nothing happens, that’s it…

Fortnite is just a game; Unreal Engine is a professional tool that was used to make that game, among many other games and non-game CG applications. UE is successful, but its success is not at the expense of Blender. Actually, I’d dare to claim the contrary: Epic was the very first corporate sponsor of the new Blender dev fund, and they set off the avalanche of other corporate sponsors joining.


That’s a joke, referencing the fact that UE got there thanks to a kids’ game

Depends on which area; for concept art / environment art… yes, it is, unfortunately :pensive:

Ah, yes… well… You make it sound like it was just luck. But I’d never blame anyone for being successful; they got the right idea at the right time :slight_smile: What makes them unlike most other companies is that they decided to share part of the wealth they made with the Blender Foundation, supporting its development long before others did.

What do you mean? That’s not Epic doing that; it’s the artists exercising their own free will to choose a tool that better suits what they need to do. The only way to revert that would be to stop them against their will. Blender is a good tool, but why would you be displeased with anyone making an even better tool?

There’s nothing for you to lose, except maybe your time investment in learning a tool which may be becoming obsolete for certain types of tasks. But that’s honestly a bit of a selfish view :slight_smile:


Nanite is mostly unrelated to the raytracing part (Lumen), if I understand correctly. It’s a way to virtualize geometry, allowing very high-res meshes to be streamed and decimated in raster space: a kind of mipmapping, but better, and for meshes. I also don’t really agree with the martial tone; these are all fascinating domains of research, and there’s no need to pit one against the other, I think.
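To make the mipmapping analogy concrete: with textures, the mip level is chosen so that roughly one texel covers one pixel, and Nanite’s trick is doing the analogous thing for triangles. A tiny illustrative sketch of the texture side only (nothing Nanite-specific):

```python
import math

def mip_level(texels_per_pixel):
    """Pick the mip level so that, after downsampling, ~one texel covers one pixel.
    Each mip level halves the texture resolution in both dimensions."""
    return max(0, round(math.log2(max(texels_per_pixel, 1.0))))

print(mip_level(1))   # -> 0: texture shown at full resolution
print(mip_level(8))   # -> 3: 8 texels crammed into one pixel, so downsample 3 times
```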

Concerning Mr. Komarov’s toy path tracer: yes, it looks really good. I love following the progress of people’s toy renderers; there are many just on Twitter.


Even funnier, I did it unintentionally. My subconscious picked the phrase, though it might have been in on the joke. :laughing:

AFAIK it’s the same principle as what USD uses, called deferred loading or lazy loading.
It only loads/streams the chunks needed for the part of the image the renderer is touching, which sidesteps RAM limits, or at least uses RAM much more effectively.

They also use some form of compression for 3D meshes, making them much smaller, and from all that mesh data they cull invisible polygons and generate a polygon soup containing all the 3D meshes in the scene.
Parts of this mesh are then remeshed (on the fly and per frame!) if the distance to the camera makes the polygons smaller than pixels.
Advanced tech that looks like magic.
I wonder what it would be like if DCCs got that type of tech implemented.
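The lazy-loading idea above can be sketched in a few lines. This is just a hypothetical illustration; the class and names are mine, not USD’s or Epic’s actual API:

```python
class LazyMeshStore:
    """Minimal lazy-loading sketch: chunks are only read from disk the first
    time the renderer actually touches them, then kept in a small cache."""

    def __init__(self, loader, cache_size=4):
        self.loader = loader          # function: chunk_id -> mesh data
        self.cache = {}               # chunk_id -> data (insertion-ordered)
        self.cache_size = cache_size

    def get(self, chunk_id):
        if chunk_id not in self.cache:
            if len(self.cache) >= self.cache_size:
                # Evict the oldest chunk to keep memory bounded.
                self.cache.pop(next(iter(self.cache)))
            self.cache[chunk_id] = self.loader(chunk_id)
        return self.cache[chunk_id]

loads = []
store = LazyMeshStore(lambda cid: (loads.append(cid), f"mesh-{cid}")[1])
store.get(0); store.get(0); store.get(1)
print(loads)  # each chunk loaded only once: [0, 1]
```

The renderer only ever pays for the chunks its rays actually touch, which is the "sidesteps the RAM" effect described above.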


I generally agree with the sentiment; however, watch this: Irradiation, followed by Irradiation - Process. Until path tracers reach this level of real-time for massive scenes, massive credit where it’s massively due. UE for now gives artists the ability to create content at such a professional level, with such convenience for real-time art direction, that I don’t think it’s fair to call it “hype” at all.


There’s nothing for you to lose, except maybe your time investment

We’re talking about Blender becoming obsolete for environment rendering; isn’t that a big deal? :thinking:


Blender is useless for environment rendering, and always has been. Every environment artist I’m aware of uses UE or Unity. It’s not going to become obsolete: it has always been obsolete. Until Blender can handle grass in real time, it’s simply not part of this discussion


There must be some preprocessing of the polys that you need to do for it to work, right? I haven’t used Nanite yet, so I’m not sure of the whole process.