Things EEVEE will NOT be good at

While I would love a LOD system as such, citing Eevee as the reason for it just doesn’t make sense.
There is no such thing as ‘overloading’ a GPU. You can task it with rendering a scene that doesn’t fit in the VRAM, but that will just severely decrease performance. If it crashes, that just means there’s a bug.

In theory at least. In practice software does crash when you push the envelope, but only because these heavy load scenarios tend to be less comprehensively tested and debugged, and not because of some kind of ‘overload’ or any such thing.

Ah right. “Overload” may not have been the right word.

Grinding the performance to a halt would be the big issue - which is what I agree on and was referring to. :grin:

Having the system grind to a halt performance-wise will often crash the software or make it hang at some point, regardless of bugs or good debugging. Without LODs, the final render would also be less stable. All render engines have this issue, and LODs are just a method to mitigate it.

LODs are a system that lets you move your camera fluidly from one point in your scene to a point far away without sacrificing performance, while actually taking more memory, as long as your objects are non-unique and appear at different distances. That's literally all they're good for. They aren't a system that mitigates RAM usage.

In open world games, these systems are more advanced of course and stream various objects and LODs in and out of memory, but such a system requires extensive preprocessing and would probably be beyond the scope of what any Blender implementation would do.

That said, a simple LOD system would be quite nice in Blender for testing out LODs for game engine assets. You can of course build one yourself with scripts or drivers or whatnot, but it’d be nice to have one built in.
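
For instance, here is a minimal sketch of such a script-based LOD switcher using Blender's Python API. The `_LOD0`/`_LOD1` naming scheme and the distance thresholds are purely illustrative assumptions, not any Blender convention:

```python
# Minimal distance-based LOD switcher sketch (Blender 2.8x Python API).
# Assumes each asset has variants named "<base>_LOD0", "<base>_LOD1", ...
import bpy

LOD_THRESHOLDS = [10.0, 30.0]  # illustrative switch distances, in scene units

def switch_lods(scene, depsgraph=None):
    cam = scene.camera
    if cam is None:
        return
    cam_loc = cam.matrix_world.translation
    # Group LOD variants by base name, e.g. "Rock_LOD0", "Rock_LOD1", ...
    groups = {}
    for obj in scene.objects:
        base, sep, level = obj.name.rpartition("_LOD")
        if sep and level.isdigit():
            groups.setdefault(base, []).append((int(level), obj))
    for variants in groups.values():
        variants.sort(key=lambda v: v[0])
        # Use the distance to the highest-detail variant to pick the level.
        dist = (variants[0][1].matrix_world.translation - cam_loc).length
        target = sum(1 for t in LOD_THRESHOLDS if dist > t)
        target = min(target, len(variants) - 1)
        for level, obj in variants:
            show = (level == target)
            obj.hide_viewport = not show
            obj.hide_render = not show

# Re-evaluate on every frame change so animated cameras work too.
bpy.app.handlers.frame_change_post.append(switch_lods)
```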

In cinema, the idea behind LODs is called forced perspective.

Cycles has caustics, but you have to set Filter Glossy to 0.1–0.2 and use branched path tracing with high AA and diffuse samples, and even then it's not that accurate and it's slow AF.
EDIT: Someone replied already, my bad.
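
For reference, those settings can be applied from Python roughly like this (the values are illustrative; the property names are from the 2.7x/2.8x Cycles API, and branched path tracing was removed from later versions):

```python
import bpy

cy = bpy.context.scene.cycles
cy.blur_glossy = 0.15              # "Filter Glossy" in the UI
cy.progressive = 'BRANCHED_PATH'   # branched path tracing
cy.aa_samples = 64                 # high AA samples (illustrative)
cy.diffuse_samples = 8             # high diffuse samples (illustrative)
```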

We have code built into UPBGE for LOD, and it's quite useful. However, there is another technique that can make everything draw much, much faster:

If objects all use one material, they can be rendered in one draw call. I'm not sure how hard this is to do with Eevee, but with a streaming megatexture, everything static in a large moving scene can be one draw-call batch.
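
As a rough illustration of the one-material idea, a script along these lines could merge all static meshes sharing a single material so the viewport draws them in (roughly) one batch. The material name is hypothetical, and joining is destructive, so this is only a sketch to run on a copy of your scene:

```python
# Sketch: join all mesh objects that use only the shared material
# "Atlas" (hypothetical name) into a single object.
import bpy

TARGET_MATERIAL = "Atlas"

meshes = [o for o in bpy.context.scene.objects
          if o.type == 'MESH'
          and len(o.material_slots) == 1
          and o.material_slots[0].material
          and o.material_slots[0].material.name == TARGET_MATERIAL]

if len(meshes) > 1:
    bpy.ops.object.select_all(action='DESELECT')
    for o in meshes:
        o.select_set(True)
    # The first match becomes the active object the rest are joined into.
    bpy.context.view_layer.objects.active = meshes[0]
    bpy.ops.object.join()
```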

If the only way to make caustics take less than a lifetime to render cleanly is to blur them out, that’s a pretty big caveat.

Granted, I don’t think caustics are even noticeable for 99% of scenes, and the user demand for them is blown out of proportion. That being said, if you are suggesting filter glossy as a solution to getting clean caustics, the caveats should be made clear.

I think with water + waves, caustics are very important.

Maybe have a 'caustics modifier' in the ocean sim: you give it a light source, and it makes the caustics with math?

(You could add another modifier and give it another lamp, etc.)

For stuff like glass with lots of facets etc., it's trickier.

Correct, scenes with water and sunlight are one of the 1% of scenes where it matters.

Such a thought process always reminds me of: "1 KB of RAM is enough." Then it was 1 MB… 1 GB… 1 TB… and so on.
All because of personal perception and the useless misconception: "It's hard, it's a long and tedious process to make, and impossible to have in real time…"

Rrr-ight. Simplify, Optimize, Fake… or even RTX-ize, Vulcanize :grin:. Use your human ingenuity, apply Brain.

WebGL: http://madebyevan.com/webgl-water/

Ahh, so that’s why it’s called a “realtime renderer”.

Erroneously.

I was talking about Eevee, not upbge. LOD makes all the sense in the world for an engine that has to render a frame in 16ms every time.

Officially:
“EEVEE is a new physically based realtime renderer.” (https://www.blender.org/2-8/). And also basically everywhere online where someone talks about Eevee. Which raises expectations (i.e. a Unity- or Unreal-like engine).

It does not deliver realtime performance, nor is it meant to, unless you count seconds per frame as realtime. It’s meant to deliver the highest quality possible with gpu rasterization no matter how long it takes. It uses some technologies common in realtime rendering, but that’s about it, no matter what someone absent-mindedly wrote on a website somewhere.

Look, I’m not really interested in arguing about it. If you feel so strongly about calling it a realtime engine, you go on and do that.

These are my fake caustics: I use the same offset vector that drives the noise for the waves to drive the caustics.

I think one could use a projected lamp and the noise pattern to 'fake it'.
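
A minimal sketch of that projected-lamp idea using light nodes (Cycles only, since Eevee ignores light node trees; the names and scale value are just illustrative guesses, using the 2.8x API):

```python
# Spot light whose strength is modulated by a noise texture,
# as a cheap stand-in for water caustics.
import bpy

light_data = bpy.data.lights.new("FakeCaustics", type='SPOT')
light_data.use_nodes = True
nodes = light_data.node_tree.nodes
links = light_data.node_tree.links

# The default light node tree already contains an "Emission" node.
emission = nodes["Emission"]
noise = nodes.new("ShaderNodeTexNoise")
noise.inputs["Scale"].default_value = 25.0  # illustrative value
links.new(noise.outputs["Fac"], emission.inputs["Strength"])

light_obj = bpy.data.objects.new("FakeCaustics", light_data)
bpy.context.collection.objects.link(light_obj)
```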

About batching: I think that Eevee could potentially do draw batching and display much larger scenes in real time (it's GLSL in the end, no?).

The goal of Eevee is to produce an engine capable of delivering very high-quality results in a very short period of time (to the point where realistic animation can be done on home hardware).

That means the engine uses a few rasterization techniques that can't be used in Unity and Unreal because they are just too slow to compute on modern GPUs. Now, you can tone down or turn off some features if you really need 60 frames per second.

Here is a fake caustic effect applied to an ocean modifier object in Cycles:

Haha, yes, I don't want to argue with you either.
It's just something that has bothered me a bit, that Eevee is marketed as a realtime renderer, which it isn't. The problem is, it's not something that someone wrote absent-mindedly on a website somewhere; it's consistently called realtime by the Blender Foundation on the official website. Which I think causes some confusion in users, who then actually expect realtime results à la Unity/Unreal.

Now, to get back to the topic a bit: I actually do have a use case where I use Eevee (almost) in real time. I've been testing the Blender XR project at work for VR visualisation. If it worked well enough, it could simplify my "Blender to Unity to VR" workflow and eliminate Unity completely. VR glasses have higher realtime requirements: two views at 90 fps, so basically 180 fps. With the proper hardware, I can get acceptable results on small scenes, but with increasing geometry the framerate drops. More optimization here could make Blender a powerful VR tool.

Unity is a realtime engine, as is UE4. If I make a scene so heavy and full of post processing that it runs at 2 frames per second, it doesn’t mean that Unity and UE4 should stop calling themselves realtime engines…
