So far 2.8 (more specifically EEVEE) seems like it is turning out to be a monster. I am loving using it (crashes aside).
I only have experience with 2 render engines (Cycles/Renderman) in my short 3D career. I have never even touched Blender Internal.
Because I only have experience in raytracing engines I would like to know things that are NOT advisable to do in EEVEE.
I recently found out that mesh lights will not be possible in Eevee, and ACE_Dragon mentioned a few more (Eevee also does not support true soft shadows, random walk SSS, and mesh shaped volumetric shading).
What I would like for this thread is to gather the things that Eevee will be weak at or can't do at all, so we can find workarounds or know that we will need to use Cycles for a project.
I understand that Unity is a little more advanced than Eevee (which is understandable), but what features can it do that Eevee will not be able to do? I am assuming that in the years to come Blender will take some features from Unity/Unreal, so I can look to those to see what might be next for Eevee.
That’s not true. Eevee has support for indirect light bounces, soft shadows, SSS and mesh-shaped volumes.
The thing is that it is not the same level of support as in Cycles. It gives less freedom.
These effects may require more set-up, more memory, and limited camera moves, for a lower quality than Cycles but a faster render time.
So, these effects have to be used sparingly with EEVEE.
Actually, EEVEE is under construction. At an early stage like this one, nothing is definitive.
The only way to really know what the engine is capable of, today, is to use it and practice.
Try to accomplish something corresponding to your needs.
If it does not work for now, that does not mean the situation will persist.
The announced goal was, at least, to restore everything BI has, taking Unity and Unreal Engine as examples.
Will EEVEE in 2.80 satisfy your needs?
Just be patient.
The EEVEE section of the 2.80 release log will be filled in one day. https://wiki.blender.org/wiki/Reference/Release_Notes/2.80
EEVEE is a different engine than Unity. It will not have exactly the same features. It will evolve in a different way, according to the support of its community.
The hard part is when those effects are combined. With a multi-pass approach like in Eevee, you either do SSS before GI or GI before SSS - but not both at the same time. A path tracer can easily calculate light contribution that bounces off a wall, passes through a volume, scatters through jelly and then illuminates another wall. That would be very hard for Eevee or game engine renderers.
Eevee CANNOT DO real soft shadows. Eevee can do filtered shadows (and it calls these soft shadows), and limited screen space shadow casting. Soft shadows are a side effect of lamps with an area greater than zero, where shadow umbra and penumbra are automatically calculated (or rather sampled) based on the size of the light source. There is currently no robust way to solve this problem at anywhere close to real time speeds, although there are some very promising papers being presented this year on the subject.
Rant aside, we should be very careful in separating the terms “soft shadow” and “filtered shadow”. They represent very different things in computer graphics.
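For reference, the light-size dependence described above is what penumbra-estimation schemes such as PCSS approximate: the penumbra widens with the area of the light and with the receiver's distance behind the blocker. A minimal sketch of the standard similar-triangles estimate (this is a generic illustration, not Eevee code):

```python
def penumbra_width(light_size, receiver_depth, blocker_depth):
    """PCSS-style penumbra estimate: shadow softness grows with
    light size and with how far the receiver sits behind the blocker."""
    return light_size * (receiver_depth - blocker_depth) / blocker_depth

# A receiver just behind the blocker gets a near-sharp shadow...
sharp = penumbra_width(light_size=1.0, receiver_depth=2.1, blocker_depth=2.0)
# ...while a distant receiver gets a much wider penumbra.
soft = penumbra_width(light_size=1.0, receiver_depth=8.0, blocker_depth=2.0)
assert soft > sharp
```

A plain filtered shadow, by contrast, blurs with a fixed kernel regardless of these distances, which is exactly the distinction being drawn here.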
SSS: Eevee can do SSS using the Cubic or Burley algorithm, but does not do the cutting-edge Random Walk technique that only recently found its way to Cycles.
Volumetrics: last I read, Eevee needs a sort of domain to do this type of effect. What I meant was creating a mesh and giving the material a volume property together with surface shading, like with Cycles.
True soft shadows: see the wiki article on how Clement couldn’t put together a solution where the shadows are dependent on light size and get softer the further you go from the object casting them. Though Eevee could theoretically do them during F12 rendering, where realtime feedback isn’t critical (but it will be a bit slower).
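As a side note on the Burley algorithm mentioned above: its published approximate reflectance profile is a sum of two exponentials over the distance from the entry point. A minimal sketch with a simplified parameterization (not the actual Cycles/Eevee implementation):

```python
import math

def burley_profile(r, d):
    """Approximate Burley reflectance profile: two exponentials over
    distance r, with shaping parameter d tied to the scatter distance."""
    return (math.exp(-r / d) + math.exp(-r / (3.0 * d))) / (8.0 * math.pi * d * r)

# Scattered light falls off monotonically with distance from the entry point.
assert burley_profile(0.1, 0.5) > burley_profile(1.0, 0.5)
```

Random Walk, by contrast, traces actual paths through the medium instead of evaluating an analytic profile, which is why it handles thin and curved geometry better.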
My intention was not to suggest that EEVEE gives the same result as Cycles with the same techniques, but to explain that, like Unity (since the focus of the question was a comparison between realtime engines), EEVEE will use other techniques to satisfy similar needs.
It looks like the density attribute is no longer supported. So no, it is not even possible.
I shared a render, months ago, of a smoke simulation.
I was left with that impression, but it is no longer possible with today’s build.
I have no doubt, though, that it will be possible again in the future.
If such simulations are handled, a mesh-defined shape of the same precision will be, too.
It is OK to inform people that these effects will have lots of limitations and an effort/quality balance that is not always satisfying.
But we also have to inform people that EEVEE will not just be limited to screen space reflections.
What I meant is that these are limitations in Eevee as of right now. I don’t have the development plans for Eevee from now until release, so there’s always a chance that some of them could be relaxed or even eliminated (but total parity with Cycles in terms of what it can do is unlikely for the near term).
We will support pre-rendered HDRIs followed by in-scene, on-demand generated probes. This lets scene objects influence each other (reflections, diffuse light bounces, …).
We need the viewport to always be responsive, and to have something to show while probes are calculated. Spherical harmonics (i.e., diffuse only) can be stored in the .blend for quick load while probes are generated.
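The diffuse-only storage mentioned here can be sketched as a generic band-0/1 spherical-harmonics projection: each probe keeps only four coefficients per color channel, which is why it is cheap enough to cache in the .blend. This is an illustration of the general technique, not Eevee's actual code:

```python
import math

# Band-0/1 real spherical harmonic basis constants.
Y00 = 0.282095
Y1 = 0.488603

def project_sh(samples):
    """Project (direction, radiance) samples into 4 SH coefficients
    (Monte Carlo estimate over the sphere: mean * 4*pi).
    Directions are assumed to be unit vectors, uniformly distributed."""
    coeffs = [0.0, 0.0, 0.0, 0.0]
    for (x, y, z), radiance in samples:
        basis = (Y00, Y1 * y, Y1 * z, Y1 * x)
        for i in range(4):
            coeffs[i] += radiance * basis[i]
    weight = 4.0 * math.pi / len(samples)
    return [c * weight for c in coeffs]
```

With uniform radiance the three directional coefficients cancel out and only the constant band-0 term survives, which matches the intuition that SH captures the broad, low-frequency part of the lighting.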
Time cache should also be considered, for responsiveness.
Can’t he randomly move the point light source within the light’s area each sample and accumulate the shadow, like the preview render in Substance or one of the modes in Mitsuba does? I don’t see why this wouldn’t work for the viewport, except that the shadow would momentarily turn sharp while something is changing on screen.
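The idea above amounts to a Monte Carlo visibility average: each sample tests a hard shadow against a jittered light position, and the running mean converges to a soft shadow. A minimal sketch, where `occluded(p, q)` is a hypothetical visibility query standing in for whatever ray/shadow-map test the engine uses:

```python
import random

def soft_shadow(point, light_center, light_radius, occluded, samples=64):
    """Average hard-shadow tests against jittered light positions;
    converges to a soft shadow as samples accumulate."""
    lit = 0
    for _ in range(samples):
        # Jitter the point light within a square approximation of the
        # light's area, keeping its height fixed.
        jx = light_center[0] + random.uniform(-light_radius, light_radius)
        jy = light_center[1] + random.uniform(-light_radius, light_radius)
        if not occluded(point, (jx, jy, light_center[2])):
            lit += 1
    return lit / samples  # 0 = fully shadowed, 1 = fully lit
```

The viewport behavior described above falls out naturally: while the camera or scene moves, the accumulation restarts at one sample, so the shadow briefly looks like a hard shadow before it converges again.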