What's the relevance of Eevee?

Good luck trying to do stuff like this in Eevee (you can only do so much with probes, and the sheer number of nodes required would be quite hard on GPUs).

In short, the thread shows examples of advanced effects that would still need pathtracing.

That’s legitimate. I have similar feelings about the updated Grease Pencil 2D animation. That’s cool and all, but I’m never going to use it. I’d even venture to say that the large majority of users will never use it. I don’t mind that it is there, but if those development hours could have been spent on something that improves my situation, I’d be even happier.

But developers aren’t widgets, and I can’t pull one developer off a project that I find boring to get them working on one that I am really interested in. Martin Felke has been working for over a year on the fracture modifier branch, a branch that has been deemed not worthy of merging to master. While I’m sure Martin is a talented programmer who could benefit the master codebase in significant ways, his passion lies in the fracture modifier. I can’t transplant that passion, especially for a volunteer programmer! I can’t transplant Antonio Vasquez’s passion for Grease Pencil animation into Cycles coding, nor can I transplant Clement Foucault’s passion for real-time PBR.

Indy_logic, none of this is meant as a condemnation or shaming for what you said. This is my own experience and thought process.

I’m not getting anything when I do an OpenGL render. How did you get that working?

It’s safe to say that none of this is final and there are a ton of features and polishing left to do. For now I would try hard to assume the best of the developers and give it time to develop. It’s literally pre-alpha code right now.

Yeah, I totally get it. But actually I just realized that you actually could offer to pay a developer for a specific time period to finish or flesh out a feature you needed. Just food for thought. :wink:

Luckily, the nodes that plug into Eevee are the same nodes that plug into Cycles. Anything that happens before the shader nodes can be plugged directly into the Eevee material outputs:


While there certainly is more noise, due to the lack of multiple samples to average out the high-frequency noise map, all the principles are in place. And AA is on the roadmap, so those additional samples might be enough to smooth it out.
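To illustrate what “the same nodes” means in practice, here is a minimal bpy sketch (node and property names are assumed from the 2.8-era Python API and may differ in a given pre-alpha build; treat this as illustrative, not authoritative). It only runs inside Blender:

```python
# Hypothetical sketch: one material node tree that both Cycles and
# Eevee can consume. Node names assumed from 2.8-era defaults.
import bpy

mat = bpy.data.materials.new("SharedShader")
mat.use_nodes = True  # creates a default Principled BSDF + output
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# Everything *before* the shader node is engine-agnostic:
# a procedural texture feeding the shader works in both engines.
noise = nodes.new("ShaderNodeTexNoise")
bsdf = nodes["Principled BSDF"]

links.new(noise.outputs["Fac"], bsdf.inputs["Roughness"])

# Switching scene.render.engine between 'CYCLES' and 'BLENDER_EEVEE'
# reuses this material unchanged; only the shader evaluation differs.
```

The point being made in the post is exactly this: the texture/math portion of the tree is shared, and only the final shading step differs per engine.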

edit - a more polished iridescence shader w/ flakes:




That has been considered as well, but what happens when your sponsored code doesn’t get accepted into master? If Martin Felke had $20,000 in Kickstarter donations to code the fracture modifier, and he did it, but the foundation didn’t want to take the risk of adopting code that might not meet the standards of the codebase, how do you think all those donors would feel? Pretty pissed, I bet.

It certainly isn’t impossible, but it has some complexities that can be difficult to navigate.

Not to detract from your point, but I should think in this theoretical case, Mr. Felke could go one extra step and make it a plugin. It would likely have to be part of any developer’s thinking if they’re off the beaten path.

Not to detract from your point either, but if Blender had a C++ API it certainly would be a viable option; generating and animating thousands of shards via Python isn’t really an option, though. If you wanna hear all about why Blender needs a C++ API, track down Sainthaven; he’d be happy to talk about that.

Many months back (when 2.8 was just starting), Martin Felke was in contact with the developers about the changes needed in Blender to implement his patch properly.

I don’t know the status of that as I haven’t seen any serious push on physics yet, but perhaps a few changes (such as with the depsgraph) are being made with its future integration in mind (though you will need to ask a developer if you want accurate information).

Like any good software that is supposed to last, it needs good structure and code. I think developer Martin Felke would take that into consideration when developing something for Blender.

Most developers are volunteers and work on the stuff they want to. Blender Foundation cannot force people outside of the foundation to work on parts they deem important.

Grease Pencil is an example of this. You may not use it, but its development doesn’t hold back Cycles, since its developer wouldn’t (want to) work on it anyway.

Eevee is another matter, since it is funded and developed by the Blender Foundation. Therefore there needs to be a rationale behind it.

Being able to visualize PBR in real time will speed up the production of 3D assets for video games by a huge margin.

Wouldn’t it be more useful to have Cycles integrated into the viewport? If you put the viewport into ‘Rendered’ mode, it essentially renders with Cycles, but all the handles and symbols for lights, cameras, and Object Mode disappear. All one really needs is to have these layered over Cycles.

@sologix, obviously you did not see this video.

It is already how it works in 2.8 builds.

Hi.
Just to mention that OpenGL Render Image has been working since the first test I did in 2.8 branch a while ago. What do you mean you are not getting anything? Bad result or really anything?

Haven’t looked into Eevee (why does that remind me of Pokemon?) yet, is it really that good?

When the engine is Blender Render or Blender Game -> OpenGL render gives me grey values interpreted as alpha values.
When the engine is Cycles -> OpenGL render gives me the grey background from the 3D View theme plus the outline overlay, wires, grid, and other overlays, but no geometry or Cycles background.
When the engine is Eevee -> OpenGL render gives me the same result but with a black background. Only overlays, without my Eevee scene content.

OpenGL render is only giving a satisfying result with the Clay engine. That is probably what you are talking about, YAFU.
But it is not relevant for Eevee.

Not really, I mean the Eevee engine and Render > OpenGL Render Image.
Does it not work for you in this file?:
http://pasteall.org/blend/index.php?id=46899

Currently from camera view I’m having some artifacts in the first sphere (this did not happen in other previous versions), but works well from any other view.

Ok, I think I know what’s going on.
You have OpenGL optimization options in “Render” > “OpenGL Render Options”. Check the “Full Sample” option there. I also have “Anti-Aliasing” selected there to prevent jagged edges (as in the viewport), with Anti-Aliasing Samples = 16. You can also select the Alpha mode (Sky or Transparent).
Well, those options have always been there for OpenGL renders.
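For reference, the same options can also be toggled from Python. This is a hedged sketch using the 2.7x-era property names on `bpy.types.RenderSettings` (the exact names may differ in a 2.8 pre-alpha build; hover the UI controls with tooltips enabled to confirm them). It only runs inside Blender:

```python
# Illustrative config fragment: the OpenGL render options from the
# post, set via the Python API. Property names assumed from 2.7x.
import bpy

scene = bpy.context.scene
scene.render.use_full_sample = True        # "Full Sample"
scene.render.use_antialiasing = True       # "Anti-Aliasing"
scene.render.antialiasing_samples = '16'   # enum stored as a string
scene.render.alpha_mode = 'TRANSPARENT'    # or 'SKY'
```

Scripting these is handy if you batch OpenGL renders of several files and want consistent settings without opening each one in the UI.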

Wow ! Thank you !
I never would have discovered by myself that an anti-aliasing option could make the difference between a correct render and no render at all.