Cycles Development Updates

(xalener) #706

Node groups are similar in practice, in that you use them the same way, but I think the need for instances is more about performance / cache size.

I’m not well versed in it, but I know it speeds things up significantly for real-time application development.

(moony) #707

Any update on the new features in Voronoi texture? Are they likely to make it into 2.8 beta?

(BlackRainbow) #708

It seems like having a Subdivision modifier before the hair particle system makes hair flicker (as if the hair seed changes every frame). No problem in the viewport, just on the Cycles side.

Edit: It seems the subdiv is not causing the flicker. The Alembic modifier is.
Edit 2: Confirmed, hair after Alembic flickers even if the vertex count is not changing. The flicker is not in the viewport; it’s in the animation render, in both Eevee and Cycles. For now, re-exporting the animation to MDD seems like a workaround.

(Charlie) #709

It’s on the patch tracker but I don’t think anyone has looked at it yet. Sergey is away I think so Brecht probably has less time on his hands. It’s a fine balance between pushing the devs and hassling them!

I suppose adding support on the tracker can help!! :stuck_out_tongue_winking_eye:

(Charlie) #710

I’ve also added a patch for a new vector rotation node with sockets!!

(<== Lost? Click Me) #711

Plus the Gabor noise too. :+1:

(Charlie) #712

Yeah that too. Fingers crossed it gets in!

(Klutz) #713

What does that mean? What’s mdd?

(BlackRainbow) #714

MDD is an animation cache file format (Lightwave Point Cache) that doesn’t support a changing vertex count, but (in Blender) it is much faster and much more stable than Alembic.

(Klutz) #715

Thanks! Wasn’t aware of that.

(sozap) #716

Awesome! Thanks!

(moony) #717

Eevee has some nice post-render effects, particularly its glare effect, which looks much better than the one in the Cycles compositor.

Will it be possible to use Eevee’s effects on Cycles renders, or are they totally incompatible?

(SynaGl0w) #718

I really would like to see easy compositing of cycles with passes from eevee. Viewport preview of that would be great as well.

(abdoubouam) #719

Yes. The compositor can do that glare fairly easily; you just have to know the math behind the EEVEE effect to make it look the same.
I’m not an expert and I haven’t tried to implement it myself, so correct me if I say anything wrong, but I’ve read a handful of articles about it, and this is what I understood from them.
Basically, how it works:
- scale down to 1/2
- lower the brightness and filter out the dark areas
- apply a horizontal blur, then a vertical blur (a full 2D Gaussian blur is expensive, so it’s done separably)
- scale back up to full size, with either bilinear interpolation (fast, can be blocky) or bicubic interpolation (slower but better)
- apply it on top of the original render with an RGB Mix node.

That’s what I use if I want a soft glow, btw. Not the exact parameters as EEVEE, but it’s the same principle.
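The steps above can be sketched in NumPy on a grayscale image. This is a toy illustration of the principle, not Eevee’s actual implementation; the `threshold`, `radius`, and `strength` parameters (and their defaults) are made up:

```python
import numpy as np

def box_blur_1d(img, radius, axis):
    """Cheap 1D box blur along one axis; doing H then V approximates a 2D blur."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.apply_along_axis(
        lambda v: np.convolve(v, kernel, mode="same"), axis, img)

def bloom(img, threshold=0.5, radius=1, strength=1.0):
    """Toy bloom following the steps above (2D float array in, same out)."""
    h, w = img.shape
    # 1. scale down to 1/2 (simple 2x2 average)
    small = img[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    # 2. filter out the dark areas
    bright = np.where(small > threshold, small, 0.0)
    # 3. horizontal blur, then vertical blur (separable = cheaper)
    blurred = box_blur_1d(box_blur_1d(bright, radius, axis=1), radius, axis=0)
    # 4. scale back up (nearest-neighbour here; bilinear/bicubic look better)
    up = np.kron(blurred, np.ones((2, 2)))
    # 5. mix the glow on top of the original render
    out = img.copy()
    out[:up.shape[0], :up.shape[1]] += strength * up
    return out
```

A real implementation would use bilinear filtering on the up/downscale and a Gaussian-ish kernel, but the structure (downscale, threshold, separable blur, upscale, mix) is the same.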

For the DoF, I think EEVEE’s is better than the compositor effect in some aspects. Here is how it works according to most documents I’ve read about real-time rendering (not necessarily how EEVEE does it, but you’ll get an idea):
- Start by taking the Z-depth pass, then extract from it the areas that are too close and the areas that are too far (separately).
- Apply blur horizontally and then vertically, possibly several times with varying strength if performance allows. Make sure it doesn’t bleed (I’m not sure whether it’s a modified directional blur that stops when it “sees” a big change in value, or whether it just uses the near and far masks created beforehand).
- Go back to the original image, darken it, and extract the areas that should be both out of focus and bright; from those you can draw individual bokeh sprites (that’s where the compositor falls short, because it has no loops and can’t iterate, but you can simply use the built-in bokeh filter, even if it’s slower).
- Composite everything on top of the original, and voilà.
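Skipping the bokeh-sprite step, the mask-and-blur core of that description can be sketched like this. All names and parameters (`focus`, `focus_range`, etc.) are invented for illustration; this is not EEVEE’s code:

```python
import numpy as np

def separable_blur(img, radius):
    """Horizontal then vertical box blur, as in the steps above."""
    k = np.ones(2 * radius + 1) / (2 * radius + 1)
    img = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, img)

def simple_dof(img, depth, focus, focus_range=0.1, radius=1):
    """Toy DoF: near/far masks from the Z-depth pass, blur, composite."""
    # separate masks for areas that are too close and too far
    near = np.clip((focus - depth - focus_range) / focus_range, 0.0, 1.0)
    far = np.clip((depth - focus - focus_range) / focus_range, 0.0, 1.0)
    oof = np.clip(near + far, 0.0, 1.0)   # combined out-of-focus weight
    blurred = separable_blur(img, radius)
    # lerp sharp vs blurred per pixel; premultiplying by the mask before
    # blurring would reduce the bleeding mentioned above
    return img * (1.0 - oof) + blurred * oof
```

Real implementations blur at multiple strengths and handle the near/far layers separately to avoid the in-focus subject bleeding into the background blur; this single-lerp version is the minimal idea.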

(moony) #720

I know it can - but it doesn’t show in viewport and IMO doesn’t look as good even in final render.

(Lsscpp) #721

Having it as an “in camera” lens effect would be great, actually. Food for Right-Click Select.

(Hadriscus) #722

Yeah, I can see lens effects increasingly being done in-camera in the future. Provided it’s still flexible in post, I wouldn’t mind it.

(abdoubouam) #723

Now that the render capabilities are a lot better, maybe it’s a good idea to let users create their own fragment shaders? Not necessarily with nodes, if that’s too complex to be a priority for the devs, but at least allow applying a GLSL script to the viewport.
Some examples: vignette, other types of DoF, grain, desaturation, sharpness/blur, screen-space reflections with another approach…

(noki paike) #724

The next versions of Blender will surely take this path… both because Eevee will become much more powerful with new GPUs, and because hardware in general keeps improving… and because at some point the gap between the two render engines will become so thin that mixing them, even at the compositing level, will come naturally.