Glints

http://cseweb.ucsd.edu/~ravir/glints.pdf


Specular BRDF rendering traditionally approximates surface microstructure using a smooth normal distribution, but this ignores glinty effects, easily observable in the real world. While modeling the actual surface microstructure is possible, the resulting rendering problem is prohibitively expensive. Recently, Yan et al. [2014] and Jakob et al. [2014] made progress on this problem, but their approaches are still expensive and lack full generality in their material and illumination support. We introduce an efficient and general method that can be easily integrated in a standard rendering system. We treat a specular surface as a four-dimensional position-normal distribution, and fit this distribution using millions of 4D Gaussians, which we call elements. This leads to closed-form solutions to the required BRDF evaluation and sampling queries, enabling the first practical solution to rendering specular microstructure.
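For anyone who wants a feel for what those "elements" do, here is a deliberately simplified numpy sketch (my own illustration, not the paper's code): each element carries a surface position and a projected normal, and evaluating the BRDF over a pixel footprint amounts to weighting every element by how much it overlaps the footprint spatially and how close its normal sits to the queried half vector. The actual method uses anisotropic 4D Gaussians evaluated in closed form with an acceleration hierarchy, so treat this purely as intuition.

```python
import numpy as np

def pndf_sketch(elements, footprint_mu, footprint_sigma, s_query, sigma_r=0.005):
    """Toy evaluation of a position-normal distribution.

    elements        -- list of (u_i, s_i): 2D surface position and 2D projected
                       normal at the centre of each Gaussian element
    footprint_mu    -- centre of the Gaussian pixel footprint on the surface
    footprint_sigma -- footprint standard deviation
    s_query         -- projected half vector being queried
    sigma_r         -- intrinsic roughness of each element
    """
    total = 0.0
    for u_i, s_i in elements:
        # spatial weight: how much of this element lies under the pixel footprint
        w_pos = np.exp(-np.sum((u_i - footprint_mu) ** 2) / (2 * footprint_sigma ** 2))
        # normal weight: how close the element's normal is to the queried half vector
        w_nrm = np.exp(-np.sum((s_i - s_query) ** 2) / (2 * sigma_r ** 2))
        total += w_pos * w_nrm
    return total
```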


Yep - there are quite a few papers on this subject. They have been posted on these boards before. It would be nice to see such things incorporated into Cycles at some point.

http://people.eecs.berkeley.edu/~lingqi/

Note to associate with an older thread: scratches around reflection (check posts #18 & #19)

Believe it or not, it’s already possible to an extent in Cycles


Of course an official implementation (possibly added to the anisotropic node) would have the advantage of better sampling and higher performance, but it doesn’t seem to me like it would be too much of a project for an experienced developer.

Wow. Amazing. How the heck did you come up with taking the bumped normal output and using it as the tangent? Obviously, for best effect, you should use this on top of another material, which is another shader call, but it’s a great workaround nonetheless.

Simple intuition based on the knowledge of the ‘relief’ effect that the bump node produces (in terms of output).

In a nutshell, Cycles can do some interesting things if you purposely abuse various features (by connecting them in ways they weren’t originally designed for, something that’s especially true for anything with purple sockets).
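I haven’t seen the exact node setup from post #4, so take this as a guess at the general idea rather than a reproduction: in scripting terms, the “abuse” described above is feeding a Bump node’s perturbed Normal output into the Anisotropic BSDF’s Tangent input, with a fine noise texture (just a placeholder here) supplying the scratch/flake detail.

```python
import bpy

mat = bpy.data.materials.new("GlintScratches")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.clear()

output = nodes.new("ShaderNodeOutputMaterial")
aniso  = nodes.new("ShaderNodeBsdfAnisotropic")
bump   = nodes.new("ShaderNodeBump")
noise  = nodes.new("ShaderNodeTexNoise")   # placeholder for whatever scratch/flake texture you prefer

noise.inputs["Scale"].default_value = 400.0   # very fine detail, well below pixel size at a distance
aniso.inputs["Roughness"].default_value = 0.15
aniso.inputs["Anisotropy"].default_value = 0.8

# The "abuse": feed the bump node's perturbed Normal into the Tangent input,
# so the anisotropy direction varies per shading point and sparkly scratches appear.
links.new(noise.outputs["Fac"], bump.inputs["Height"])
links.new(bump.outputs["Normal"], aniso.inputs["Tangent"])
links.new(aniso.outputs["BSDF"], output.inputs["Surface"])
```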

Yeah, I think I heard some talk about this in the Cycles easter egg thread.

I think Ace Dragon was the one who brought it up.

Funny, I was going to post it:

I think the main point here is the performance and accuracy of doing it.

The method takes an uneven, detailed surface and breaks each of its pixels down into pieces that are covered in thousands of microfacets, which are light-reflecting points that are smaller than pixels. A vector that’s perpendicular to the surface of the material is then computed for each microfacet, aka the point’s “normal.” This “normal” is used to figure out how light actually reflects off the material.
According to Ramamoorthi, a microfacet will reflect light back to the virtual camera only if its normal resides “exactly halfway” between the ray projected from the light source and the ray that bounces off the material’s surface. The distribution of the collective normals within each patch of microfacets is calculated, and then used to figure out which of the normals actually are in the halfway position.
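That “exactly halfway” condition is just the half-vector test from microfacet theory; a tiny numpy sketch of the check (illustrative only):

```python
import numpy as np

def contributes(n, wi, wo, tol=1e-3):
    """True if a microfacet with unit normal n mirror-reflects the incoming
    light direction wi into the outgoing view direction wo."""
    h = wi + wo
    h /= np.linalg.norm(h)           # the half vector
    return np.dot(n, h) > 1.0 - tol  # the normal must sit (almost) exactly on it
```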

Ultimately, what makes this method faster than the current rendering algorithm is that it uses this distribution system instead of calculating how light interacts with each individual microfacet. Ramamoorthi said that it’s able to approximate the normal distribution at each surface location and then compute the amount of net reflected light easily and quickly. In other words, expect to see highly-realistic metallic, wooden, and liquid surfaces in more movies and TV shows in the near future.
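To make the speed argument concrete, here is a toy comparison (my own sketch; the paper fits millions of anisotropic 4D Gaussians rather than one blob per patch): the brute-force path tests every microfacet normal against the half vector, while the distribution path evaluates one pre-fitted function per query, independent of the microfacet count.

```python
import numpy as np

def energy_per_facet(normals, h, tol=1e-3):
    # Brute force: test every microfacet normal in the patch against the
    # half vector h. Cost grows with the number of microfacets.
    return np.count_nonzero(normals @ h > 1.0 - tol) / len(normals)

def energy_from_distribution(mu, sigma, h):
    # Instead, fit a distribution to the patch's normals once (here a single
    # isotropic Gaussian with mean mu and spread sigma) and evaluate it at h.
    # One evaluation per query, regardless of how many microfacets the patch holds.
    d = h - mu
    return float(np.exp(-0.5 * np.dot(d, d) / sigma ** 2))
```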

Did a quick test of Ace’s node setup in #4. Not an exact copy, and it also does other things such as plastic spec with weak scratches on the topcoat, metallic with scratches, etc. So, far from perfect, but still kinda cool to do something that looks like this kind of light influence when we can’t really do it :smiley:


It’s just 250 frames ping-ponged a couple of times. Somehow got converted to 480, but still shows the effect.

Is it possible to do in real time? This + UPBGE’s new cube map reflections…

Usually though, it’s not that easy to translate cutting-edge offline rendering technology to realtime without a noticeable sacrifice in quality and physical correctness (an example is the paper on the new area light tech in Unity; the realtime GGX shading it uses has a noticeable error compared to the raytraced model).

Would like to see something like this implemented as a real shader, with outputs for glints/glitter and concentric radial rings. Maybe some threshold for how much brightness is required before it starts showing (I’m guessing only really bright light sources?) in order to reduce the penalty for an extra shader call. “Proper” roughness wouldn’t be required either, I think, since on rougher surfaces the real reflections seem to drown out the effect of the glints.
Here is a test I just uploaded, used together with a custom glass shader. I forgot to reduce the fake caustics by height, so that part looks horrible. The normals are completely smooth, no bumping going on. The glinting effect fades in at the start and out at the end.