Cycles new microdisplacement testing, discussion and blend sharing

(Ace Dragon) #521

The path idea sounds interesting, but perhaps you should first try the new feature that lets you base the dicing point of view on a camera other than the active one (the new Dicing Camera field).

Mai developed that feature with the intent of reducing any possible flickering issues.

(Ace Dragon) #522

I've done some initial testing with the new “displacement as vector” feature.

The fact that you can now displace in any direction is truly the next step in the march toward ever more detailed scenes in Cycles (a simple Clouds texture alone, provided you give the displacement node a slightly rotated normal input, is enough to make something like ocean waves a little more convincing).
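The difference between the old and new behavior can be illustrated with a minimal Python sketch. This is not the Cycles implementation; the function names and inputs are purely illustrative:

```python
# Illustrative sketch (not Cycles code): scalar vs. vector displacement.

def scalar_displace(p, n, height):
    """2.79-style displacement: a point can only move along its shading normal."""
    return tuple(pi + ni * height for pi, ni in zip(p, n))

def vector_displace(p, d):
    """Vector displacement: a point may move in any direction, so
    overhangs become possible."""
    return tuple(pi + di for pi, di in zip(p, d))

p = (0.0, 0.0, 0.0)
n = (0.0, 0.0, 1.0)

print(scalar_displace(p, n, 0.5))           # moves straight up the normal
print(vector_displace(p, (0.3, 0.0, 0.2)))  # leans sideways as well as up
```

Feeding a rotated normal into the displacement input, as described above, amounts to swapping `n` for a tilted vector before the displacement is applied.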

Unidirectional displacement is also possible by skipping the displacement node in favor of plugging in a Combine XYZ node (with interesting implications for terrain, for starters).
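The Combine XYZ trick boils down to building a displacement vector with only one non-zero axis, so every point moves along that world axis regardless of its normal. A minimal sketch (illustrative names, not node code):

```python
# Illustrative sketch of unidirectional displacement via a Combine XYZ-style
# vector: only the chosen axis moves, independent of the surface normal.

def combine_xyz(x=0.0, y=0.0, z=0.0):
    """Mimics a Combine XYZ node: builds a vector from separate components."""
    return (x, y, z)

def displace(p, d):
    return tuple(pi + di for pi, di in zip(p, d))

height = 0.7  # e.g. a heightmap/terrain texture value sampled at this point
d = combine_xyz(z=height)
print(displace((1.0, 2.0, 0.0), d))  # (1.0, 2.0, 0.7): pure world-Z motion
```

Because the normal never enters the calculation, this behaves like a classic heightfield, which is why it suits terrain so well.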

(Lsscpp) #523

Can you provide an example/screen please?

(Ace Dragon) #524

> Can you provide an example/screen please?

Red areas denote where the displacement produced genuine overhanging geometry.

For this surface at least, it was far easier to skip the displacement node in favor of the Combine XYZ node (rotation using mapping nodes can be tricky business). This result is not possible in 2.79 or any version before it.

Also, the cool looking wireframe picture to the right is actually the normal variance debug pass for the denoiser (because it looked good for this example).

(brecht) #525

Taking the full camera path for each shot into account is something we have discussed before; it would be interesting to try at some point. It is a bit awkward to fit into render pipelines that usually consider just one frame at a time, and efficiently computing the distance from a vertex to many camera locations is a fun problem, but it's probably doable.

There are still cases where it would not help, though: moving or deforming objects, camera and object moving together, long shots that would use too much memory, etc. I haven't heard of anyone with a good solution for making all this fully automatic, but there are a lot of tricks like this we can try to get closer. There are more urgent things to improve in the adaptive subdivision code, though.
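The "distance from a vertex to many camera locations" problem above can be sketched with a brute-force minimum over sampled camera positions. A real implementation would need an acceleration structure (e.g. distance to path segments or a BVH); this sketch, with illustrative names, only shows the idea:

```python
import math

def min_distance_to_path(vertex, camera_positions):
    """Brute force: the nearest the camera ever gets to this vertex,
    over all sampled positions, drives its dicing rate."""
    return min(math.dist(vertex, c) for c in camera_positions)

# Hypothetical camera path: moving along world X, sampled once per frame.
path = [(float(x), 0.0, 2.0) for x in range(11)]
v = (4.2, 1.0, 0.0)

d = min_distance_to_path(v, path)
# Closer approach -> finer dicing; clamp to avoid runaway subdivision.
dicing_scale = max(d, 0.1)
print(dicing_scale)
```

With one fixed camera this reduces to the current single-frame behavior; the cost of the naive version grows linearly with the number of sampled frames, which is exactly the efficiency concern raised above.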


The efficiency of (V)RAM usage is so much better now; it's a pleasure to work with these new features. Thanks, devs.

Some examples:

(Ace Dragon) #527

Microdisplacement now updates in real time when changed (in the rendered view).

It's technically categorized as a bugfix, but it lifts a highly noticeable limitation and will greatly speed up iterating toward the desired result.

(RevDr) #528

Does anyone know how to put in a feature request relating to microdisplacement? I have discovered something which seems to me not so much a bug as a limitation (or perhaps I am doing something wrong or lacking in knowledge).

I have a scene where I want to use a material override on a specific layer. This is usually fine.

However, I want to retain my microdisplacement for this layer, and the displacement differs between objects.

Therein lies the problem: if I override the shader, my “data layer” no longer matches my main render for compositing.

In my mind, the layer material override would ideally offer a choice of which material input to override: surface, volume, or displacement.

As I have relatively few objects in my scene, I am getting around this by creating data materials and assigning them to the different objects. However, it would not take too many objects for this to become a real pain.

So, assuming it is a limitation and not a bug: does anyone know how I can put in a feature request?


(erickBlender) #529

> Does anyone know how to put a feature request in relating to microdisplacement?

For feature requests, check the link below :slight_smile:

(RevDr) #530

Thank you.

(Bart Veldhuizen) #531

A post was split to a new topic: Need help with Vector Displacement