Micropolygon displacement cycles?

Any plan to implement this any time soon? It would be a nice feature.

EDIT: Oops, I should have posted this in Blender & CG Discussion.

Yes, we need this feature!

It is on the roadmap, along with vector displacement handling, which will be a potent combination in making Cycles a production-ready renderer. As for when it will be implemented, that’s anyone’s guess. I do hope it’s sooner rather than later, however.

I think Cycles will need a lot of optimization before it can handle meshes subdivided down to pixel size. Yes, adaptive displacement is good: you don’t pay for displacement on objects in the background. But when you have displacement on a close-up model, you still end up with a lot of polygons.

A raytracer needs to be able to render millions of polygons to do displacement well.

OK endi… we get it… some people have a different workflow…
I would like to see it as well, but it will be some time…

Some reading for you… the roadmap is a bit old now, as far as Cycles development is concerned, but still worthwhile to check out.

Okay thanks, I think that answers the question.

Umm, Cycles already has micropolygon displacement, but it is experimental and buggy:
in the Render properties, change Supported to Experimental,
then select your object and go into its data properties,
change its displacement method to Both,
turn on Use Subdivision and set the dicing rate to 0.01,
then plug your height map into the Displacement socket.

That object is just a cube.

That isn’t micropolygon displacement, and in fact isn’t even really different from normal displacement except that it doesn’t show up in the viewport. Micropoly displacement is an algorithm that subdivides in an adaptive manner to automatically create necessary geometry based on a displacement map. The dicing algorithm in place now really isn’t much different from setting the ‘Render’ number higher in the SubD modifier.

How about reading a book? I know what micropolygons are, and what you describe is not what Cycles does.


Tech talk aside, what would be the speed impact of REYES-like micro-displacement with Cycles’ current path-tracer approach?

Then how about you tell us what you think Cycles’ dicing does? Because from what I’ve seen and read, every tenth you subtract from one takes the place of one subdivision at render time when building the BVH. If it does something different, I’m all ears.
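The claim above can be written out as a tiny formula. This is purely an interpretation of what the poster describes, not the actual Cycles source:

```python
def subdiv_levels_from_dicing_rate(rate):
    """Hypothetical model of the behavior described above: each 0.1
    subtracted from a dicing rate of 1.0 corresponds to one extra
    subdivision level applied at render time when the BVH is built.
    This mirrors the forum claim, not Cycles' real dicing code."""
    return max(0, round((1.0 - rate) / 0.1))
```

Under that reading, a dicing rate of 0.5 would simply mean five uniform subdivision levels at render time, with nothing adaptive about it.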

Does it do adaptive subdivision based on camera distance?
From what I see, the implementation attempts to use DiagSplit, but I’m not sure it’s actually being used.

The wiki states that it is not currently camera-based or in any way adaptive, but that the plan is to use adaptive Gregory patches or something similar in the future.

An adaptive, camera-distance-based approach would be very welcome.
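For reference, the camera-adaptive idea being discussed is classic REYES-style screen-space dicing: project an edge into pixels and split it until each piece covers roughly one pixel. A minimal sketch under a simple pinhole-camera assumption (names and parameters are illustrative, not Cycles or DiagSplit code):

```python
import math

def dice_count(edge_len_world, dist_to_camera, focal_len_px, target_px=1.0):
    """Number of segments to split an edge into so each segment covers
    roughly `target_px` pixels on screen. With a pinhole camera the
    apparent size shrinks linearly with distance, so distant geometry
    automatically receives fewer subdivisions."""
    edge_len_px = edge_len_world * focal_len_px / dist_to_camera
    return max(1, math.ceil(edge_len_px / target_px))
```

For example, a 1-unit edge seen through a 1000 px focal length would be split into 100 segments at distance 10, but left as a single segment far in the background, which is exactly the “no displacement cost for objects in the back” behavior mentioned earlier in the thread.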

What is in Cycles is very crude, but I think it’s something like this: http://www.cs.utah.edu/~bes/papers/height/paper.html which LuxRender uses: http://www.luxrender.net/wiki/Microdisplacement. Granted, it’s not ‘true’ micropolygons, but it’s pretty darn close.

This is already a pretty good result. As long as it does not impact BVH build time and RAM consumption, it will be cool.

In the beginning it will take a lot of time for the BVH and a lot of RAM. Then the Blender developers will find a way to optimize it.

I think in V-Ray the mesh is only subdivided within the bucket currently being rendered. Is this true?

The unfinished code in Cycles is a bit like REYES, subdividing polygons adaptively according to camera distance. This will have an impact on BVH build time and RAM usage.
The code in LuxRender, on the other hand, subdivides the polygons on-the-fly during intersection. This has no RAM/BVH overhead, but it is slower to render.
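The on-the-fly approach can be illustrated with a heavily simplified 2-D sketch: instead of pre-tessellating geometry and storing it in a BVH, the displaced surface is evaluated per ray during intersection. The code below is only illustrative of that trade-off, not LuxRender’s actual implementation:

```python
def intersect_heightfield(ox, oz, dx, dz, height, t_max=10.0, steps=200):
    """Find the first t where the 2-D ray (ox, oz) + t*(dx, dz) drops
    below a height function, by marching and then bisecting.
    No acceleration structure is built, so there is no RAM/BVH
    overhead, but every single ray pays for the evaluation work."""
    prev_t, prev_above = 0.0, oz > height(ox)
    for i in range(1, steps + 1):
        t = t_max * i / steps
        above = oz + t * dz > height(ox + t * dx)
        if prev_above and not above:
            lo, hi = prev_t, t          # crossing found: refine by bisection
            for _ in range(40):
                mid = 0.5 * (lo + hi)
                if oz + mid * dz > height(ox + mid * dx):
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)
        prev_t, prev_above = t, above
    return None                          # ray missed the surface
```

For instance, a ray starting at height 1 and falling at 45° over flat ground hits at t = 1; the cost is all per-ray arithmetic, with zero memory spent on precomputed geometry.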

Why not parallax mapping instead?
It’s a very quick and dirty way to add detail, like a supercharged normal mapping.

Otherwise volume rendering will work much better.
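For context, the “quick and dirty” technique being suggested boils down to shifting the texture lookup along the view direction by the sampled height. A minimal sketch of classic single-step parallax mapping (the function name and scale factor are illustrative):

```python
def parallax_uv(u, v, view_dir, height, scale=0.05):
    """Single-step parallax mapping: offset the UV coordinates along
    the tangent-space view direction, proportional to the sampled
    height. view_dir = (x, y, z) with z pointing away from the
    surface. Note the 1/z term blows up as the view grazes the
    surface, which is where the steep-angle artifacts come from."""
    vx, vy, vz = view_dir
    return (u + height * scale * vx / vz,
            v + height * scale * vy / vz)
```

Viewed head-on (view direction straight along z) the offset vanishes, while at grazing angles the UV shift grows without bound, distorting the texture badly.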

Parallax Mapping has terrible artifacts around seams and steep angles, and the quality is very bad in general. Apart from that, you can’t really use it to compute the radiance inside the parallax volume. Any volumetric solution that does this is likely to be slower than the micropolygon variant.