Parallax mapping for Blender Render / Cycles?

Yes, I know this is normally a feature found in game engines, and I think I've already seen people get it working in the BGE. I'm interested in whether this technique can also be used in Blender Render out of the box. The reason is that, from what I know, it can help a lot with small details that would take far too many polygons to model as part of a mesh. Games are usually the ones that benefit from it, but renders and animations could use the feature just as much.

If anyone's not familiar with this: parallax mapping (or displacement mapping) is similar to bump mapping and normal mapping, but it also has the ability to offset the surface. A bump map (already easy to use in Blender) can make lighting seem brighter or darker based on a texture and the angle of the light, basically using luminosity to simulate a depth effect for little details. Parallax mapping does this as well, but takes it a step further by visually offsetting the surface without adding extra vertices, which allows small details to be 3D-fied. So if you have a texture of a rock made up of little pebbles, a parallax map would let you see each tiny stone in 3D when looking from up close at any angle, just as if it were a million-polygon mesh. Comparison between bump maps and displacement maps:

http://i48.tinypic.com/10nt10n.jpg

http://i46.tinypic.com/33v2rtz.jpg
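To make the idea concrete, here is a rough sketch of the classic parallax lookup, written in OSL only because that's the shading language headed for Cycles (mentioned later in this thread). The shader name, texture path and scale value are made up for illustration; this is not a built-in Blender feature or anything taken from the linked examples.

```
// Minimal sketch of classic parallax mapping (assumed names and values).
// The height texture is sampled, and the UV lookup is shifted along the
// view direction in tangent space, so no extra vertices are ever created.
shader parallax_offset(
    string heightmap = "pebbles_height.png",  // hypothetical height texture
    float scale = 0.05,                       // hypothetical depth scale
    output float OutU = 0.0,
    output float OutV = 0.0)
{
    // Build a tangent-space basis from the surface derivatives.
    vector T = normalize(dPdu);
    vector B = normalize(cross(N, T));
    // -I points from the shading point toward the viewer.
    vector Vt = normalize(vector(dot(-I, T), dot(-I, B), dot(-I, N)));

    // Height at the original UV (white = high, black = low).
    float h = texture(heightmap, u, v);

    // Shift the lookup along the view direction; the division by Vt[2]
    // exaggerates the shift at grazing angles.
    OutU = u + Vt[0] * h * scale / Vt[2];
    OutV = v + Vt[1] * h * scale / Vt[2];
}
```

OutU/OutV would then feed the UVs of the actual colour and normal textures; the mesh itself, even a 4-vertex plane, is never touched.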

Someone already seems to have achieved this in Blender, but it might be the game engine and not the renderer.

Is this possible in Blender Render and / or Cycles with either nodes or material / texture settings? If not, do the developers plan to implement it, and does anyone else but me want them to? Again, I know this is commonly used in games, but I believe renders could use it just as much for top quality and detail… I’d totally use it for mine.

Attachments


OK, I HALF found the answer to this question. In the Texture tab, under Influence, somewhere below Normal you can enable the Displacement option. However, this is not the kind of parallax mapping I'm looking for. What it does is offset existing vertices during rendering. Real parallax displaces the surface without adding extra vertices (IIRC it's done by the GPU); otherwise the whole point of avoiding a million vertices and not losing performance is missed. Technically, you should be able to use it on the default flat plane of only 4 vertices, with the detail depending only on the displacement texture itself.

Here's another video of someone who got this working in the 3D view with nodes. He doesn't explain the node setup, however.

It is built right into the RenderMan specification. You can actually render this kind of effect in 3Delight, Aqsis or Pixie. The problem with Blender's internal implementation is that to get decent displacement you have to have a dense mesh to displace, whereas the RenderMan approach is a material/shader approach that displaces the mesh at render time (so you can't see the displacement in the viewport).
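For contrast with the per-pixel offset sketch above, this is roughly what render-time geometry displacement looks like as a shader. I'm sketching it in OSL rather than the RenderMan shading language just to keep one language in the thread; the names and values are made up, and whether a particular renderer actually honors moving P (and how finely it dices the mesh) is exactly the limitation being discussed here.

```
// Sketch of render-time geometry displacement (assumed names and values).
// Here the surface point itself is moved, so the renderer must provide
// enough geometric resolution (a dense mesh or micropolygon dicing).
displacement simple_displace(
    string heightmap = "pebbles_height.png",  // hypothetical height texture
    float scale = 0.1)                        // hypothetical displacement amount
{
    float h = texture(heightmap, u, v);
    P += normalize(N) * h * scale;  // push the point along the normal
    N = calculatenormal(P);         // recompute the normal after moving P
}
```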

Yes. Like I said, correct displacement mapping doesn't need extra vertices on the mesh, and the mesh can be a flat plane (4 verts). Each pixel of the displacement texture defines the resolution, IIRC. This shouldn't create extra vertices at render time either; the actual mesh stays unaffected, as parallax is only an effect (in real time I think the GPU does it). If it subdivided the mesh, rendering would still take a long time, and again you'd be better off doing it manually with a huge-poly mesh.

I love this technique. I have also been waiting for it for a long time, but nothing yet…

Also, for those who say that parallax mapping is not good enough, check out “Steep Parallax Occlusion Mapping”:
https://www.google.gr/search?sugexp=chrome,mod%3D17&q=steep+occlusion+mapping&um=1&ie=UTF-8&hl=el&tbm=isch&source=og&sa=N&tab=wi&ei=a0zVT-P1EfKa1AX8mdGWBA&biw=1123&bih=908&sei=bUzVT-2wOejW0QW1oamSBA
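For anyone unfamiliar with it, the "steep" variant basically ray-marches through the height field instead of doing a single offset, so tall features occlude each other correctly. A rough, untested OSL-style sketch (all names and values are placeholders):

```
// Rough sketch of steep parallax (ray-marched) mapping; placeholder
// names and values, not a tested shader.
shader steep_parallax(
    string heightmap = "pebbles_height.png",  // hypothetical height texture, 1 = top
    float depth = 0.05,                       // hypothetical depth of the effect
    int steps = 32,                           // number of march steps
    output float OutU = 0.0,
    output float OutV = 0.0)
{
    // Tangent-space view direction (-I points toward the viewer).
    vector T = normalize(dPdu);
    vector B = normalize(cross(N, T));
    vector Vt = normalize(vector(dot(-I, T), dot(-I, B), dot(-I, N)));

    // Walk down into the height field until the view ray dips below it.
    float layer = 1.0;              // current ray height, 1 = top surface
    float layer_step = 1.0 / steps;
    vector duv = vector(Vt[0], Vt[1], 0.0) * depth / (Vt[2] * steps);

    float cu = u;
    float cv = v;
    float h = texture(heightmap, cu, cv);
    int i = 0;
    while (layer > h && i < steps) {
        cu -= duv[0];
        cv -= duv[1];
        h = texture(heightmap, cu, cv);
        layer -= layer_step;
        i += 1;
    }

    OutU = cu;
    OutV = cv;
}
```

Feeding OutU/OutV into the UVs of the real textures gives the occlusion effect; refining between the last two steps with a binary search is what usually gets called parallax occlusion mapping.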

As far as I know these things are possible:

  1. Wait for the Voxel BSDF
    Some people complain that it's extremely slow, but it's a possible solution and it gives amazing results. I saw a sponge object rendered in Cycles with all these thousands of pores, and it was great.

  2. There's work going on for displacement mapping support, but it's still experimental
    http://wiki.blender.org/index.php/Doc:2.6/Manual/Render/Cycles/Materials/Displacement
    We'll also have to consider the Cycles roadmap to get a picture of things to come
    http://wiki.blender.org/index.php/Dev:2.6/Source/Render/Cycles/ToDo

  3. Wait for the Open Shading Language implementation
    Things are going to get wild after this; imagine what crazy shaders people could write. Every piece of eye candy you see at SIGGRAPH could be brought to Blender. I'm not into the technical details, but as far as I understand it, it's a way of writing something like a GLSL shader, but for Cycles instead of OpenGL (see the small sketch just after this list) http://www.foreverblender.com/2012/06/status-open-shading-language.html
    P.S. Correct me if I am wrong here.
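For anyone wondering what that looks like in practice, an OSL shader for Cycles would be plain text along these lines. This is a trivial made-up example just to show the flavor of the language, nothing parallax-specific:

```
// A trivial OSL shader, only to show what the language looks like.
// Cycles plugs the closure output into its physically based pipeline.
shader simple_tinted_diffuse(
    color Tint = color(0.8, 0.8, 0.8),
    output closure color BSDF = 0)
{
    BSDF = Tint * diffuse(N);
}
```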

If you have more information on this subject then please say something.

Thanks for the info. Option 2 sounds like the best one, since I can see it's already on the list at least (and partly in progress). I hope it's still being worked on. Voxels are something I'm separately hoping for too, but for displacement I'm wishing for a correct implementation. And sure, if I find any news I'll post it here, unless I forget about this thread :stuck_out_tongue:

PS: I see that link is the Cycles TODO list. Won't this exist for Blender Render too?

The internal renderer has been around since the beginning of Blender, and over all these years people have made it almost perfect. It's now very stable and mature, and it works fine for most rendering requirements.

But when it comes to adding new features and having room for changes and improvements, Cycles seems like the more logical approach, because it can be shaped easily, while breaking the existing internal renderer seems like a pointless tactic.

P.S. In the long run, I get the feeling that the internal renderer will be silently deprecated, because Cycles will turn out to be more attractive.

I still use Blender Render… I hope it will never be deprecated :confused: Personally, I wish Cycles hadn't been a separate engine, and that the internal renderer had gotten its features instead, so things wouldn't have been so complex and picky. Still, I hope this feature can somehow be added to both render engines… I don't think it would harm or break anything.

Cycles could never have been added to BI. They tried. It was called the render25 branch. It crashed and burned, hard. BI is beyond help as far as extensibility to support modern rendering goes.

I understand. My biggest issue with Cycles being separate is that it doesn't work with existing renders and architecture. If you want to render something you created many years ago in BI under Cycles, you have to manually redo all the materials / textures and possibly the lights. I heard there's a script to convert materials to Cycles, but it isn't even included with Blender by default. I wish at least that could have been helped, so that changing the scene to render a BI project in Cycles would be optional (and only needed to enable new Cycles-specific features). I'm still hoping this might be fixed somehow, so that the choice you have to make when creating your scene matters less.

There’s really no way to auto-convert a BI material into a Cycles one. The two have completely different material fundamentals, and more often than not a setting in BI doesn’t even have a counterpart in Cycles. It’s just the nature of the beast. It’s completely unreasonable to think that you should be able to take a Lambert/Phong-based system like BI and have it work in a Physically-based BSDF system like Cycles. It would be like asking why you can’t automatically convert a 2D image into a 3D flythrough.

I know it wouldn't work the same way, since Cycles and BI are very different. The main thing I wish worked between the two is at least simple texturing: if you make a mesh and give it a material with one diffuse texture, at least have that texture stick in both engines. But I'm not good with Cycles yet, and others know better why this happens. It still makes things a bit hard, especially if BI might eventually get deprecated (and old blends would no longer be compatible a few years from now… ouch).

Hi all, yes it's possible, but the result is not very clean. I found a file made by zelouille in a BlenderClan thread named “parallax mapping with offset limiting”, done with nodes. The shader is a bit complex but works very well in GLSL mode since 2.5, because you can apply a normal map as an input to the material node.
I modified the file to add silhouette clipping and adapted it for Cycles, but I've got a problem with my tangent object in view space. If you want more information, type "parallax node blender" into a Google image search; I think the first and second images will interest you.
PS: sorry for my English.
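In case it helps anyone searching later: "offset limiting" simply means the UV shift is not divided by the z component of the tangent-space view vector, so it stays bounded at grazing angles. A rough OSL sketch of that idea (placeholder names again, and not the actual node setup from that thread):

```
// Sketch of parallax mapping with offset limiting (assumed names and values).
shader parallax_offset_limited(
    string heightmap = "pebbles_height.png",  // hypothetical height texture
    float scale = 0.05,                       // hypothetical depth scale
    output float OutU = 0.0,
    output float OutV = 0.0)
{
    vector T = normalize(dPdu);
    vector B = normalize(cross(N, T));
    vector Vt = normalize(vector(dot(-I, T), dot(-I, B), dot(-I, N)));

    float h = texture(heightmap, u, v);

    // No division by Vt[2]: the offset is limited, which avoids the
    // smearing that classic parallax shows at shallow viewing angles.
    OutU = u + Vt[0] * h * scale;
    OutV = v + Vt[1] * h * scale;
}
```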

Attachments





Sorry for the late reply. Those look very good, just what I was hoping for! Sucks that the node setup is so complex, but thankfully node groups exist for this. Can you post the Blend file as well please?

Any more news on this? I'm still looking for a blend file with that shader setup, or a screenshot or instructions on how it's done.

Still looking for a solution to this, and hoping it might get added if it’s not already. I could really use displacement mapping in most of my works, and feel rather limited without this feature. I heard it might already exist in Cycles (curious about that too), but any news for BR?

http://www.blendpolis.de/viewtopic.php?f=14&t=42940#p465811
Found!


I finally found the solution for node-less parallax mapping in Blender Render. Yes, such a thing exists. The answer was in front of me all along, and it’s something even a beginner should know.

Under the Textures panel, in the Influence section, you have the Geometry group, whose Normal value is used to enable plain bump mapping. Under that there is another value called Warp, which uses this texture to distort the textures that come after it in the stack. I only discovered it some weeks ago and have been using it without realizing what it actually is. Today I suddenly realized that it IS parallax. Perhaps a slightly different kind of displacement, but close enough still.

Unfortunately, it only works in Blender Render, not in the game engine. I have no idea why Textured view in GLSL mode doesn't handle it as well, like it does with bump mapping. Anyway, this answers my question for the renderer at least… for the BGE I can use the node setup posted here previously.