I saw in the 3ds max 9 demo videos they have a new mental ray shader that can ‘fake’ bevelled edges by altering the normals like a bump map. It’s hard to make out much in the video, but it sounds like a very clever idea. Demo is in the ‘mental ray improvements’ video here: http://usa.autodesk.com/adsk/servlet/item?siteID=123112&id=7684178
Not easily. I've already tried this, in a lot of different ways. I needed it to help with edge detection for a toon shader, because sharp edges are difficult to pick up when the surface bends away from the camera. I figured out a decent way to do it, probably the same method as the normals adjustment. Basically, I exported the normals from Blender and then compared the Phong-interpolated normals to the standard surface normals to detect the edge position (Renderman deals with patches, you see, and you don't want to bevel internal edges). Then I just used a condition to decide which edges to bevel and by how much.
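In shader terms this amounts to comparing the interpolated shading normal with the true geometric normal at each shaded point (N versus Ng in RSL). Here's a minimal Python sketch of that test; the function names and the threshold value are my own assumptions, not anything from the shader itself:

```python
import math

def angle_between(a, b):
    """Angle in radians between two unit vectors (3-tuples)."""
    dot = sum(x * y for x, y in zip(a, b))
    return math.acos(max(-1.0, min(1.0, dot)))

def near_sharp_edge(n_smooth, n_geom, threshold=0.3):
    """Flag a shading point as near a sharp edge when the Phong-
    interpolated normal diverges from the geometric normal. Near a
    sharp crease the interpolated normal bends towards the neighbouring
    face, so the angle grows; on flat interiors the two normals agree
    and the angle stays near zero. The threshold is a made-up value."""
    return angle_between(n_smooth, n_geom) > threshold

# Flat interior: smooth and geometric normals agree, so no edge.
print(near_sharp_edge((0, 0, 1), (0, 0, 1)))          # False
# Near a 90-degree crease the interpolated normal is tilted ~45 degrees.
s = math.sqrt(0.5)
print(near_sharp_edge((s, 0, s), (0, 0, 1)))          # True
```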
These are all subdiv cubes creased by varying amounts (0, 0.25, 0.5, 0.75, 1) rendered in 3delight with an auto-bevel displacement shader:
The trouble is it gets artifacts on unclean geometry, because sometimes the normal variation isn't enough to detect the edge properly. This is actually a limitation of REYES: the shader only gets access to the point currently being shaded, so it knows nothing about adjacent points. The surface normal variation was about the only useful information I had access to.
Blender would be able to do this much more easily because it always knows where the edges are. I think an auto-bevel tool would be great. I’m not so keen on the idea of faked normals for high quality renders because you are limited to micro-bevels but until Blender gets true displacements, it might have to do. Of course, if it could bake a displacement map so that you could render in say Renderman then that might work ok but you’ve got to worry about texture stretching and things.
With Renderman, it’s actually not that much different. With micro-polygons, displacements don’t really take that much longer to render than normals adjustment other than when you have complicated lighting because naturally, it has to take into account the new geometry and generate complex shadows. But if you have soft shadow maps, Renderman by default ignores the displacements unless you specify a true displacement option. Of course the same certainly isn’t true about Blender’s displacements.
I feel your pain, but as they say, a tool is just a tool; the outcome depends on how well you know how to use it.
I didn’t finish the “bevel” shader, as it’s not my No. 1 priority at the moment, but I can show you a quick result which proves it’s pretty simple to do correctly.
[Hint: look at Soccerball displacement and surface shaders. This might not be a standard way of doing things, but you get very precise results. Since you know the vert locations of your object, you can include it right into your shader… and the rest is history].
The shader calculates the distance from the edge line (Pi, Pj), and if it’s bigger than 0.2 it extrudes the face up. I haven’t written the smooth transition yet, so I left it transparent.
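For what it's worth, that distance test can be sketched in Python. The 0.2 threshold comes from the post; the function names and the extrusion height are my own placeholders:

```python
def dist_point_to_segment(p, a, b):
    """Distance from point p to the line segment (a, b); all 3-tuples."""
    ab = tuple(bi - ai for ai, bi in zip(a, b))
    ap = tuple(pi - ai for ai, pi in zip(a, p))
    ab2 = sum(x * x for x in ab)
    # Parameter of p's projection onto the segment, clamped to [0, 1].
    t = max(0.0, min(1.0, sum(x * y for x, y in zip(ap, ab)) / ab2)) if ab2 else 0.0
    closest = tuple(ai + t * x for ai, x in zip(a, ab))
    return sum((pi - ci) ** 2 for pi, ci in zip(p, closest)) ** 0.5

def displace_amount(p, edge_a, edge_b, threshold=0.2, height=0.1):
    """Extrude the face only where the shading point is farther than
    `threshold` from the edge; `height` is a hypothetical amount."""
    return height if dist_point_to_segment(p, edge_a, edge_b) > threshold else 0.0

# Point 0.5 away from the edge gets extruded; a point 0.1 away does not.
print(displace_amount((0.5, 0.5, 0), (0, 0, 0), (1, 0, 0)))   # 0.1
print(displace_amount((0.5, 0.1, 0), (0, 0, 0), (1, 0, 0)))   # 0.0
```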
I don’t think that helps, because you need to know which edges are sharp. That means testing the discontinuity across the nearest edge, and Renderman SL doesn’t let you do that. A simple method would be to compare the normal direction on one side of the edge with the normal on the other side, because you need to know not only when to bevel an edge but in which direction: internal edges are bevelled inwards, as you can see in the second image I posted earlier. But you can’t calculate the normal anywhere except at the current point. REYES avoids storing all the scene geometry in order to save memory, so you don’t get access to it.
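If you did have access to the mesh (e.g. exported from Blender rather than queried at shade time), that side-by-side comparison could look something like the following Python sketch. The names and the centroid-based test are my own; this is exactly the adjacency information REYES withholds from the shader:

```python
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def edge_kind(n1, c1, n2, c2):
    """Classify the edge shared by two faces given their unit normals
    (n1, n2) and centroids (c1, c2). If face 2's centroid lies below
    face 1's plane, the surfaces bend away from each other (convex:
    bevel outward); if it lies above, they bend towards each other
    (concave: an internal edge, bevelled inward)."""
    d = dot(n1, sub(c2, c1))
    if d < -1e-9:
        return "convex"
    if d > 1e-9:
        return "concave"
    return "flat"

# Outside corner of a cube: top face meets the +X side face.
print(edge_kind((0, 0, 1), (0.5, 0.5, 1), (1, 0, 0), (1, 0.5, 0.5)))   # convex
# Inside corner: a floor meeting a wall, bevelled inwards.
print(edge_kind((0, 0, 1), (0.5, 0.5, 0), (-1, 0, 0), (1, 0.5, 0.5)))  # concave
```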
You can get that result just by displacing the points along their normals. You also need to test on cubes that don’t have just one polygon per face, because that’s where you get problems with internal edges. One solution is to use a custom UV parameterization, but it’s a pain in the ass because it would be like UV mapping your model twice, and Blender doesn’t support multiple UV maps anyway.
Displacing the faces is not an option, because you are changing the shape/size of the model; any bevel transform should only affect the edges. You would also find it difficult to do the smooth transition that way. The samples I rendered above displace the points near each edge down towards a calculated spherical centre, based on their distance from the edge.
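A rough 2D cross-section of that "displace towards a spherical centre" idea, in illustrative Python; the quadrant setup and the radius are my own assumptions, not the shader's actual code:

```python
import math

def round_edge_2d(p, r=0.25):
    """Cross-section of a rounded bevel: the solid occupies the quadrant
    {x <= 0, y <= 0}, with a convex 90-degree edge at the origin. Points
    on either face within `r` of the edge are pulled down onto a
    quarter-cylinder of radius `r` centred at (-r, -r), which meets both
    faces tangentially, so the bevel blends smoothly into the surface."""
    cx, cy = -r, -r                       # centre of the fillet circle
    dx, dy = p[0] - cx, p[1] - cy
    d = math.hypot(dx, dy)
    if d <= r:                            # already inside the fillet
        return p
    if p[0] > -r + 1e-12 and p[1] > -r + 1e-12:
        # Within r of the edge: project onto the fillet circle.
        return (cx + r * dx / d, cy + r * dy / d)
    return p                              # far from the edge: untouched

# The tangent point on a face stays put, so the bevel joins smoothly:
print(round_edge_2d((0.0, -0.25)))        # (0.0, -0.25)
# The sharp corner itself is pulled in towards the centre:
x, y = round_edge_2d((0.0, 0.0))
print(round(x, 3), round(y, 3))           # -0.073 -0.073
```

The same construction extends to 3D by measuring distance along the edge's cross-section, which is presumably what "based on their distance from the edge" amounts to.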
I had limited success using vertex coloring to determine which edges to bevel, but Blender didn’t export the colors very well, so it got artifacts. Also, on faces with only one polygon it interpolated the color across the whole face, so you had to subdivide all those faces.