Does anyone else think Blender's texturing needs revamping?

I have recently been thinking about the methods Blender uses for its procedural textures, from the point of view of making realistic materials, and it seems there is room for improvement.

My thoughts are these:

For general use (noobies), it would be easier to have a set of preset materials to choose from. This would allow more complex and realistic materials to be used straight off, as they would have been created by A. N. Other, rather than by tweaking the specular and the colour of the default material.

All materials should really use a node type system.

The node system should allow any meaningful variable in the procedural texture to be modified by the output of another texture. Thus you can cause the wood rings in the wood texture to move aside to allow a ‘knot’ from a voronoi texture to poke through, or a cloud texture to vary the depth of noise in another cloud texture.
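To make that concrete, here is a toy sketch in plain Python (not Blender's actual API; knot_mask and wood_rings are made-up stand-ins) of what I mean by one texture's output driving another texture's parameters:

```python
import math

def knot_mask(x, y, cx=0.5, cy=0.5, radius=0.15):
    # Hypothetical voronoi-style "knot": 1 at the centre, falling off to 0.
    d = math.hypot(x - cx, y - cy)
    return max(0.0, 1.0 - d / radius)

def wood_rings(x, y, warp=0.0):
    # Hypothetical ring pattern; 'warp' pushes the rings outward.
    d = math.hypot(x, y) + warp
    return 0.5 + 0.5 * math.sin(d * 40.0)

def combined(x, y):
    # The knot mask feeds a *parameter* of the wood texture, so the rings
    # bend aside around the knot instead of just being layered on top of it.
    return wood_rings(x, y, warp=0.1 * knot_mask(x, y))

print(combined(0.45, 0.5))
```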

There needs to be the option to vary the probability of a texture feature to occur. There also needs to be the possibility of mapping an imported texture, such as a photo or painting, or another procedural texture, into each feature of another texture.

What got me thinking is that photo-realistic texturing of an everyday object is extremely hard. Take a table. What makes it seem real is the day-to-day wear and tear: scratches, in other words. However, if we had a table of 1*2 metres, and we wanted to paint an image for the scratches, assuming they are 0.1 mm wide, we would need a minimum image size of 10,000 by 20,000 pixels. Given that we might want scratches at angles, to avoid jaggies, we would need ten times this. So, it is totally impractical to use a painted map.
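Just to spell the arithmetic out (assuming one texel per 0.1 mm feature and a tenfold margin for angled scratches):

```python
# Rough texel budget for a 1 m x 2 m table top with 0.1 mm scratches.
table_w_mm, table_h_mm = 1000, 2000   # 1 m x 2 m
feature_mm = 0.1                      # narrowest scratch we want to resolve
aa_factor = 10                        # extra margin for angled scratches

w = round(table_w_mm / feature_mm)    # 10,000 px
h = round(table_h_mm / feature_mm)    # 20,000 px
print(w, "x", h, "=", f"{w * h:,}", "pixels")                 # 200,000,000 pixels
print(w * aa_factor, "x", h * aa_factor, "with the margin")   # 100,000 x 200,000
```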

We could use a small map and repeat it, but then we would get a tiled effect.

So, what we need to be able to do is take a small image of a scratch (or a procedural that mimics a scratch), and drop it at random all over the object. (It would be better to use a procedural if possible, as then we wouldn’t have to UV map around edges etc.) Likewise, we could create a texture of a ‘pit’ of rust, or a ‘flake’ of paint, and repeat these at random. However, we don’t want to just scatter them equally all over and then fade them out where they are not supposed to be. Paint is either flaked or it isn’t; it isn’t ‘flaked flatter’, and rust isn’t ‘pitted smoother’. It either is or it is not. Variations in depth do occur, but not necessarily related to where the effect is. Thus we need to vary the probability that a pit, or a flake, or whatever, will exist, dependent on the underlying texture. Paint tends to flake with cracks along the ridges of the wood grain, for example. Rust starts with a few spots, and then they merge into one another…
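A minimal sketch of what I mean (plain Python, all names made up): the underlying texture only shifts the probability that a flake exists at a point; the flake itself is binary, never faded.

```python
import math
import random

def grain_ridge(x, y):
    # Stand-in for the underlying texture, e.g. wood-grain ridges, in [0, 1].
    return 0.5 + 0.5 * math.sin(x * 25.0)

def flake_exists(x, y, base_probability=0.05):
    # A paint flake either exists or it doesn't; the underlying texture only
    # changes how *likely* it is, never its depth.
    p = base_probability + 0.4 * grain_ridge(x, y)  # more flaking along ridges
    return random.random() < p

# Scatter candidate flake positions over the surface, keep the ones that exist.
random.seed(0)
candidates = [(random.random(), random.random()) for _ in range(1000)]
flakes = [pos for pos in candidates if flake_exists(*pos)]
print(len(flakes), "flakes placed")
```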

Another possibility would be micro-distortion of the surface, so that textures can alter the underlying mesh without it having to be sub-divided. This would allow a brick texture to make the edges of the object truly rough. This is another order of magnitude more complex, of course.

I’m not going into the details of implementation here, just trying to get an idea of what others think, and other ways this could be added to, before thinking about putting together a proposal to the dev team.

Matt

I’ve had the same setbacks in Blender for a while now. I cannot, however, think of a good system to implement a more procedural texturing idea like you’ve proposed here. It may already be possible to do this with material nodes, but material nodes have proven to be very slow when rendering, and this is noted in the wiki. It may be possible to do this with texture nodes already too… but unfortunately the texture nodes don’t get passed through any sampling, so they render very noisy at distances.

I think what you might be proposing is comparable to Softimage’s material node system. I like that system and I believe it can be done in Blender fairly easily, but slow rendering and bad sampling can’t be overlooked this time like they have in the past.

Some weathering techniques can be done already with many UV layers, which control the placement of stencil textures (containing alpha, mapped with “clip” on)… this is a lot of work for a single object though. Perhaps the ability to extract the curvature and sharpness of geometry and output it through material nodes would work for these stencils. I tried to replicate this using math and a few techniques I’ve found around the internet, but without access to the local object normals it’s impossible.

One procedural technique I’ve come across is to take vertex colors, paint weathered areas white, all others black. In mat nodes, multiply a clouds noise over the vertex colors, round the output with math, and use that as a stencil for rust. It works fairly well, but again, it uses nodes, so there is an impact on render performance.
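Roughly what that node chain computes per pixel, written out as plain Python/numpy (the arrays are just stand-ins for the painted vertex colours and the clouds texture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins: vertex colours painted white (1) in weathered areas, black (0)
# elsewhere, and a clouds-style noise, both sampled to the same resolution.
vertex_col = (rng.random((64, 64)) > 0.6).astype(float)
clouds = rng.random((64, 64))

# Multiply the noise over the vertex colours, then round to 0/1:
# the result is a hard stencil, so rust is either there or it isn't.
stencil = np.round(clouds * vertex_col)
print(stencil.mean(), "fraction of pixels rusted")
```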

Cheers,
John

Preset materials/textures make for a very boring newbie gallery :p
On a more serious note, having to tweak the materials yourself is actually a pretty good learning experience. Besides, there are pretty good materials available for download if you want something specific. A quickly accessible asset library would of course be a desirable feature.

Blender already has a node based material system, and if I’ve understood it correctly, the goal is to gradually move from the old material editor into the new system…

The features you mention are of course pretty interesting, but it is, as always, a question of priority, where some things are more urgent than others. Still, maybe someone will be interested in coding something like this (didn’t someone already have some kind of prototype for a cavity shader?).

If blurring is available from the material node editor, then I’ve seen a technique that simply blurs the object’s normals, and does a dot product with the original normals to produce a valid rust shader. It’s not a cavity shader… that would require calculating the mean curvature of the surface I think.
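For what it’s worth, here is roughly the maths behind that blurred-normals trick, as a plain numpy sketch (the normal map is random stand-in data, and box_blur is my own crude filter, not anything from Blender):

```python
import numpy as np

def box_blur(a, k=5):
    # Crude box blur along both image axes (wrap-around edges, for brevity).
    out = a.copy()
    for axis in (0, 1):
        shifted = [np.roll(out, s, axis=axis) for s in range(-(k // 2), k // 2 + 1)]
        out = np.stack(shifted).mean(axis=0)
    return out

rng = np.random.default_rng(0)

# Stand-in normal map (H x W x 3), unit length per pixel.
normals = rng.normal(size=(64, 64, 3))
normals /= np.linalg.norm(normals, axis=-1, keepdims=True)

blurred = box_blur(normals)
blurred /= np.linalg.norm(blurred, axis=-1, keepdims=True)

# Where the original and blurred normals disagree (edges, crevices) the dot
# product drops below 1; that divergence becomes the rust mask.
rust = 1.0 - np.clip((normals * blurred).sum(axis=-1), 0.0, 1.0)
print(rust.min(), rust.max())
```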

Sorry if this is a bit off topic, the OP is talking about realism and weathering which is what I’ve tried to seek with Blender for a while now.

Cheers,
John

I certainly agree there is a great need to have material presets in place.

It would significantly speed up the learning curve for new users (nobody likes to have their renders suck for 3 months).

Since Blender is about efficiency and speed, this would be very valuable even for advanced users. Blender is HUGE!!! If we had just a few basic presets (smoke, glass, clay, plastic, skin… etc…), the time saved could be put towards other things. There’s certainly no shortage of things to learn in Blender.

This doesn’t mean people won’t learn how to “roll their own” materials; it would actually guide them when making more advanced materials.

1 - There is already a project to make a library of predefined materials for 2.5, but it has been delayed, I guess, and I don’t know when it will be integrated into 2.5.

It should have something like 1000 different materials plus some textures.

2 - There is, in the render branch, a new type of texture called micro-displacement, which should be included when we get the render branch integrated, but when that will happen I don’t know!

3 - One problem which has been raised many times for procedural textures is how you can map them onto objects.

Here I would suggest showing the object and the mapping in a sort of equivalent of the UV editor, but for procedural textures, where you could change the scale and location like we can in the UV editor for images, instead of using the values from the Map Input panel. Mind you, those values could still be there to work with when needed.

That would make it a lot easier for noobies, and everybody else too, to work with procedural textures.

4 - It might be good to have a few other new types of procedural textures; that would give more flexibility and choices.

So in general, like you said, there is still room for improvement in this field of texturing in Blender!

happy 2.5

It’s impossible to say no to improvement, and it’s impossible not to appreciate and be grateful for the revamp that 2.5 is for Blender in general. I still can’t believe how much the sculpting tool has improved. I am sure painting tools will be next.

Most of the things you propose are already implemented. There’s a node system for materials and textures. And even if you don’t use that, the texture stack allows you to combine multiple procedural textures. You can use one texture to blend or warp another, you can merge coarse detail with fine detail so that no matter how much you zoom in, the surface never looks blurred. And of course you can mix procedural textures with bitmap textures.

I guess what you are really looking for is a good tutorial explaining the texture stack with examples.

Zapperjet: yes, there is a speed tradeoff. Obviously there will always be SOME speed tradeoff, but there should be some speedups that can be applied. The problem I see with the existing node system is that it allows very little complexity. Many of the stack-manipulating tricks possible via the normal materials menus cannot be applied via the node system, as far as I could find. We need a better system than the present one.
Jorzi: the noobies already produce really boring materials. Surely it’s better to have at least realistic boring materials? :slight_smile:
I have certainly considered that there should be a way of ‘fading out’ a texture to an average as it recedes into the distance, to stop the ‘noise’ effect that appears in many textures once their features become sub-pixel.
Cominandburn: Agreed.
Rickyblender: I had heard rumblings about a material library. I did read something about the micro-displacement, which is what made me suggest it, though I understand it has been shelved, temporarily at least.
eye208: I know about layering textures, and about warping etc., but these are still limited, and previewing is not only limited but inaccurate, as scaling is not always taken into account.

What brought some of this to mind is the texturing system in Luxrender, which still leaves a lot to be desired when trying to weather something, but allows far more complexity to be built quickly.

It would be useful to have some feedback from people who have already tried to code any of this. I can code, but I haven’t done any Blender work.

Matt

IMHO we need a lot more node capability in texture nodes, e.g. blur, glare, etc.

One recent addition that seems to be overlooked: the speckle noise that drives you crazy when you animate over detailed image textures can be controlled using the filter size, under MipMap in the Image Sampling section of the texture panel. Unfortunately, you don’t have this option for generated textures, so their pixelation artifacts can’t be blurred out. Arrgghhhh!

It didn’t take me that long to find out how to get the materials I wanted, using Blender’s way… Yet, definitely, some presets would have helped (just like tutorials for other stuff still do). Things like glass, water, metals, convincing woods, and so on…

You might find this interesting.

http://graphics.cs.kuleuven.be/publications/LLDD09PNSGC/

You could solve your issues in a more proactive manner that does not involve developers who might be pressed for time:

  1. render engine (first license is free of charge, even for commercial work)
    http://www.3delight.com/en/

  2. exporter
    http://sourceforge.net/projects/ribmosaic/?showreview=everything

  3. reading material
    http://www.amazon.com/RenderMan-Shading-Language-Guide/dp/1598632868/ref=pd_sim_b_3
    http://www.amazon.com/Texturing-Modeling-Third-Procedural-Approach/dp/1558608486

With those resources you have the power to write your own custom RenderMan shaders and use them in a professional RenderMan-compliant renderer. You are more likely to get what you want if you take this approach than with a feature request thread; those get ignored 99% of the time.

I wasn’t really expecting the devs to write the code for the idea. I would either try personally, or get someone else to help. The point is to try to bring Blender’s texturing system up to a higher level, so that it can compete with some other rendering engines.

With all the improvements in Blender recently, as regards UI and tools, it would be a shame if the end result was let down by a lack of good enough textures. Over and over I seem to see people using Blender to model something and Lux or yafa or whatever to actually render. Partly because they support extra things like caustics, and partly because they have better texturing. Blender can do a lot, but it is very unintuitive and complex, and sometimes it seems like you still have to use the basic textures in place of nodes because nodes cannot do the same things.

Speckle noise should be easily removed by fading from the textured value to an average specular as a texture reaches a certain pattern density. (Perhaps ‘easily’ is not the word, as nothing in rendering maths is easy!)
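Something along these lines is what I have in mind (a hypothetical fade; the linear ramp and the names are purely illustrative):

```python
def fade_to_average(sample, average, texels_per_pixel):
    # Once a texture's features shrink below roughly one screen pixel, blend
    # from the sampled value towards the texture's average instead of letting
    # it shimmer. The ramp from 1 to 2 texels per pixel is arbitrary.
    t = min(max(texels_per_pixel - 1.0, 0.0), 1.0)  # 0 = keep detail, 1 = average
    return (1.0 - t) * sample + t * average

print(fade_to_average(0.9, 0.5, texels_per_pixel=0.5))  # close up: keep detail
print(fade_to_average(0.9, 0.5, texels_per_pixel=3.0))  # far away: use the average
```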

Matt

Dude, Lux and Yafaray’s material systems are far, far more limited than BI’s. They are nowhere near as flexible as Blender’s, and I think you would have a harder time trying to achieve what you want with them than with BI.

eye208 kind of has a point: most of what you want to do can already be done in some way, and if you want more control then I would say a RenderMan-compliant renderer with programmable shaders will get you further than Lux or Yafaray.

I have tried 3Delight, a few years ago, and I suppose that the other solutions have the same implications: you have to keep all your meshes free of any materials all through the process and create all the textures and materials within these renderers… Difficult to do when you create the whole thing by yourself, at least for me. I need to see the end result for every character or piece of décor during the creative process, in order to keep working on adjustments step by step… :eyebrowlift2:

Ty,

Strange idea, that the way of improving Blender would be to use another program instead. A bit like answering ‘drive a BMW’ when asked ‘how do we make the brakes on our Ford work better?’

Blender texturing is not easy. Even a half-realistic wood takes five or six layers of textures, goes really speckly as soon as it recedes into the distance, and cannot mimic separate planks without modelling them separately or adding separate materials with yet more settings. I have yet to see any photo-realistic output from Blender Internal that wasn’t essentially either a photo or a painted texture.

If you open a material in the material node editor, you cannot actually edit the material. Likewise with the texture editor. All you can do is connect them together; you still have to edit them the old-fashioned way. Tried it last night, and it’s still true in 2.54.

It should be possible to open a material in the node editor, change settings, drop down to the textures used, edit those settings, etc., all whilst seeing a preview that is scalable to the size of the object you are modelling, and that you can zoom into. Many of the plugins should be made available in the node editor as well, so you can break an object into checkerboards, or planks, and use the output of these to manipulate the texture space of the existing texture, rather than creating two textures and ‘mixing’ the two.

It should be possible in the node editor to use a colour, a normal, or an alpha to affect the stretching, normal depth, colour banding, or any other factor, of another texture or material.

Matt

This seems to be a recurring theme. Far too often, when someone makes constructive suggestions towards Blender, it’s met with “well, why don’t you use X solution?”

I’d hope that people would be more open to these ideas which ultimately just make Blender better.

Bring on the Ideas, there are never enough!

disclaimer
Blender developers retain the right to use these suggestions as they please

I was under the impression you wanted to get work done or use this in some way, and for this there are two paths to follow.

  1. You make a feature request, hope it catches someone’s attention, and sit and wait until it is coded or developed

or

  2. Use the other tools that are available to you and get the job done.

All that you posted in your first post you could do for yourself if you wrote your own RenderMan shaders.
There are export scripts for 2.49, shader-writing tools like Shrimp (or if that doesn’t work, Notepad will do), and free render engines like Pixie and Aqsis.

Blender -> MOSAIC for export -> Aqsis, Pixie or 3Delight, plus a couple of good books, and you are on your way. The alternative seems like a lot of waiting about when you could be having fun.

I haven’t tried 2.5x yet, so if my comment is redundant then please forgive me. The roundabout way, in 2.49, of applying a texture that you (or someone else) may have already created is a stumbling block for noobs. I say this speaking as a noob. While learning to make your own textures using Blender’s available tools will eventually make you more competent, I suspect that we lose new people who become frustrated by having their early efforts look worse than they might because of poor texturing.

A library of easily applied texture presets might help people to stay on board with Blender rather than going to, say, Bryce. I’m not comparing the two, although I would observe that Bryce has sold many copies and that Daz3D continues to thrive. It may not be art, but it probably does lead to it for some people.