Node materials for multiple render engines.

Hi all.

I was thinking the other day and wanted to ask some people cleverer than I am about a project I have been considering looking into.

Most often, different render engines have different material definitions, yet many of these engines define their materials using nodes.
Do you think it would be possible to create an interchange material format describing the basic node setup? It could define simple node keys like: diffuse, glossy, transparent, glass, metal, etc.

Then make a map with all the different node equivalents for each engine: glossy => [cycles:glossy, lux:glossy2, xyz:shiny, etc.]
(Note: they may be named differently; I haven’t checked yet.)
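
To illustrate, a minimal sketch of such a map as plain Python data (using the made-up names above):

NODE_MAP = {
    # Generic node keys mapped to each engine's node name
    # (all engine-specific names here are the invented ones from above).
    'glossy': {'cycles': 'glossy', 'lux': 'glossy2', 'xyz': 'shiny'},
    # 'diffuse': {...}, 'transparent': {...}, and so on
}

def native_node_name(generic_key, engine):
    return NODE_MAP[generic_key][engine]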

Then Blender, or any other app I guess, could take the interchange material, substitute the required nodes with its own equivalents, and set the color inputs etc.

I know it is more complicated than just that, but does anybody know whether this is theoretically possible, and what the main problems would be?

Roger over.

PS I originally wanted to call this thread: “Cross render engine material node noodles compatibility format” :smiley:

Perhaps this may be possible once we have Python nodes in the node editor, or perhaps OSL is a better way to go about this.

I’m not sure how many engines support OSL at the moment; I think it is only V-Ray, Cycles, and Arnold. But I’m sure this will grow in time, and then it will simply be a case of writing one glossy shader to run across all engines that support OSL, with no need for conversions.

Perhaps in time we, the community, can develop a script that auto-generates OSL shaders from Cycles SVM node setups. That would be an excellent way to develop high-quality shaders without any coding for a multitude of engines. Blender/Cycles could even become a Slim-like shading editor/creator for OSL shaders.
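
Purely to illustrate the idea, a toy sketch of what the output stage of such a generator might look like (the closure names and node description are assumptions; real Cycles SVM graphs are far richer):

OSL_CLOSURES = {
    # Toy sketch: map a node type to an OSL closure expression.
    'DIFFUSE': 'diffuse(N)',
    'GLOSSY':  'microfacet_ggx(N, {roughness})',  # closure names vary per renderer
}

def node_to_osl(shader_name, node_type, **params):
    # Emit OSL source text for a single-closure shader.
    closure = OSL_CLOSURES[node_type].format(**params)
    return ('shader ' + shader_name + '(output closure color BSDF = 0)\n'
            '{\n'
            '    BSDF = ' + closure + ';\n'
            '}\n')

print(node_to_osl('auto_glossy', 'GLOSSY', roughness=0.1))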

I see what you mean. OSL is a great emerging standard. [Cycles nodes] => Array[OSL shader per node + relationships] => Any other node material setup.

I guess reusability across render setups is one of the reasons behind OSL.
But it would be so nice if it could be structured through nodes, as they are much simpler for many users to manipulate.

Then Blender, or any other app I guess, could take the interchange material, substitute the required nodes with its own equivalents, and set the color inputs etc.

There’s too little overlap between different renderers’ features to make this work reliably, and I doubt people are keen on developing something they know isn’t realistically going to work well.
Also, the chances of getting a new standard adopted are close to zero.

But it would be so nice if it could be structured through nodes, as they are much simpler for many users to manipulate.

There could be Node Editors that generate OSL.

Sounds interesting. Would solve the issue for all engines that support OSL.

I haven’t looked into which features overlap, but you might be right. I heard the Lux guys are thinking of improving their material system, and that is what started this line of thought.

Hmm… doesn’t seem that hard at all.

With Python duck typing you’d just have to have different classes that output the translated node(s):

lux.glossy()
cycles.glossy()
xyz.glossy()

then walk the node tree you’re translating, passing in the renderer class:


class nodetree:
    def export(self, renderer):
        out_tree = []  # translated nodes for the target renderer
        for node in self.nodetree_walker():
            if node.type == 'GLOSSY':
                out_tree.append(renderer.glossy())
            elif node.type == ...:  # ...and so on, one branch per node type
                pass
        return out_tree

Easy peasy (though I wouldn’t use a big chain of if/elif statements myself).
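
For instance, a table-driven sketch that avoids the chain, assuming each renderer class exposes one factory method per generic node type:

def export_tree(tree, renderer):
    # Map generic node types to the renderer's factory methods
    # instead of chaining if/elif branches.
    factories = {
        'GLOSSY':  renderer.glossy,
        'DIFFUSE': renderer.diffuse,
        # ...one entry per supported node type
    }
    return [factories[node.type]() for node in tree.nodetree_walker()]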

@uncle entity:
Smart. You would need a failsafe, though, if a given node does not exist in the other render engine.
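
A sketch of one possible failsafe, assuming the same per-type factory methods as above (falling back to diffuse is just one choice; you could also raise an error or skip the node):

def translate_node(renderer, node):
    # Look up the renderer's equivalent of this node type; fall back to a
    # plain diffuse when the target engine has nothing comparable.
    factory = getattr(renderer, node.type.lower(), None)
    return factory() if factory is not None else renderer.diffuse()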

Why would Blender need an interchange material setup if the Pynodes are likely to come in for 2.67?

Some of the reasons why they didn’t make it in for 2.66 are the needed changes to the node architecture that had to go in first, and the fact that Lukas Tonne fell ill shortly before Bcon3 started. It’s not a trivial thing to add, and I would think they want a much better implementation than the old Pynode implementation in 2.4x.

Check the 30-second point in this video:

In the in-development Octane-integrated Blender plugin, an Octane material appears in the node editor.

Yes, it would be easy to translate a bunch of nodes if there were an interface honored by every renderer (which is the hard part).
As it is, external renderers could already parse the graph themselves and do the appropriate translation if they wanted to be compatible. In practice, they will likely not re-implement all of Blender’s features, yet will be adding their own features on top, so you don’t have a compatible system.
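
For instance, an external exporter can already walk a material’s node graph through bpy, along these lines (a minimal sketch; socket handling varies by node type and Blender version):

import bpy

def dump_material_nodes(material_name):
    # Walk a material's node tree and print each node's type and input values.
    mat = bpy.data.materials[material_name]
    for node in mat.node_tree.nodes:
        # Not every socket has a default_value (e.g. shader sockets), hence getattr.
        inputs = {s.name: getattr(s, 'default_value', None) for s in node.inputs}
        print(node.type, inputs)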

@Ace Dragon: It is not so much a question of defining node setups in Blender, although I very much look forward to the pyNodes as well. It would be used to transfer materials between node-based engines, so I could, for example, create a material for Cycles and then transfer it to Yafaray or Luxrender.

I know it would be hard, if even possible, to get complete coverage of all nodes, and some render engines probably have too different a system.
Yet I would like to compare which nodes the different engines actually have in common. I think a Google spreadsheet could be a nice starting point.

EDIT:
Another personal reason is just to get a picture of which basic setups are available across which engines. I am going to an office soon where they use Max a lot, and I will probably have to recreate all my materials.

I think you aren’t understanding the meaning of ‘translate’.

If renderer X doesn’t have a certain type of node, then you build the functionality out of the nodes it does have. Chances are the whole system would be based on this, since a direct translation probably wouldn’t be possible anyway due to default values and differing implementations of the same functionality.
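
For example, a sketch with entirely made-up method names: an engine lacking a native glass node could approximate one from the nodes it does have:

class XyzRenderer:
    # Hypothetical engine with no native glass node: approximate glass by
    # mixing a glossy reflection and a refractive component by Fresnel.
    def glass(self, ior=1.45, roughness=0.0):
        return self.mix(self.glossy(roughness),
                        self.refraction(ior, roughness),
                        factor=self.fresnel(ior))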

I’m not talking about the merits of such a system (no opinion really) just that it isn’t that difficult of a problem to solve.

Well, obviously, but you still won’t get around reimplementing Blender’s functionality that isn’t expressible in terms of nodes, in the same way that you cannot express any given external renderer feature in terms of Blender nodes. The evaluation of shaders (and which data is available to them) can be fundamentally different across renderers as well.

I’m not talking about the merits of such a system (no opinion really) just that it isn’t that difficult of a problem to solve.

For it to even be possible, the renderers would need to be architecturally similar (e.g. path tracers), and then you’d need people to agree to adhere to the standard (which is unlikely). You’re just highlighting one particular aspect of the problem that isn’t difficult.

Well, of course, all you’d support is the staples: colour/spec/IOR/transparency/roughness/texture/bump map/normal map/emission.
Luxrender already did this with “convert Blender materials to Luxrender”, but they didn’t try to convert anything from Blender into some of Luxrender’s advanced technical textures.

But more importantly, and semi-related: it currently doesn’t seem possible to use different node setups for each render engine, i.e. Blender material nodes and a separate Cycles material node setup for the same material?

There seems to be a misconception here: people seem to want to see Blender’s options in renderer X, which does not make any sense at all.
When we use a certain renderer with Blender, it is because we want functions available in that particular renderer.
It is impossible to move away from having to know how that renderer works, or we’ll tread the “translation” road, which offers a VERY small set of features as opposed to proper node integration.
ShadermanNEXT, which is written entirely in Python, has an interesting take on that: it defines Modes, which are folders containing definitions of nodes pertaining to one field (RendermanSL, particles, image processing, even Unix commands!) in XML format. When launching ShadermanNEXT, it reads those folders and automatically populates lists with what it finds in them.
When you create a noodle in RendermanSL mode, for instance, the software generates RSL shader code that can be copied and used anywhere else.
It can target a lot of different things and could serve as inspiration for node integration in Blender.
Luxrender, Yafaray and other rendering engines would only have to maintain a master folder for each renderer, with folders containing node definitions. When you choose a renderer in Blender, it would read the corresponding folder and populate the list of available nodes for that renderer. A mechanism for refreshing the current list would have to be considered too: if a developer adds a new function to his renderer, all that would be needed would be to add a node definition to a folder and restart Blender or refresh the list.
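
A rough sketch of that folder-scanning mechanism in Python (the file layout and XML schema here are assumptions, not ShadermanNEXT’s actual format):

import os
import xml.etree.ElementTree as ET

def load_node_definitions(renderer_folder):
    # Scan one renderer's folder of XML node definitions and return
    # {node_name: [input names]} to populate the list of available nodes.
    nodes = {}
    for fname in os.listdir(renderer_folder):
        if fname.endswith('.xml'):
            root = ET.parse(os.path.join(renderer_folder, fname)).getroot()
            # Assumed schema: <node name="glossy"><input name="roughness"/></node>
            nodes[root.get('name')] = [i.get('name') for i in root.findall('input')]
    return nodes

# e.g. when the user picks a renderer:
# available_nodes = load_node_definitions('nodes/luxrender')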

But more importantly, and semi-related: it currently doesn’t seem possible to use different node setups for each render engine, i.e. Blender material nodes and a separate Cycles material node setup for the same material?

The modal way of Blender luckily prevents that. When you choose the renderer, it shifts ALL of Blender accordingly so that every part will work with that renderer. The side effect is that it is new-user friendly, but it can be a PITA when an advanced user knows exactly what s/he wants (your case here).
One way of solving this could be renderer-specific output nodes that pipe data to a specific renderer, opening the way for easier compositing, but probably a nightmare for developers.

@DeMack Very interesting thoughts.
Actually very close to my own, yet you seem to have a better idea of the technical requirements.
My initial thoughts behind this were about the different functions of the renderers. I usually do my light studies in Luxrender, as it does the most realistic lighting, yet my computer is not nearly strong enough to do noise-free Lux renders.
Instead I do an artistic composition in Cycles, yet I have to redo all the materials each time, even if they are only simple diffuse, gloss, bump, etc.

My thoughts on an application would work much as you describe ShadermanNEXT working.