Cycles materials and the arbitrary roller-coaster scales

(Péter Szilágyi) #1

Hey all,

I’m extremely new to Blender, and I’m coming from a very different background (software developer, peer-to-peer networking) than I assume most people here have. Nonetheless, I started working on a material library - or rather a material service - to allow anyone to upload/download Cycles materials. My rationale was that there currently seems to be a mess of material repositories all over, each trying to somehow encompass the entire 3D modelling ecosystem and becoming overcomplicated and less than useful in the process.

I figured that I have the knowledge to put the service part together, that I can make it community driven, and that I can automate pretty much the entire workflow. For example, anyone could just dump a .blend file with a material in it and get automatic preview renders plus a minimalist derived .blend file with all the junk thrown out.
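
For the “junk thrown out” part, here is a minimal sketch of what I have in mind (the material name and output path are placeholders, and I’m assuming bpy.data.libraries.write is the right tool for serializing just a few datablocks into a fresh file):

import bpy

# Write only the chosen material (plus whatever it references, e.g. textures)
# into a brand new .blend file, with a fake user so it survives orphan purging.
mat = bpy.data.materials["Cherry"]  # placeholder material name
bpy.data.libraries.write("/tmp/cherry_material.blend", {mat}, fake_user=True)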

Actually, my entire service is mostly ready; I’ve just hit a very annoying limitation that I can’t for the life of me figure out. The goal of my service is to be simple: drop the .blend file, select the material, boom, done. The issue is that I have absolutely no idea how to figure out the correct physical scale that a material uses.

Most of the materials produced by various people on the internet seem to have a completely arbitrary scale (for example, the above Cherry uses a preview sphere with a diameter of 1.27 units and a scaling of 0.106, whaa?). Is this really the standard procedure in the Blender/Cycles community, to download a material and waste 5 minutes trying to correctly scale it to the local scene?

My ideal system would be able to say that “Hey, this material should be rendered on a 2cm preview sphere” or something like that. That would allow the creation of collections that are scaled correctly to one another (e.g. an architectural collection where a wooden flooring is proportional to a table cloth or a balcony railing).
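
Just to make that concrete, one way I could imagine encoding such a hint is a custom property on the material datablock - to be clear, this is purely a convention I’d be inventing, not an existing Blender/Cycles feature, and the material name is a placeholder:

import bpy

mat = bpy.data.materials["Cherry"]  # placeholder material name
mat["preview_diameter_m"] = 0.02    # "render this on a 2 cm preview sphere"

# A service or importer could later read the hint back, falling back to some
# default whenever the author never tagged the material:
diameter = mat.get("preview_diameter_m", 1.0)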

How are people here handling materials? How are you fixing their scale or using them in different scenes? The material preview render seems to be a 2 BU diameter sphere. If the material looks good on that sphere, it will almost surely look bad on any live object. How are you handling this?

I’d really like my service to be able to automatically generate nice preview images like the one above (it was generated through my service), nicely tagged with the scaling (4cm in the image above). It would be awesome to have a high quality material library for Cycles that could compete with, say, the Corona Material Library, but Blender/Cycles seems to be missing the essential feature of assigning a unit to a material, which would allow making a uniform set of materials that work well together.

Any info on how I could make this work would be more than welcome.

Cheers,
Peter

1 Like
(CarlG) #2

To me it sounds incredibly hard to generalize like this. A lot of the assets I’ve downloaded to try out for fun have had scaling issues: they don’t have their scales applied, or they’re scaled for export, or things were never modeled to scale at all. Asset quality has been questionable, to put it mildly, and few seem to care about it.

When I make my own materials, I try to either expose a GUI to the end user (myself or coworkers), or highlight the nodes you should be tweaking. In Blender, I’ve given up on exposing a (node group) GUI, as it’s just too limited. Nodes exposed through a GUI can be range limited (i.e. limited to 0-1), whereas a top level node cannot. Also, an exposed slider can temporarily be used as a bump strength control and then be replaced with a value node to get much better control of it, whereas a top level node has to be the value node with its way too high sensitivity. Nodes exposed in a GUI cannot be given a tooltip, and stuff like toggles, radio buttons, paths, dropdown menus etc. isn’t even supported.
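
For reference, this is roughly what I mean by exposing a GUI through a node group - a minimal sketch against the 2.7x Python API used elsewhere in this thread, with made-up names:

import bpy

# Create an empty shader node group and expose a single float input on it.
group = bpy.data.node_groups.new("MyMaterialControls", 'ShaderNodeTree')
strength = group.inputs.new('NodeSocketFloat', 'Bump Strength')

# The exposed slider can be range limited...
strength.min_value = 0.0
strength.max_value = 1.0
strength.default_value = 0.5
# ...but that is about all the presentation control you get at the group level.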

I may even use a mix of coordinate systems depending on the needs (a minimal sketch of the first case follows the list):

  1. Object coords if object is depicting a real world scale, such as ceiling tile grid.
  2. Generated unreset coords whenever I need fades.
  3. Generated reset coords whenever I need object distortion following.
  4. UV coords, sometimes multiple UV will be required to control different aspects.
  5. Separate texture space generators because Blender is very lacking in this respect.
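
Here is that sketch of case 1, again against the 2.7x API; the material name, tile size, and the checker texture standing in for a real tile texture are all just placeholders:

import bpy

mat = bpy.data.materials.new("CeilingTiles")   # hypothetical material
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

coords = nodes.new('ShaderNodeTexCoord')       # 'Object' output = local space coords
mapping = nodes.new('ShaderNodeMapping')
checker = nodes.new('ShaderNodeTexChecker')    # stand-in for the real tile texture
checker.inputs['Scale'].default_value = 1.0

# One tile every 0.6 m of object space, independent of any UV layout - this only
# makes sense if the object itself is modeled to real world scale.
tile_size = 0.6
mapping.scale = (1.0 / tile_size, 1.0 / tile_size, 1.0 / tile_size)

links.new(coords.outputs['Object'], mapping.inputs['Vector'])
links.new(mapping.outputs['Vector'], checker.inputs['Vector'])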

I will also have node boxes with info in them if special precautions need to be taken, like “add a random vector to the coordinate to avoid negative values due to using modulo” if I’m not using a custom continuous modulo function (see the small sketch below). Sometimes normals have to be considered too; a fake glass plate material using the Z normal for the “transparent face” might not work if you tried using it for windows unless you rotated the window so that its orientation matches what the material expects.
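
In case it’s not obvious what I mean by that: the stock Modulo math node (at least in the Blender versions I’ve used) behaves like C’s fmod, so negative coordinates come out negative and the tiling breaks wherever coordinates cross zero; a “continuous” (wrapped) modulo avoids that. In plain Python the difference looks like this:

import math

def truncated_mod(x, y):
    # fmod-style modulo: keeps the sign of x, so negative coords map to negative results.
    return math.fmod(x, y)

def wrapped_mod(x, y):
    # Continuous modulo: always lands in [0, y) for y > 0, so the pattern wraps seamlessly.
    return x - y * math.floor(x / y)

print(truncated_mod(-0.25, 1.0))  # -0.25 -> visible seam where coordinates cross zero
print(wrapped_mod(-0.25, 1.0))    #  0.75 -> same tile as on the positive side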

As for previewing, I prefer my own materials to be associated with an actual asset, i.e. a backlit hand or ear for a close-up skin shader, or a relevant floor for wooden floor boards or floor tiles - which may not be usable for wall boards or wall tiles because of the coordinate inputs they expect.

It’s just all kinds of messy.

(Péter Szilágyi) #3

Thank you very much for the detailed response. A bit unfortunate though; this was the first speed bump I hit when starting to play with Blender (alas, I wanted to render an existing model, not make a new one). It would be so awesome if we could dream up a solution that would make materials cleaner to use (or at least have some cleaned-up collection).

Anyway, I’ll try to think about it a bit more. I don’t necessarily want to solve this for every possible complex material type (e.g. your special material with specific culling behavior based on the Z axis is way out of scope), but I’d still like to create some baseline set.

I can imagine a multi-step approach that could get the correct scaling at the expense of upload complexity, but it somewhat defeats the purpose of the “dump the .blend, give a name, done” simplicity.

(Péter Szilágyi) #4

Ha! I’ve managed to solve half of my problem.

Just to recap (I managed to formulate it for myself too), my material service idea has 2 ideal requirements:

  • Users should be able to just dump in material files without “any” interaction at all, and the service should be able to generate fully uniform, good looking preview renders for them.
  • The materials that users download from this service should all be scaled proportionally to one another, so users can, so to speak, assemble collections for themselves that make sense together.

The first requirement was a very annoying one because the only way to make a quality library is if all uploads/previews are consistent with each other. As such, any user generated images are out of the question, as they are not reliable (bad model, bad lighting, etc.). Since user uploads, however, have arbitrary roller-coaster scales, how can we make properly scaled automatic previews?

The idea was that generally, when someone shares a material, it’s part of a scene. So I can iterate over all the objects in the uploaded scene and get the dimensions and scale of the largest mesh using that material. This gives us the ideal scaling the material author themselves used.

import bpy

def unscaled_volume(obj):
    # Bounding box volume of the mesh data itself: dividing each dimension by the
    # object scale undoes any object-level scaling that hasn't been applied.
    return (obj.dimensions.x / obj.scale.x *
            obj.dimensions.y / obj.scale.y *
            obj.dimensions.z / obj.scale.z)

# Material already cleaned up (`mat` comes from earlier in the pipeline); find the
# largest object using it, assuming that's the one the author tuned the scale for.
largest = None
for obj in bpy.data.objects:
    for slot in obj.material_slots:
        if slot.material == mat:
            if largest is None or unscaled_volume(largest) < unscaled_volume(obj):
                largest = obj

Later, when we need to generate the preview images on our sanitized model, we can re-dimension and re-scale our entire scene so that everything remains proportional, but the final dimensions and the final scale of our demo ball will match the user’s upload.

import bpy
import os

dimension = float(os.environ.get("RECYCLES_DIMENSION", "12")) # Default demo-ball dimension on the preview scene
scale = float(os.environ.get("RECYCLES_SCALE", "1"))          # Default demo-ball scale on the preview scene

# We need to resize the demo model to have the same scaling and dimensions as
# the original model the user uploaded to make a perfect render. To do that we
# resize the dimensions first and then scale them to the weird values the user
# uploaded with (yay bump maps that use absolute scales, not relative ones).

# Scale all the objects to match their dimension to the user's upload
for obj in bpy.context.scene.objects:
    obj.select = True

bpy.ops.transform.resize(
    value=(dimension/scale/12, dimension/scale/12, dimension/scale/12),
    constraint_orientation='GLOBAL',
)
# Apply the scales to the preview models to lock the dimension in
for obj in bpy.context.scene.objects:
    obj.select = False

bpy.data.objects["ClothModel"].select = True
bpy.data.objects["FluidModel"].select = True
bpy.data.objects["SolidModel"].select = True

bpy.ops.object.transform_apply(scale=True)

# Scale all the objects to match their scale to the user's upload
for obj in bpy.context.scene.objects:
    obj.select = True
bpy.ops.transform.resize(
    value=(scale, scale, scale),
    constraint_orientation='GLOBAL',
)
# Adjust the camera focus for the depth of field
for camera in bpy.data.cameras:
    camera.cycles.aperture_size *= dimension/12
    camera.dof_distance *= dimension/12
    camera.clip_start *= dimension/12
    camera.clip_end *= dimension/12

And voilà, the input scene can contain an arbitrarily roller-coaster-scaled material, but my auto-generated zero-input preview images will still look stunning and uniform (first column is the original scene, second column is the auto-generated preview of the material):

I’m quite happy about this solution because even if I were to stop here, the experience of uploaders is trivial and the experience of downloaders stays the same as it is currently with other material sources. That said, I’m now trying to figure out how to automatically scale the downloaded assets to one another to truly have a proper user experience.

Will post some updates if I solve the remaining issues.

1 Like
(Péter Szilágyi) #5

Sneak peek teaser :wink:

2 Likes
(XYZero) #6

Oh wow… this looks like a great idea!
Nice work - looking forward to seeing this up and running.
Thanks so much for all your hard work.

#7

Any updates on this?

1 Like