Dealing with non-standard scene unit scale/object creation sizes.

Objects in Blender are created at a universal default size, relative to the active 3D view’s grid size.

Most of us Blender users work under the default/standard grid size of 1.0. However, working with UE4, I had adopted a scene unit scale of 0.01 on Metric. After that change, my default 3D view’s grid had visually scaled up 100 times, while the grid scale remained at 1.0.

This caused every object I created to sit at a whopping 2 meters in diameter. Not ideal for anything other than architecture/level design. So I set the grid size to 0.1, and then my newly created objects all had a default radius of 10 cm, which is pretty good for character/character prop creation.

The problem is that all the amazing plugins with object creation functions pay no regard to Blender’s scene scale, nor to the grid scale. Therefore the objects they create are often too small to work with.

I am too lazy to just scale them up one by one.

Here are a few pieces of code to help with it. The idea is simple.

UPDATE:

Grid_Unit_Size = area.spaces[0].grid_scale / bpy.context.scene.unit_settings.scale_length

This is the actual correct grid unit size for object creation. Previously I had left the scene unit scale length out, which is also a key factor in object creation size.
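As a quick sanity check, that arithmetic can be expressed as a plain function, testable outside Blender (a sketch; the function name is mine, not part of Blender’s API):

```python
def grid_unit_size(grid_scale, scale_length):
    """Effective grid unit size used for object creation.

    grid_scale   -- the 3D view's grid_scale setting
    scale_length -- the scene's unit scale (scene.unit_settings.scale_length)
    """
    return grid_scale / scale_length

# With the UE4-friendly setup described above (grid scale 1.0, unit scale 0.01):
print(grid_unit_size(1.0, 0.01))  # -> 100.0
```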

How to get the grid scale, given the correct context:


bpy.context.area.spaces[0].grid_scale
# This value is actually stored here:
# bpy.data.screens[<your active screen layout>].areas[<index of area.type == "VIEW_3D">].spaces[0].grid_scale

If the correct context isn’t guaranteed:


# Finding the grid scale of your first available 3D view.
# (Yes, you read that right: every 3D view has a separate grid_scale setting.)

import bpy

def getGridScale():
    if bpy.context.area is not None and bpy.context.area.type == "VIEW_3D":
        area = bpy.context.area
    else:
        areas = [a for a in bpy.context.screen.areas if a.type == "VIEW_3D"]
        if len(areas) > 0:
            # The current screen may contain multiple 3D views; use the first one available.
            area = areas[0]
        else:
            # The current screen contains no 3D view, so the grid size
            # cannot be determined. Return 1 for safety.
            return 1
    grid_unit_size = area.spaces[0].grid_scale / bpy.context.scene.unit_settings.scale_length
    return grid_unit_size

# It's also possible to get a grid scale from a 3D view outside of the current
# screen layout, but there is no point in doing so, since there would be no
# visual indication of what happens.

Note: If you type “bpy.context.area.type” in Blender’s console, you’ll likely never get ‘VIEW_3D’ as the answer. When you type in the console, the context is the console, so context.area.type will always be ‘CONSOLE’.

Note 2: Sometimes it is not the object’s dimensions that are too small, but one specific value that affects the visual size/width/thickness of the object. In that case, getGridScale() needs to be injected into the code where those specific values are set.

Inject location:


# The <obj> below points to the object that needs its size adjusted.

obj.dimensions = obj.dimensions * getGridScale()
# Or divide obj.dimensions by bpy.context.area.spaces[0].grid_scale.
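If you prefer to keep the correction logic testable outside Blender, the dimension adjustment can be factored into a plain helper (a sketch; adjust_dimensions is my own name, not part of any add-on’s API):

```python
def adjust_dimensions(dimensions, grid_unit_size):
    """Scale an (x, y, z) dimensions triple by the effective grid unit size."""
    return tuple(d * grid_unit_size for d in dimensions)

# Inside Blender you would then write something like:
#   obj.dimensions = adjust_dimensions(obj.dimensions, getGridScale())
print(adjust_dimensions((1.0, 2.0, 0.5), 10))  # -> (10.0, 20.0, 5.0)
```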

Blender’s API for properties must be the right place to handle this, so it works everywhere by default.

Hard-coding such translations may also have side effects, e.g. when the user changes units after work has already been done on the scene.
Also, some add-ons may use a handful of size variables, which makes it a lot of work to adapt them.

In the meantime, why not scale at export time?
An exporter option would require far less code and provide a stable solution, whatever your add-ons do.
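The export-time compensation suggested here follows the same arithmetic. Below is a sketch deriving the FBX exporter’s global_scale from the scene unit scale (the helper function name is mine; global_scale is a real option of Blender’s FBX exporter, but check your version’s defaults before relying on it):

```python
def fbx_scale_settings(scale_length):
    """Compensate a non-standard scene unit scale at export time.

    With scene unit scale 0.01, exporting with global_scale = 1 / 0.01 = 100
    brings exported sizes back to what the target application expects.
    """
    return {"global_scale": 1.0 / scale_length}

# Inside Blender (hypothetical usage):
#   bpy.ops.export_scene.fbx(
#       filepath="out.fbx",
#       **fbx_scale_settings(bpy.context.scene.unit_settings.scale_length))
print(fbx_scale_settings(0.01))  # -> {'global_scale': 100.0}
```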

I agree that Blender’s API is the right, and the best, place to handle this. But it has yet to handle it exceptionally well.

But this is not a translation hard-coded into anything. This simply gets a variable from whatever work environment you are in, and lets you do whatever you wish with the extracted value. I don’t understand why this would have any side effect, or why you refer to it as ‘such a translation’. I’m confused.

I’m dealing with the work that happens within Blender. Exporting is a totally different topic.

Have you ever created a decal using the DecalMachine? Amazing stuff.

But the decal was so small relative to the object I was applying it to… that it wasn’t even visible. This happened when I had a 0.01 scene unit scale and a 1.0 grid scale. That was the moment I decided to fix my Blender startup project and do some extra scripting.

This doesn’t work for characters, at least. When rigs are scaled in the FBX exporter, the scale is applied to all bones instead of just the root bone. I guess this is a Blender bug. Then when the rig is imported into UE4, UE4 notices that the bones’ scale isn’t set to 1, so the scaling is reset to 1 for all the bones in animations, even the root bone, when that should still be at 100 (for example), meaning you get a tiny animation. This is a UE4 bug.

Because of this you usually go with metric 0.01 scaling, because that way you can export at scale 1 and things just work. The problem is that Blender wasn’t designed to be used at this scale, so many things just break: constant detail when sculpting with dyntopo, curves with taper/bevel for hair, many add-ons that deal with creating meshes, and so on. Any bug reports on this result in being told it’s working as intended. I guess scene unit scaling isn’t really a supported feature.

Until the FBX exporter bugs/UE4 bugs are fixed, we’ll just have to deal with this and live with things not working the way they ideally should.

I kinda agree. Except…

I once reported a bug related to lattices under scene scaling, and they actually fixed it!

And I’m doing what I can to make the scaling less of a problem.