Why does it seem like no Blender users back in the day ever modeled at real-world scale?

My silly hobby is taking ancient free 3D files from all over the internet and seeing what I can do with them in the latest version of Blender. I can’t think of a single .blend file I’ve ever downloaded that had its objects at real-world scale. Lots of other files made in who-knows-what were actually built to scale: just scale them up or down by 10, 100, 0.1, or 0.01 and they’re accurate to real-world scale. Not so with old .blend files.

2 Likes

Between the non-existent measuring tools and the total lack of features that needed scale-accurate modeling to work at their best (even in Blender Internal, your highlights burned out regardless because of basic 1990s sRGB color math), Blender was an app where you just eyeballed what looked right to you when constructing the scene.

The secondary reason is that Blender back then was considered a hobby app, so the vast majority of its artists were less experienced people who could not afford anything better (to the point where even Carrara was out of reach).

1 Like

In those days the size of the cube was called "radius" in Blender, and the UV sphere's radius was called "size" :person_shrugging:

1 Like

I can’t speak for old Blender files, but for the rest: because there was a time when, in many cases, real scale didn’t matter that much. Personally I didn’t care about scale until lighting features started taking scale into account. GI/Monte Carlo rendering was a big part of that, and now of course Blender’s lights and shadows care about scale whether you’re bouncing light around or not… so I almost always adhere to it now, unless I’ve got a reason not to.

And FWIW, Blender still doesn’t make it easy to work with scale… cough, Edit Mode.

4 Likes

I am a user from back in the day.

The default cube's size was "unknown", a lot of users were just experimenting with Blender, and size just wasn't an issue. You went with what felt good.
The Blender world was limited in size, too: you ran out of space and had to resize and scale everything.

Since then, the Measure tool and proper practices have become the norm for me.

Today I scale everything to actual size. It makes it easier to append and import objects into new real-life-size projects. I also make sure the origins are in practical places for all objects, too.

It is also good to scale your projects to life size if you plan on selling your models online.

2 Likes

I agree with @Spin

Before Blender 2.5 there were really no units of measure. 1 BU = 1 BU was both the scale and the measurement.

1 BU is one Blender unit. It could be a foot, an inch, a meter, or a centimeter. At the end of the day, just scaling the object shouldn't be a big deal. Should it?
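
For what it's worth, in current Blender you can even declare what 1 BU means instead of rescaling anything. A minimal sketch, assuming Blender's Python API (bpy) and an old "1 BU = 1 foot" file as the example:

```python
import bpy

# Declare the real-world meaning of a Blender unit rather than rescaling
# the geometry: with scale_length = 0.3048, one unit reads as one foot.
scene = bpy.context.scene
scene.unit_settings.system = 'METRIC'       # display measurements in metric
scene.unit_settings.scale_length = 0.3048   # assumed: this file used 1 BU = 1 ft
```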

Randy

2 Likes

It’s a problem when you have to scale every object: some materials then look different, area lights scale their size with their parents, but point lights do not, and the light strength does not scale at all.

And I’ve just now discovered that the apparent normal map strength also changes after Apply > Scale.

[two screenshots comparing the material before and after Apply > Scale]

The normal map strength is still 0.2 in both. It's the wood material I used on 30 other objects, all with different scales :frowning:

… correction: Apply Scale made the normals flip on that object, lol
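
If you do end up bulk-rescaling a scene anyway, here is a minimal sketch assuming Blender's Python API (bpy); the point/spot light compensation is my own rough inverse-square guess (surfaces end up `factor` times farther from the lamp), not an official rule:

```python
import bpy

def rescale_scene(factor=10.0):
    """Uniformly rescale all top-level objects and compensate point/spot lights."""
    for obj in bpy.context.scene.objects:
        if obj.parent is not None:
            continue                            # children inherit the parent's scale
        obj.scale = [s * factor for s in obj.scale]
        if obj.type == 'LIGHT' and obj.data.type in {'POINT', 'SPOT'}:
            obj.data.energy *= factor ** 2      # keep surfaces roughly as bright

rescale_scene(10.0)
```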

1 Like

Blender started as a tool mostly for amateurs, who were the ones sharing files openly for free and on what marketplaces existed back then. Most of the pros using Blender were working for small studios, freelancing, or at the Blender Institute.

Pros understood that there was in fact a scale relationship to the real world in Blender. There always has been, widely known or not. A 3D program cannot exist in a scale vacuum; there has to be a mathematical reference to the real world someplace, especially for physics and some lighting features. The only aspersions ever cast on this were from people who did not understand or care.

There have been scores of debates about the importance of a unit in Blender and the fact that it is arbitrary.

Again, for people working on professional projects (especially when sharing between programs), and simply as a matter of common-sense work ethic, scale and units as they related to the real world were always important.

And as others have also mentioned, Blender was just a tool to play around with and learn, and in the beginning it is common for new users not to focus on real-world scale. There is nothing wrong with that; it is natural.

Keeping to real-world units requires understanding how and why, and the discipline to stick to it.

Most of the time, even now, I will start a project and begin doodling before I even consider world scale. For some doodles it won't matter.

But for serious work in a project pipeline, scale always has to be considered at some point and adhered to. It is the basic, fundamental thing to keep intact.

2 Likes

The 3D printing community was very small before RepRap printers.
In the same period, photorealistic render engines were slow, so they were a niche too.
Users simulated light bounces in the Blender Internal engine by adding a point lamp near a surface instead of using radiosity.
Apart from people using YafRay, those using Sunflow, POV-Ray, etc. were sharing scene files for the renderer, not the .blend.
Blender was promoted as software for creating short movies like Elephants Dream or games like Yo Frankie!.
Most of the users sharing files were hobbyists trying to create games without realistic physics.
Shared .blend files were free stuff.
There was no marketplace for selling professional .blend files.
Professionals using Blender were selling .obj or .fbx files on TurboSquid to reach a sufficient client base outside of the much smaller Blender community.
In the early 2000s, internet connections were not great for downloading 3D files.
Tutorials were not videos but written HTML pages with screen captures.
There were tutorials about architecture and CAD work that mentioned distances and scale, but they came without 3D files.

2 Likes

It may be a bit of nitpicking :wink: but:

Blender was initially developed as an in-house application by the Dutch animation studio NeoGeo, and was officially launched on January 2, 1994.

…from Wikipedia: Blender (software) ← which additionally has its own entry before the usual (disambiguation) link…

 

Also:

  • when some other app is more the industry standard… then why does it also have problems like this… (from 2012)
  • a lot of .3ds or .obj files from back then do have the problem that models of people may be bigger than cars or houses :person_shrugging: …because they were exported for use in some graphics engine and then scaled accordingly
  • before the PBR era, proper lighting was a special skill where units might be a secondary concern

…so there are a lot of reasons why proper scaling, or rather improper scaling or the non-use of scaling, is a general problem.
(That's also the reason why studios and subcontractors use workflow rules or guidelines.)

5 Likes

I don’t know how big an influence it has, but to this day I see a lot of tutorials (including from large channels) that eyeball everything. If people learn their workflow from that and nobody points it out to them, they will keep doing it, which results in random scale.

This question is IMO not only tied to old Blender projects and fighting subpar import/export formats.
With most ZBrush projects and output there is the same issue.
In many cases there is no real-world scale, and if you try to apply it, the sculpting behavior inside ZBrush can easily get wonky.
Dynamesh resolution is bound to ZBrush's own world space.

Mesh size can also affect perspective in ZBrush. ZBrush's perspective is wonky regardless if you try to adhere to real sizes.
To mitigate that, you need to follow old tricks like "locking in" the perspective by making a mesh of the desired size the first subtool.

1 Like

You can set your scale using Scale Master and sync Scale Master to ZBrush's native scale (Unify), so everything works as it should and exports to other programs at real-world scale.

2 Likes

I will test it out once I fire that perpetual license up, thank you 👍.
I must have forgotten that there is always a plugin to solve ZBrush issues.

Here are other cases where real-world scale can be problematic; for example, with single-precision floats.

Jolt works best when using real-world scale for dynamic objects. Its recommended size range for dynamic objects is roughly 0.1–10 meters.

Erwin Coumans, maker of the Bullet physics engine, wrote:
If you set up Bullet physics with a gravity of 9.8 m/s², your rigid-body objects shouldn't be smaller than 20 centimeters.

https://pybullet.org/Bullet/phpBB3/viewtopic.php?t=367
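
A rough, hypothetical way to sanity-check that advice yourself with pybullet (the 2 cm vs. 20 cm sizes and the settle time are just illustrative choices, not from the quote above):

```python
import pybullet as p

# Drop a tiny box and a larger box onto a plane and compare how they settle.
p.connect(p.DIRECT)
p.setGravity(0, 0, -9.8)
p.createMultiBody(0, p.createCollisionShape(p.GEOM_PLANE))   # static ground

for half in (0.01, 0.10):                     # 2 cm cube vs. 20 cm cube
    box = p.createCollisionShape(p.GEOM_BOX, halfExtents=[half] * 3)
    body = p.createMultiBody(baseMass=1.0, baseCollisionShapeIndex=box,
                             basePosition=[0, 0, 1.0])
    for _ in range(480):                      # ~2 s at the default 240 Hz step
        p.stepSimulation()
    pos, _ = p.getBasePositionAndOrientation(body)
    print(f"half-extent {half:.2f} m -> resting height {pos[2]:.4f} m")
```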

For the Havok physics engine and Halo 2, the recommendation was the following:

That was the thing I used to say to all the artists: don’t make anything smaller than a PC monitor, in any dimension, like nothing can be smaller than a foot-and-a-half or two feet!

EDIT: here is some math about accuracy and real-world scale.

So if your units are metres, you’ll lose millimetre precision around the 16 384 - 32 768 range (about 16-33 km from the origin)
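
That claim is easy to verify from single-precision spacing; a quick check in Python (assuming NumPy is available):

```python
import numpy as np

# np.spacing(x) is the gap between x and the next representable float above it,
# i.e. the smallest position change a float32 coordinate can express at x.
for metres in (1.0, 1024.0, 8192.0, 16384.0, 32768.0, 100000.0):
    gap = np.spacing(np.float32(metres))
    print(f"{metres:>9.0f} m from origin -> smallest step ~ {gap * 1000:.3f} mm")
# Around 16,384 m the step grows past 1 mm, matching the quote above.
```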

EDIT 2:
To counter the precision loss at large world scales, engines allow shifting the world origin. Documentation from Unreal:

2 Likes

People seem to eyeball even things like "precise" UV mapping of low-poly objects to something like a 128x128 image, and then wonder why they get a "smearing" effect… :unamused: (and complain about it…)
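
A back-of-the-envelope texel-density check makes the smearing obvious; the 2 m face is just an assumed example size:

```python
# How much surface each pixel of a 128x128 texture has to cover
# when a face is mapped edge-to-edge across the image.
texture_px = 128      # texture resolution from the example above
surface_m = 2.0       # assumed: a 2 m wide face

cm_per_texel = surface_m * 100 / texture_px
print(f"each texel covers ~{cm_per_texel:.2f} cm")   # ~1.56 cm -> visible smearing up close
```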

1 Like

If I decided to drop a model onto the internet for free, I probably wouldn't care too much about scale, because, well, it's free.
If it were a paid one, of course it would have the correct scale.

The reason it wouldn't have real-world scale right from the start is that, in general, I don't want to waste time typing numbers on the keyboard. I always did that back when I worked in SolidWorks, and there it's fine and the only correct way to deal with CAD, but not with regular poly modeling.
Blender's behavior when dragging a value slider while holding Shift doesn't provide enough smoothness and control. So with real-world scale, some details of the model will require values like 0.002 for, say, a Bevel modifier, because the unit system is in meters since the next step for the model would be export to Unreal Engine.
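
For reference, a minimal sketch of that kind of tiny value, assuming Blender's Python API (bpy) and an arbitrary 2 mm bevel:

```python
import bpy

# With the scene in metres, a 2 mm bevel means typing 0.002
# rather than nudging a slider.
obj = bpy.context.active_object
mod = obj.modifiers.new(name="Bevel", type='BEVEL')
mod.width = 0.002     # 2 mm in a metre-based scene
mod.segments = 2
```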

Quite a few prominent YouTubers teach scaling UVs to 0 for models made for game engines :expressionless:. I've seen that in paid courses too…

You mean scaling the UVs to a specific position to "select" a color with them? For a very, very small map this might even be a simple solution for changing the "color theme" of an entire asset… (…or… not :wink:). But of course… all to zero… :person_shrugging:

1 Like

It's a cool trick when you just want to render stuff in Blender, but there are quite a few issues with that approach when you take it to game engines… stuff like static lighting now gets encoded in a single pixel :smiley: (unless you provide separate lightmap UVs, of course, or use engine tools that generate lightmap UVs if your engine supports them… Unreal does, for example, but even so it can't build good lightmap UVs from UVs scaled to a single point).

1 Like