Overriding the Blender unit restrictions in the game engine?

It looks like the biggest object Blender will actually render in the game engine, and maybe in any mode, is 5000 BU, and the smallest is .001 BU. Is there any way to override this? Even if it requires modifying the source code and custom-building a super-fast graphics machine, I’m still interested in the answer.

If you’re curious why I am asking . . .

I am working on a simulation of the Apollo voyage to the moon for a science camp. I know that the usual way to do such large-scale things in Blender is to hang the really big objects (like the earth and the moon) in the background. But I want to make this more realistic. I want the kids to be able to actually take the Apollo spacecraft from Earth orbit to a translunar trajectory and then into lunar orbit, all to scale, and over the space of the actual three days it took the Apollo astronauts to get to the moon.

The problem is that the minimum and maximum blender units don’t accommodate enough orders of magnitude to keep the earth and the Apollo spacecraft to scale. As I said above, it looks like the biggest object that Blender will actually render is 5000 BU, and the smallest is .001 BU. For short, I refer to .001 BU as 1 BU-k (one thousandth of a BU).

Well, the earth has a diameter of about 12,742,000 m, meaning that if I give the earth a diameter of 5,000,000 BU-k (5000.000 BU), each BU-k equals about 2.5 m. But the Apollo command and service module was about 11 m tall (4.4 BU-k) and just under 4 m in diameter (1.6 BU-k). So I can render the basic shape of the Apollo, but not any details (windows, thrusters, etc.). That’s why I’m looking for a way to override the BU restrictions.
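If it helps to double-check, the arithmetic above works out like this (the 2.5 m per BU-k figure is the rounded value used in the paragraph, the exact ratio is a bit higher):

```python
# Checking the scale arithmetic from the paragraph above.
EARTH_DIAMETER_M = 12_742_000
EARTH_DIAMETER_BUK = 5_000_000  # chosen Earth diameter in BU-k (= 5000 BU)

metres_per_buk = EARTH_DIAMETER_M / EARTH_DIAMETER_BUK  # ~2.55 m, rounded to 2.5 above

csm_height_buk = 11 / 2.5    # Apollo CSM height: 4.4 BU-k
csm_diameter_buk = 4 / 2.5   # Apollo CSM diameter: 1.6 BU-k
print(metres_per_buk, csm_height_buk, csm_diameter_buk)
```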

Sorry if this is one of those you-need-something-way-more-powerful-than-Blender questions, but it can’t hurt to ask . . .

Well, you already know the dimensions you are going to work with.

There are some facts you should take into consideration:

  • when you measure in km, a mm is nearly zero
  • you can’t see that far
  • your computer has a precision limit when dealing with numbers
  • if you have the Earth and the Sun in the same scene at proper scale, you would not see the Earth because it is so tiny, and you would only see the Sun when looking in its direction, because it is so far away.

The point is: what do you want to achieve? Which leads to: what do you want to show?
If I take your example (Apollo) from above, you need to think about the timing too. Somehow I think real time is out of the question, because nobody wants to play for hours or days. And if you play around with time, you can play around with distances as well.

I guess you want to show the interesting parts of this sort of simulation:

  • start
  • landing
  • reaching orbit
  • leaving orbit
    etc.

I suggest you focus on these local events within their local space.
You can skip the time and distances between these events. If you do not want to, you can present them in a different view, e.g. a time lapse, a schematic representation, or simple cinematic cuts.

To answer your question: there is a natural limit on precision. It is not 5000 BU and not 0.001 BU, but within the given limits you will not hit the real ones. Exceeding them can result in really strange effects. Besides that, precision is best near the scene origin.

There’s hardly any game engine that deals with the size differences found in space, which is why these projects can’t take the direct route of handling distances and dimensions as they are. As with most game design aspects, you just have to make things look and feel the way you want instead of simulating the real-world model.

Thanks. I understand that this is probably asking a lot of Blender. (The reason for the sim to last three days, by the way, is that this is a 4-day camp, so there’s plenty of time to do this in real time, make course adjustments, etc.)

Perhaps the better approach would be to dynamically adjust the scale of the objects. So, as the Apollo moves away from the earth, Blender would scale down the sphere representing the earth and scale up the sphere representing the moon. If I calculate it right, I can then create the overall effect I’m going for without needing to model the real sizes and distances to scale.
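For what it’s worth, here is a rough sketch of that idea in plain Python (the numbers and function name are my own assumptions, not anything Blender provides): keep each sphere at a fixed “fake” distance from the camera and scale it so it subtends the same angle the real body would at its real distance.

```python
# Hypothetical sketch of the dynamic-scaling idea described above.
# The body is drawn at a fixed fake distance; its drawn radius is chosen
# so its angular size matches what the real body would show.

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius

def apparent_radius_bu(real_radius_m, real_distance_m, fake_distance_bu):
    """Radius (in BU) to draw a sphere placed fake_distance_bu from the
    camera so it subtends the same angle as the real body at real_distance_m."""
    return real_radius_m * fake_distance_bu / real_distance_m

# From low Earth orbit (~6,700 km from Earth's centre), sphere kept 100 BU away:
near = apparent_radius_bu(EARTH_RADIUS_M, 6_700_000.0, 100.0)    # ~95 BU
# From lunar distance (~384,000 km), the same sphere shrinks to under 2 BU:
far = apparent_radius_bu(EARTH_RADIUS_M, 384_000_000.0, 100.0)
print(near, far)
```

The same function, fed the moon’s radius and the spacecraft’s distance to the moon, would handle the growing moon sphere.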

If anyone foresees any problems with that approach, let me know. And if you’ve ever done anything similar and already have some Python scripts developed, feel free to share. Thanks.

I believe the way torakunsama dealt with it was to have two scenes with cameras linked in rotation, but with different velocities. This means the objects can be similar in size but will appear to be vastly different.

I’m not sure what you mean by “cameras linked in rotation.” Did torakunsama post an example anywhere? Thanks.

Sigh… Like I said, it’s asking a lot of any piece of software that stores 3D coordinates, let alone a game engine.

This limitation in data storage and handling is in fact met when working “only” on Earth as well. In architecture it would be nice if you could store building coordinates in an absolute space with millimeter accuracy, but the scale is too wide. When you run any operation that needs to measure which point is further from another, the difference between 500,000,000 millimeters and 500,000,001 millimeters is not accurately measurable, and this causes problems in 2D already. 3D rendering is even more prone to this effect, causing faces to draw distorted and in the wrong order.
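That 500,000,000 vs 500,000,001 example can be reproduced directly: at that magnitude, 32-bit floats (the precision typically used for vertex coordinates in 3D pipelines) step in increments of 32, so the 1 mm difference simply vanishes. A minimal demonstration:

```python
import struct

def to_float32(x):
    # Round-trip a Python float through 32-bit storage, the precision
    # typically used for vertex coordinates in 3D pipelines.
    return struct.unpack('f', struct.pack('f', x))[0]

a = to_float32(500_000_000.0)
b = to_float32(500_000_001.0)
print(a == b)  # True: the 1 mm difference is lost at this magnitude
```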

Thinking one could simply model something like the solar system with real-world dimensions, without some hacks, is a flaw in one’s technical design, but not an uncommon one.

He meant you show several scenes at the same time, each one with its own scale. This way you can show low-scale objects (near distance) and large-scale objects (far distance). It is a matter of perspective how the objects appear. You simply do not need a moon at full dimensions; a tiny one will do. With the right camera placement it looks like a full-scale moon.

Here is an example:

While it does not really match your requirements (you do not need the transition), it demonstrates the principle. Imagine the green scene is the low-scale scene with the moon’s surface and the Apollo spaceship.
The red scene can be the high-scale scene with the full moon.

The rotation of the red scene is a result of the rotation of the camera that shows the red scene.

I will have to try this. But sdfgeoff said above that there were cameras “linked in rotation.” This example doesn’t seem to show linked cameras, just the red-scene camera rotating. Would I also need cameras “linked in rotation”? If so, is that a parent-child relationship? (Is parent-child even possible between scenes? I’ve never tried that . . .)

You need to establish this “link” via Python. This means you run a Python controller that constantly updates the transformations of the scene cameras.

Be aware that, as one scene has a different scale, the copied transformation needs to be converted into that scene’s space.

E.g.
you have two scenes:
the low scale scene shows the Apollo module and has a scale of 1 BU = 1 m
the high scale scene shows the moon and has a scale of 1 BU = 100 km

When you move the camera in the low scale scene by 0.1 BU (= 0.1 m) along X,
the camera of the high scale scene moves 0.000001 BU (= 0.1 m) along X.

The orientation can be transferred 1:1, assuming both cameras point in the same relative direction. (In the above video, the camera of the red scene changes its orientation while the green one does not.)

The scaling can remain untouched as long as you ensure nothing that exists in both scenes gets scaled (a scale on the camera has no effect).
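The conversion described above could look roughly like this. In the BGE it would run inside a Python controller every frame; here the scale math is kept as plain Python, and the scene names, scale constants, and the BGE calls in the comments are just assumptions for illustration (verify the API names in your Blender version):

```python
# Sketch of syncing the high-scale camera to the low-scale camera.
LOW_M_PER_BU = 1.0         # low scale scene: 1 BU = 1 m
HIGH_M_PER_BU = 100_000.0  # high scale scene: 1 BU = 100 km

def low_to_high(low_pos_bu):
    """Convert a camera position from low-scale BU to high-scale BU."""
    factor = LOW_M_PER_BU / HIGH_M_PER_BU
    return tuple(c * factor for c in low_pos_bu)

# Inside a BGE Python controller this could drive the cameras, roughly:
#   import bge
#   scenes = {s.name: s for s in bge.logic.getSceneList()}
#   low_cam = scenes['low_scale'].active_camera
#   high_cam = scenes['high_scale'].active_camera
#   high_cam.worldPosition = low_to_high(low_cam.worldPosition)
#   high_cam.worldOrientation = low_cam.worldOrientation  # 1:1 transfer

# Moving 0.1 BU (= 0.1 m) along X in the low-scale scene:
print(low_to_high((0.1, 0.0, 0.0)))  # ~0.000001 BU along X
```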

The synchronized scenes make big scales possible, but even then there are limits.
I made an example for you.
The local scene’s limit is represented by the blue grid cube.
You can increase the scale; just avoid going above 100,000 or below 0.000001, as these are critical limits for the BGE.

You can change the scale to go faster or slower.
Aaaand, you can use it as you please!
Hope it helps!

Attachments

Space_synch.blend (1.14 MB)

Here is a quick proof of concept:


There are two scenes:

  • low_scale with the space ship
  • high_scale with the sandmoon

You can orbit the camera by holding the right mouse key.
You can move the camera with <a>/<d> (just along X).

The scale is taken from the cameras in the scenes. It is just 1/100 so that you can easily see the effect.

Attachments

Apollo_demo.blend (302 KB)

Thank you Monster and torakunsama. Extremely helpful.