A likely silly question.

When you create games like FPSes, how do you mentally prepare your maps?
Is it at a 1:1 ratio? Like, a mile/kilometer IRL is the same in the Blender map?
I am probably being too literal: should I create/visualize a map with the same metrics and info as a real map? Does it take more memory?
Is there a good tutorial when it comes to appropriate sizes?

This depends on many visual, gameplay and story-based aspects.

Visual: What looks good?
Gameplay: What plays well - long/short walk distances, walk/run speed, camera range, obstacles, bottlenecks …
Story-based: How “believable” is the game world? How does the game world fit into the story? How does the story fit into the game world?

It is in large part very subjective. It should “feel” good.

A house with a few rooms might feel good if the avatar is slowly walking through the rooms. An open area will not fit, as it takes too long to discover the areas. But it feels much better when using a vehicle ;). A vehicle, though, offers a different game experience (e.g. racing).

Just some thoughts

Just make sure to apply whatever ratio you pick to everything. You want to make sure everything is proportionally correct.

OK thank you all.
What is the most common size for an FPS level, and what is safe memory-wise?
Do you folks use metric/imperial?

Memory-wise, size is not relevant; you can make your map as big as you want. Well, until you reach numeric representation constraints, but things get worse well before that.
The problem lies in the resolution because the amount of triangles you can spend to render a scene is limited.

thank you pgi, what are the constraints and what is the amount of triangles I can use? And wouldn’t quads be better? I thought that ideally, for best results, you wanted to avoid triangles.

The numeric constraints derive from the way decimal numbers are usually represented in game engines (32-bit floating point), but this is not really interesting unless you’re dealing with extreme accuracy - like scientific simulations. Suffice to say that there is a limit to both how large a number can be and how fine the difference between two numbers can be (the biggest is about 34 followed by 37 zeros; precision is only about 7 significant decimal digits).
Don’t care about that.
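A quick way to see those limits, as a sketch in plain Python (assuming the engine stores coordinates as IEEE 754 single-precision floats, which is typical):

```python
import struct

def to_float32(x):
    # Round-trip a Python float through a 4-byte IEEE 754 float,
    # the representation game engines typically use for coordinates.
    return struct.unpack("f", struct.pack("f", x))[0]

# Precision: only about 7 significant decimal digits survive.
# At 16,777,216 (2**24) the spacing between adjacent floats reaches 1.0,
# so adding 1 no longer changes the stored value.
print(to_float32(16777217.0))  # 16777216.0

# The same effect on map coordinates: 100 km from the origin,
# a 1 mm offset is simply lost.
print(to_float32(100000.0 + 0.001))  # 100000.0
```

This is why very large maps develop jittery positions far from the origin long before any hard size cap is reached.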

It’s ok to avoid triangles when modeling; in engine terms you count triangles because real-time rendering engines “reason” in terms of triangles (they are the simplest kind of polygon and have nice geometric properties that the hardware is engineered to exploit).
Quads are transformed into triangles automatically by Blender, so you don’t have to do it by hand (there is a corner case when the four vertices of a quad do not lie on the same plane, but the worst you’ll get is a slightly different-looking shape).
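As a rough sketch of what that automatic conversion amounts to (a simple fan split for illustration; the engine’s actual triangulation may pick the diagonal differently):

```python
def triangulate_quads(quads):
    """Split each quad (a, b, c, d) into two triangles, the way an
    engine does before rendering. Vertex order is preserved so the
    winding (and thus the face normal) stays consistent."""
    tris = []
    for a, b, c, d in quads:
        tris.append((a, b, c))
        tris.append((a, c, d))
    return tris

# A cube modeled as 6 quads becomes 12 triangles on the GPU,
# which is the number that matters for the render budget.
cube_quads = [(0, 1, 2, 3), (4, 5, 6, 7), (0, 1, 5, 4),
              (2, 3, 7, 6), (0, 3, 7, 4), (1, 2, 6, 5)]
print(len(triangulate_quads(cube_quads)))  # 12
```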

How much is too much when speaking of triangles in a scene is a question that has no specific answer.
It depends on the hardware the game is deployed on, the complexity of the effects you want to apply to your models, the size of the output window, and what the game does other than showing stuff.

You can roughly estimate how complex a scene can be by taking your own system as a reference: model a scene with lots of polygons and the effects you would like to have (shadows, lights, shaders…), possibly applied to a single mesh, start the game engine, and watch the framerate while you move the camera around.
If the framerate doesn’t drop under 50fps you add more polygons.
When you have a stable framerate around 50 fps you can estimate the “level” of your hardware (high, mid, low end, that sort of thing).
At that point you’ll have some sort of way to measure how many triangles you can count on for a scene.
Then you decide how much you want to spend for the environment, for the player, for other elements …
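That last budgeting step could be sketched like this; every number here is made up for illustration (pretend the benchmark scene held ~50 fps at 300k triangles):

```python
def split_triangle_budget(total_tris, shares):
    """Divide a measured scene-wide triangle budget between asset
    categories. `shares` maps category name -> fraction of the total."""
    return {name: round(total_tris * frac) for name, frac in shares.items()}

# Hypothetical split of a hypothetical 300k-triangle budget.
budget = split_triangle_budget(300_000, {
    "environment": 0.60,  # level geometry, props
    "characters": 0.25,   # player + NPCs visible at once
    "effects": 0.15,      # particles, projectiles, pickups
})
print(budget)  # {'environment': 180000, 'characters': 75000, 'effects': 45000}
```

The fractions are a design choice per game; an open outdoor shooter would weight the environment more heavily than a corridor one.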

It’s a very boring thing to do :D.

ouch, what a headache!
Thanks pgi :slight_smile:

Short guideline:

1 BU = 1 meter

Human character height ~1.8 m
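In plain numbers, following the guideline above (1 BU = 1 meter; the helper names here are made up for illustration):

```python
# Guideline: 1 Blender unit = 1 meter.
METERS_PER_BU = 1.0
FEET_PER_METER = 3.28084

def bu_to_meters(bu):
    return bu * METERS_PER_BU

def bu_to_feet(bu):
    return bu_to_meters(bu) * FEET_PER_METER

# A 1.8 BU character is 1.8 m tall, or roughly 5 ft 11 in.
print(bu_to_meters(1.8))          # 1.8
print(round(bu_to_feet(1.8), 1))  # 5.9
```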

Thx Monster!

If you want large maps, because of the constraints, you may want to lower the ratio. You don’t want to go through the game and ‘hit a brick wall’ because it was too large to render. Either the game will crash or freeze; at least that is what I went through.

Don’t listen to Monster, because he has no clue what he’s talking about. If you want to get a feeling of real-world scale in the Blender Game Engine, you have to make everything 3 times bigger. 1 meter = 3 Blender units. A 2 meter person would be 6 Blender units in height, and the camera needs a 90 degree FOV with a sensor size of 45. Those are the only settings that work if you want a scene that feels like real-world scale. If you follow Monster’s advice, you will notice that everything feels way too small and way too close.

@blenderer2012 - ^ A moderator of the BGE forums that’s been using the engine for literal years has “no clue what he’s talking about”. No offense intended, but I’m pretty sure he knows what he’s talking about, and I think you might be confused. 1 meter doesn’t equal 3 blender units to the BGE.

EDIT: Haha, even I got my facts wrong initially. It’s as Monster said - 1 meter to 1 Blender Unit. 2 BU = 2 meters, or about 6.6 feet. :stuck_out_tongue:

So, a 2 BU tall human would be about 6.6 feet tall, which is on the tall side of average.

Here is a scale and distance test scene. When you press the spacebar key, it adds an object 2 meters in height at a distance of approx. 80 meters away. It should take you about 20 seconds to reach that object, because the camera is moving at 4 m/sec (12 Blender units/sec).


ScaleAndDistanceTest.blend (252 KB)

Ok, let’s try this, just so you guys can understand that even the programmers have no clue what they’ve coded there. By the way, the physics engine is off too, by a factor of 3 or 4. Let’s take a real-world example. I think we can all agree that if you place an object that is 2 meters in height at a distance of 100 meters away, it would be pretty much impossible to shoot. Yes? No? Maybe? You would have to use a sniper rifle and stand perfectly still, or be stabilized, in order to hit a target at 100 meters. That’s real-world scale for you. Ok, now try to recreate the exact same thing in the Blender Game Engine and show me the blend file. This exercise is for SolarLune.

I don’t understand - in your blend file you’re using the imperial unit system, and you can see that 1 BU = 1 meter. You can move an object, and Blender displays that it’s moving in BU to meter steps. It displays that 1 BU up = 1 meter up on the Z-axis. The object that gets added sure appears to be further than 80 meters away. It’s more than 248 meters away on the X-axis alone…? I don’t know; maybe I’m not seeing what you’re trying to say, but it sure seems that 1 BU = 1 meter.

EDIT: Eh? I don’t get your question…? What would the exercise show? Just to make an object 100 meters away from a camera? Or to make an FPS setup?

You can see in your own blend file that 1 BU = 1 meter. Or, make a new blend file - the default cube is 2x2x2 Blender Units, but with the metric system showing, it’s a 2x2x2 meter cube.

It starts right here…

So let’s recap what we know so far. Because the computer hardware is 20 times slower than what it needs to be, the Blender Game Engine had to make some compromises in the calculations in order to simplify things a bit, but SolarLune and Monster didn’t notice that some things are way off. LOL. A real bullet travels at 800 meters per second or more, which translates into about 2400 Blender units per second. Divide that by 60 frames per second and you get 40 BU per frame. The problem is that Blender checks for collisions only once per frame, so at that speed the bullet needs to have a length greater than 40 Blender units in order for a collision to register. Agreed? Yes? No? Maybe? LOL

You don’t seem to understand what I’m saying. Go outside, make some measurements, and come back, and you will realize that when Blender says an object is 100 meters away, that’s not actually true. The size and distance feel off by 3 to 4 times.

Different kinds of projectiles travel at different speeds. If you’re modeling a weapon in your game after a gun that shoots bullets that travel 800 meters per second, that’s fine, but that equates to 800 Blender Units per second, not 2400.
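The per-frame arithmetic, redone with 1 BU = 1 meter, still shows a real tunneling problem even at the correct scale:

```python
# Tunneling arithmetic, assuming 1 BU = 1 meter and one collision
# test per logic frame.
bullet_speed = 800.0  # m/s, i.e. 800 BU/s at 1 BU = 1 m
framerate = 60.0      # logic ticks per second

travel_per_frame = bullet_speed / framerate
print(travel_per_frame)  # ~13.3 BU per frame

# Any obstacle thinner than that along the flight path can be
# skipped entirely between two frames: classic tunneling.
wall_thickness = 0.3  # a 30 cm wall
print(wall_thickness < travel_per_frame)  # True
```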

High-speed projectiles in the BGE shouldn’t be modeled with actual bullet objects. Even if you made the bullet’s collision mesh long enough to cover the distance traveled each frame, the mesh is only as wide and tall as the bullet itself, so the physics engine may still miss some collisions.

You should model high-speed projectiles via raycasts. If you want advanced dynamics, like gravity dropping the bullet over distance, or wind affecting the angle, you can use successive raycasts to accurately model a bullet that travels very fast but still takes time to hit a target, taking outside forces into account. You can also push the physics substeps higher to have the BGE detect collisions at a higher frequency than the logic tick rate (game framerate) would allow.
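A minimal sketch of the successive-raycast idea, written as plain Python so it runs anywhere: the `raycast` function below is a stand-in for the engine call (in the BGE that would be `KX_GameObject.rayCast`), here testing only a flat ground plane at z = 0.

```python
def raycast(start, end):
    # Stand-in for an engine raycast: return the hit point if the
    # segment from start to end crosses the ground plane z = 0.
    (x0, y0, z0), (x1, y1, z1) = start, end
    if z0 > 0 >= z1:
        t = z0 / (z0 - z1)
        return (x0 + t * (x1 - x0), y0 + t * (y1 - y0), 0.0)
    return None

def fly_bullet(pos, vel, dt=1.0 / 60.0, gravity=-9.81, max_frames=600):
    """Advance the bullet one frame at a time; each frame, cast a ray
    along the segment it would travel, so fast movement cannot tunnel
    through thin obstacles between two frames."""
    for _ in range(max_frames):
        vel = (vel[0], vel[1], vel[2] + gravity * dt)      # gravity drop
        nxt = tuple(p + v * dt for p, v in zip(pos, vel))  # next position
        hit = raycast(pos, nxt)
        if hit is not None:
            return hit
        pos = nxt
    return None

# 800 m/s muzzle velocity, fired level from 1.5 m up.
hit = fly_bullet(pos=(0.0, 0.0, 1.5), vel=(800.0, 0.0, 0.0))
print(hit)  # lands a few hundred meters downrange
```

In the engine you would run one step of this per logic tick instead of the whole loop at once, but the per-frame segment raycast is the key idea.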

So you’re saying that things don’t feel far away enough with Blender’s perspective? I already know that you’ve mentioned altering the camera lens size (field of view), but that sounds like the solution to this particular issue. It’s worth noting that the default camera field of view is 49.134 degrees, not 90. Setting it that high indeed makes far things feel further away and appear smaller with distance.

EDIT: I’m finding that changing the camera’s sensor size has no bearing on the perspective taken; it’s all about the field of view.

EDIT 2: As a last point, if this is important to you, maybe you should investigate the true cause and solution to the problem. Post a thread to find out the reason. If you decide to scale things up to make them appear further apart, that’s your decision. However, that doesn’t change what a Blender Unit equates to.