How to optimize a Minecraft-like Voxel system?


I’m messing around with a terrain generator like Minecraft, using cubes as voxels. So far, I can get blocks on the screen using OpenSimplex in a tentatively acceptable manner. However, predictably, I’m running into a brick wall with performance, as I try to generate more and more blocks. Generation time runs away, while frame rates plummet.

So far, nothing comes to mind that could effectively deal with either problem, but I do not know everything.

Future problems include: 1. How to make actual mountains out of this… Right now, it would only work well for subsurface terrain. The best I can think of is to generate 2D height data out of the noise, and then stack blocks up to that height. 2. Water. These are problems I can probably figure out, but they're not worth my time until I know whether there's any way to get enough blocks on screen to be worthwhile.
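The heightmap idea can be sketched as pure data (the `height_at` function here is a stand-in for real 2D noise, which would sample something like `noise2d(x * scale, y * scale)`):

```python
def height_at(x, y):
    # Stand-in for 2D noise; replace with a real noise2d(x * scale, y * scale) call.
    return 4 + (x + y) % 3

blocks = {}
for x in range(8):
    for y in range(8):
        # Stack solid blocks from z = 0 up to the sampled surface height.
        for z in range(height_at(x, y)):
            blocks[(x, y, z)] = 1
```

Water could then be a second pass that fills remaining air cells below a fixed sea level.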

Any ideas/hints/tips/tricks/insults are welcome and appreciated!

Here is the generator so far:

from bge import logic, render, events, types
from opensimplex import OpenSimplex

seed = 123
chunkSize = 15
noiseScale = 0.09
threshold = 0
blockSpacing = 1

noise = OpenSimplex(seed).noise3d

s = logic.getCurrentScene()

cs = int(chunkSize / 2)
# Sample 3D noise at every grid point; spawn a cube wherever it exceeds the threshold.
for x in range(-cs, cs, blockSpacing):
    for y in range(-cs, cs, blockSpacing):
        for z in range(-cs, cs, blockSpacing):
            n = noise(x * noiseScale, y * noiseScale, z * noiseScale)
            if n > threshold:
                o = s.addObject("Cube")
                o.worldPosition = [x, y, z]

Voxels are challenging due to the sheer quantity of them you have to deal with. A single 16x16x256 chunk in Minecraft contains 65,536 blocks (air is also a block in Minecraft). You can’t afford to place every single voxel as a game object; there are just too many.

Store all of your voxel data separate from your scene, and then find ways to determine which voxels are visible and only then have them represented by game objects. I think the way Minecraft does it is by only including blocks that are adjacent to a transparent block such as air or water.
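That culling rule can be sketched with a plain dict as the voxel store (the data layout here is just an illustration; anything keyed by coordinates works):

```python
# Hypothetical voxel store: (x, y, z) -> block ID; missing or 0 means air.
# A solid 3x3x3 cube: only the center block (1, 1, 1) is fully enclosed.
voxels = {(x, y, z): 1 for x in range(3) for y in range(3) for z in range(3)}

NEIGHBOR_OFFSETS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                    (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def is_visible(pos):
    """A block is visible if any of its 6 face neighbors is air/transparent."""
    x, y, z = pos
    return any(voxels.get((x + dx, y + dy, z + dz), 0) == 0
               for dx, dy, dz in NEIGHBOR_OFFSETS)

visible = [p for p in voxels if is_visible(p)]
print(len(visible))  # 26 of the 27 blocks; only the center is hidden
```

Only the blocks in `visible` would ever need a game object.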


Thanks… Good idea. I will see if I can get something like that working. However, it doesn’t help with generation time (if anything, it makes it worse). But obviously I’d be generating chunks of blocks at a time, so I just need some way to process that in the background or thread it (I think I did something like that in a previous project).

I also just found this: Dynamic terrain loading speed which is more food for thought.

I recommend storing your voxel data in Python data types, not Blender types. Storing a 3D array of block ID numbers would be vastly faster than adding game objects to a scene. If you used numpy arrays and numpy’s vectorized operations instead of looping, your generation would be 100-1000x faster.
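A sketch of the vectorized approach (the classic opensimplex calls are scalar, so a simple trigonometric field stands in for the noise here; the point is that the whole chunk's block IDs are computed in one shot, with no Python loop):

```python
import numpy as np

chunk_size = 16
noise_scale = 0.1
coords = np.arange(chunk_size) * noise_scale

# Build the full 16x16x16 grid of sample coordinates at once.
x, y, z = np.meshgrid(coords, coords, coords, indexing="ij")

# Stand-in for real 3D noise: any vectorized density field works for the demo.
density = np.sin(x * 3.1) * np.cos(y * 2.7) + np.sin(z * 1.9)

# Block IDs for the whole chunk in one vectorized threshold: 1 = solid, 0 = air.
blocks = (density > 0.0).astype(np.uint8)

print(blocks.shape)  # (16, 16, 16)
```

Spawning game objects (or building mesh faces) then only reads from `blocks`; the expensive generation never touches the scene.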


Also could look at offloading the pure generation to a subprocess, but doing that in the BGE with no experience is tough.

I think in Minecraft they even go as far as only instantiating faces, not even the whole cubes.

In UPBGE (BGE fork), you can use KX_BatchGroup to make things a bit less traumatizing for the engine.

I wonder how far you could go with this :slight_smile:


I’ve heard about numpy arrays but have not run into a situation where I needed them. This may be it! I’m working on separating the data generation from drawing blocks now.

Yea, I have some ideas for how I might be able to thread this… For one, Python has a threading module, but I’m not sure how well it will play with the BGE. An easier approach, however, is to use the Module feature of the Python controller logic brick… IIRC, you can put data in global variables that persists between calls to the function you specify with the brick. Doing this, I can limit how many blocks it generates and draws per call to said function, giving time back to the rest of the engine in between. Hopefully that makes sense lol

Thanks for the replies…very helpful info so far!

It’s hard to say how far I will get with this - BGE does NOT seem well equipped for procedural geometry of any kind.

My actual goal is a very basic destructible, procedural terrain engine for experimenting with some non-Minecraft-like games. For example, a city builder with procedurally generated voxel buildings that are always unique.

This is just the first step to see if BGE can remotely handle it. If/when I even solve the performance issues, the next major terrifying hurdle will be wrangling the terrain into nice-looking biomes.

It kinda works, but each thread also slows down the engine. The Python interpreter makes sure that no two low-level Python operations run at the same time (the Global Interpreter Lock). This means that blocking calls won’t block other threads, but computations will not run in parallel - just interleaved, kind of.

Yes it does: You would process for a given amount of time, then store your state, and keep processing on the next frame. Makes total sense, although it can be annoying to implement, but certainly the easiest solution.
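A time-budget variant of the same idea (the 4 ms budget is an assumed value; in the BGE each `run_for_budget` call would happen once per frame):

```python
import time

TIME_BUDGET = 0.004  # ~4 ms of work allowed per frame (assumed value)

def run_for_budget(job, budget=TIME_BUDGET):
    """Consume work items until the time budget is spent.

    Returns True if work remains (resume next frame), False when done."""
    deadline = time.perf_counter() + budget
    for item in job:
        # ... per-item work (sample noise, place a block, etc.) goes here ...
        if time.perf_counter() >= deadline:
            return True   # out of time; pick up where we left off next frame
    return False          # the job generator is exhausted

job = iter(range(100_000))
frames = 0
while run_for_budget(job):
    frames += 1  # in the BGE, each iteration here would be one frame
```

Budgeting by time rather than by block count keeps the frame rate steady even when per-block cost varies.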

Yes, I would advise first designing your voxel world system, getting it to work, and then thinking about scaling it up :slight_smile:


I did a few Minecraft animations in the past and would suggest a few things if you are going to make an animation.

  • use remesh modifier to generate the terrain
  • take a screenshot of Minecraft and use it as a far background image (with a bit of DOF, the difference is unnoticeable)

I would do something like a planet, and decide how to operate on a small patch when you click.


Jeacom - This isn’t a rendering or animation… I’m trying to make a real-time, Minecraft-like terrain generator using the Blender Game Engine. Good tip, though! It actually reminds me: maybe I could design an impostor system for distant blocks to help with framerates (since mesh-level operations/generation seems impossible at this time in the game engine).

That is an amazing demo… I will definitely take a closer look. Thanks!

I would not go saying that.

you need to break your world up into small chunks to edit, but not too small that the scenegraph explodes.

BluePrintRandom -
I chose the word “seems” carefully :wink: Very cool demo… I remember from previous experiments that there were some options available for manipulating meshes, but I found nothing like the other engines I’ve used, such as Panda3D and Unity, which both have strong APIs for creating custom geometry on the fly.

In this case, being able to draw only the block faces that are visible would help a lot, but to do that, it seems I’d either need to delete faces from my “master” cube(s) or, better yet, draw each map chunk from scratch as one giant mesh of just the visible faces. It’d be a lot of confusing code for my level of skill, though - placing cubes on an imaginary grid is 100x easier lol
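The "mesh of just the visible faces" step can be sketched as pure data first, with no BGE calls (the engine-specific part is turning the resulting face list into geometry):

```python
def visible_faces(blocks):
    """Return (position, face_normal) pairs for every solid/air boundary.

    `blocks` is a set of (x, y, z) coordinates of solid blocks."""
    normals = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
               (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    faces = []
    for (x, y, z) in blocks:
        for nx, ny, nz in normals:
            # A face is visible only if the cell it points into is air.
            if (x + nx, y + ny, z + nz) not in blocks:
                faces.append(((x, y, z), (nx, ny, nz)))
    return faces

# A lone cube exposes all 6 faces; two touching cubes hide their shared pair.
print(len(visible_faces({(0, 0, 0)})))             # 6
print(len(visible_faces({(0, 0, 0), (1, 0, 0)})))  # 10
```

Each entry in the face list maps to one quad (4 vertices), so a chunk's mesh size scales with its surface area instead of its volume.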

Anyway, thanks! I have other terrain generation experiments this video might help with (at least illuminates that there is a path forward, even if he doesn’t share his code).

Edit: Oh, I see - that video is using UPBGE, which I have yet to delve into.
Edit again: I just learned BF is removing BGE after 2.79, so I’m now running UPBGE. Pro-tip for Gentoo: install dev-cpp/tbb if you get an error about missing - took me an annoying amount of time to figure that out.

I was just doing some research into OpenVDB and had a really weird thought about this conversation:
Have you looked at using META Cubes to create your volume? They generate a field that forms a continuous skin.

Perhaps it’s just a crazy thought, but it might be worth trying?

Doesn’t work in the BGE, but the idea of just rendering the “skin” is actually the way to go!

I would skip thinking about minecraft, instead think of mesh editing operations that produce geometry like minecraft.

basically you can take a ‘surface’ made of meshes - skinning data that exists in the scene already (think a single layer of cubes with some pre-existing caves here and there).

when one edits the ground, you use perlin noise to deal with what is uncovered.

I have some demos (with source code) on how to use mesh manipulation to create culled voxel chunks.

In short, the performance bottleneck is in bge.
No need to actually use numpy or threading.
You should never use a cube instance for voxels.
Creating a culled mesh surface is the standard approach.

You can’t create polygons during gameplay, but you can have donor meshes whose polygons can be rearranged into the desired mesh and texture coordinates.

Take BPR advice with a grain of salt…

have a blast! :slight_smile:



That looks really nice. I will definitely take a look and maybe use it! No point in reinventing the wheel if I don’t have to!


In UPBGE you can create a batch to turn many objects into one, to increase performance.
I’ve answered a similar question about a Minecraft-like game here:

and posted code on how to set up a voxel system.
A method for updating it is to delete the batch, delete all the objects that form it, and then create them again. You need to mark (or keep a parallel matrix of) the objects that will be visible to the player each frame, and then add only those objects during the geometry-creation step, not the rest. That will make it even faster.
The rest is just algorithms for looking at every block in the matrix and determining whether it should be visible, whether there should be light, etc. - that is raytracing and such.
Changing a single block is easy, because you know which block it is and its coordinates, and can change it in the voxel data when adding/removing. Things like explosions, fire, water, lava or sand are what become more problematic.

jesusmora -

Just for grins, I applied batching to my original cube/gameobject approach. Without taking measurements, it doesn’t seem to help much, so I’m either doing it wrong, or will still have to deal with individual faces. I just did a subjective test of a 32x32x32 chunk with and without batching.

This knowledge will still help in the long run, though, so thanks!

Here is as far as I’ve gotten fwiw:

from bge import logic, types
from opensimplex import OpenSimplex
#from mathutils import Vector

seed = 123
chunkSize = 16
noiseScale = 0.1
threshold = -0.1
blockSpacing = 1

noise = OpenSimplex(seed).noise3d

s = logic.getCurrentScene()

class Chunk(list):
    def __init__(self, x=0, y=0, z=0):
        self.x, self.y, self.z = x, y, z

        # 3D list of block values; 0 means air.
        d = [
            [[0 for dz in range(chunkSize)] for dy in range(chunkSize)]
            for dx in range(chunkSize)
        ]

        for dx in range(chunkSize):
            for dy in range(chunkSize):
                for dz in range(chunkSize):
                    n = noise(dx * noiseScale, dy * noiseScale, dz * noiseScale)
                    if n > threshold:
                        d[dx][dy][dz] = n

        super(Chunk, self).__init__(d)

    def neighbors(self, x, y, z):
        # Return the 6 face-adjacent blocks (0 for anything outside the chunk)
        return [
            self[x+1][y][z] if x+1 < chunkSize else 0,
            self[x][y+1][z] if y+1 < chunkSize else 0,
            self[x][y][z+1] if z+1 < chunkSize else 0,
            self[x-1][y][z] if x-1 >= 0 else 0,
            self[x][y-1][z] if y-1 >= 0 else 0,
            self[x][y][z-1] if z-1 >= 0 else 0,
        ]

    def render(self):
        objects = []
        for pos, v in self:
            # Only spawn a cube if at least one neighbor is air (i.e. it's visible)
            if not all(self.neighbors(*pos)):
                o = s.addObject("Cube.002")
                o.worldPosition = list(pos)
                objects.append(o)

        return objects

    def __iter__(self):
        # Yield ((x, y, z), value) for every non-air block.
        for x in range(chunkSize):
            for y in range(chunkSize):
                for z in range(chunkSize):
                    v = self[x][y][z]

                    if v:
                        yield ((x, y, z), v)

chunk = Chunk()

b = types.KX_BatchGroup(chunk.render())

Predictably, the Chunk.render method makes the most difference in frame rates, as it filters out definitely invisible blocks from being rendered.

This code generates a single 16x16x16 chunk pretty fast (less than a second I think on an AMD APU), despite not being the most efficient code I’ve ever written, and actually basically ignoring everything I know about Python and some of the advice given here. Nevertheless, I should be able to adapt and improve on some of it for generating individual visible faces instead.

Why oh why can’t we just generate meshes from scratch on the fly :’(

what about having a model for a 4x4x4 voxel block


one model

0000 0000 0000 0000
1000 0000 0000 0000
0000 0000 0000 0000

maybe some way to generate them automatically?

and have it all have uv mapped faces you can change

so 1 object represents 4 * 4 * 4 blocks?

so the meshes are pre-generated?

edit :

o poo

even with just solid/air that’s 2^27 = 134,217,728 patterns for 3x3x3, and 2^64 ≈ 1.8 × 10^19 for 4x4x4 - far too many to pre-generate
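Counting only binary solid-vs-air occupancy, the pattern counts blow up fast:

```python
# Each cell is either solid or air, so pattern count = 2 ** (n ** 3).
print(2 ** (3 ** 3))  # 134217728 patterns for a 3x3x3 block
print(2 ** (4 ** 3))  # 18446744073709551616 patterns for 4x4x4
```

So pre-generating every possible sub-block mesh is out; the meshes would have to be built on demand from the occupancy data instead.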