Preventing Memory Crashes During Long Loops

Hey all,

So I have written a Python script that generates a big plane with about 36,000 vertices and sets each vertex’s z position and vertex paint color based on the Mandelbrot set. It then renders the scene, deletes the plane, and increments the fractal’s zoom variable. Viewed in order, the resulting images ‘zoom’ in on the fractal.
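For reference, the per-vertex computation is just the standard escape-time iteration; here is a minimal pure-Python sketch of the idea (function and variable names are illustrative, not my actual script):

```python
# Escape-time iteration for one sample point of the Mandelbrot set.
# Returns the iteration count at which |z| exceeds 2, or max_iter if
# the point never escapes; the count drives the vertex z height/color.
def mandelbrot_height(re, im, max_iter=64):
    c = complex(re, im)
    z = 0j
    for n in range(max_iter):
        if abs(z) > 2.0:
            return n
        z = z * z + c
    return max_iter

# Sample a size x size grid around a center point, zoomed in by 'zoom';
# the sampled span shrinks as zoom grows, which is what makes successive
# frames appear to zoom in.
def sample_grid(size, zoom, center=(-0.5, 0.0), max_iter=64):
    span = 3.0 / zoom
    heights = []
    for j in range(size):
        for i in range(size):
            re = center[0] + (i / (size - 1) - 0.5) * span
            im = center[1] + (j / (size - 1) - 0.5) * span
            heights.append(mandelbrot_height(re, im, max_iter))
    return heights
```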

The problem is that this loop can only run 20 times before I get a fatal “Malloc returns null” error. I have done my research, and I know that Blender has run out of memory: evidently, previously used planes were not actually deleted, which would free up memory, but merely ‘unlinked’ from the scene and kept around for other uses, like the undo feature.

One remedy is to save the blend file and reopen it, but if I am going to run the loop 1000 times that is not very practical.
Is there a way to clear all data, free up memory and continue the loop? Any other remedies?

Thanks

Before you unlink your plane from the scene, fetch the mesh datablock into a variable.


me = ob_plane.data
scene.objects.unlink(ob_plane)  # Unlink the object; its mesh datablock becomes unused, thus deletable from memory.
bpy.data.meshes.remove(me)      # Remove the mesh datablock from memory.

After you unlink the object from the scene, remove the mesh datablock from memory. Put the remove code inside your loop.

http://www.blender.org/documentation/blender_python_api_2_65_release/info_best_practice.html?highlight=remove
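Putting that together, a rough sketch of the whole per-frame loop (a sketch only, assuming the 2.6x-era API; `make_fractal_plane` and `render_frame` are hypothetical stand-ins for your own functions, and you may also need to remove the object datablock itself, not just the mesh):

```python
import bpy

scene = bpy.context.scene

for frame in range(1000):
    ob_plane = make_fractal_plane(frame)  # hypothetical: builds the plane for this zoom level
    render_frame(scene, frame)            # hypothetical: renders and saves the image

    # Free everything this iteration allocated, so memory use stays flat:
    me = ob_plane.data
    scene.objects.unlink(ob_plane)        # detach the object from the scene
    bpy.data.objects.remove(ob_plane)     # drop the now-unused object datablock
    bpy.data.meshes.remove(me)            # drop the now-unused mesh datablock
```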

wrong thread…

Hi beckettsimmons,

As an alternative solution you could build your mesh as a BMesh and then write that into your plane mesh.

See the code below for an example. One of the advantages is that you can clear and free the memory used by the BMesh as you like.

import bpy
import bmesh
from time import time


t = time()


# Get the active object, its mesh and create a new BMesh
ob = bpy.context.active_object
me = ob.data
bm = bmesh.new()


# Rebuild the mesh 1000 times
for i in range(1000):


    # Add the vertices of the mesh (there are 10000 of them)
    for j in range(10000):
        bm.verts.new((j/10000,)*3)
    
    # Write the bmesh into the active object's mesh
    bm.to_mesh(me)
    
    # Clear out the BMesh ready for the next rebuild (note that this doesn't clear the active object's mesh)
    bm.clear()


# Free the BMesh from memory
bm.free()


print(time() - t)

Anyway, it’s an alternative. Hope that helps and let me know if you need any other information.

Cheers,
Truman

@Atom I am not sure if you are saying that I posted in the wrong forum or if you posted on the wrong thread… But I tried what you suggested, and I only got 4 more iterations than last time before there was a “Calloc returns null” error.

@TrumanBlending I have an older version of Blender that does not support BMeshes, and on the latest version my script isn’t compatible. So while I am working on my script, may I ask: what does using a BMesh essentially do, if I have to generate the regular plane anyways?

Also, I tried not generating and deleting the plane every iteration, and instead just generating one and reusing it. But this gives the same result. Looking deeper, it seems that I am generating a new material every iteration, although I doubt that only 20 materials should use up all my memory…

I’ll continue looking into this. Thanks for the helpful responses!