Animating 500k+ spheres from data file

Hi,

My work deals with large scale granular simulations. These simulations are several thousand frames with 50+ GB of data (stored in text files, one per frame).

I have been using POV-Ray to render my simulation data, but I wanted to try Blender because it has a better renderer and better material support.

Some questions:

I want to create a python script that loads the sphere data, creates 500k spheres with materials and renders to an image. The data would be loaded every frame.

Is this possible?

Would I be able to do this from the command line (the -P option)?

I want to use the command line because I don’t need to see the scene; this would also save me memory.

Thanks for the help.

Yes, it’s possible. Make an object, then use your data to define the location of each vertex. I assume your data is point-order dependent, in that the first point in frame 1 moves to its next location in frame 2. That is what a shape key does: you would use frame 1 as the basis shape, and then store each subsequent frame as another shape key for those vertices.

For a material, use a halo.

Thanks for the info!

I need to create realistic renders (sand, gravel, etc.), so the halo material wouldn’t work.

What I need to do is be able to create 500k spheres, or particles.

Generating 500k spheres will take A LOT of memory. I can run 64-bit, but object generation is very slow, so a particle system would be better.

Can a particle system’s particles be manually positioned? I did not find any obvious way to do this from the documentation.

Thanks

A particle system won’t save you any memory; it will just waste your time.

When you write the python code to generate your spheres make sure you set their display type to bounding box, that will save you some screen refresh time. When they render, they will come out as spheres, but in the viewport, they will only be cubes.

I am attaching a small script I wrote a while ago to create an array of cubes. Perhaps this can get you going…?

Press ALT-P while the mouse cursor is over the text window. It will generate a 12x12 array of textured cubes.
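For reference, the coordinate math behind such an array script boils down to two nested loops. A minimal sketch of just the grid placement, runnable outside Blender (the spacing value is my own placeholder, not taken from the attached file):

```python
spacing = 2.0  # assumed gap between cube centres

# Build the 12x12 grid of cube locations, row by row.
positions = [(col * spacing, row * spacing, 0.0)
             for row in range(12)
             for col in range(12)]

print(len(positions))  # 144 cube locations
```

Inside Blender the same loop would create one object per position instead of collecting tuples.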

Attachments

array_of_cubes.blend (158 KB)

Viewing the particles shouldn’t be an issue; I run in background mode.

I noticed that when blender creates each object, it doesn’t use any additional memory. Only when it renders the spheres does the memory usage skyrocket.
Is there a renderer better suited to this task?

Here is my current script:

import Blender
from Blender import Camera, Lamp, Mesh, Object, Scene
from Blender.Scene import Render

s_radius = .015

start_frame = 20
end_frame = 50
frame_skip = 1

objects = 500

path = "C:\\Blendertest\\"
r_path = "C:\\Blendertest\\render\\"

slist = list()

def makeSpheres(number, passedScene):
    # All sphere objects share one low-poly mesh datablock.
    baseMesh = Mesh.Primitives.UVsphere(3, 3, s_radius * 2.0)
    for i in range(number):
        tob = Object.New("Mesh")
        print i
        tob.link(baseMesh)
        passedScene.link(tob)
        slist.append(tob)

scene = Scene.GetCurrent()

c = Camera.New('persp')        # create new perspective camera data
cam = scene.objects.new(c)     # add a new camera object from the data
scene.objects.camera = cam     # make this camera the active one
cam.setLocation(0, 0, 15)

l = Lamp.New('Sun')            # create new lamp data
l.setMode('Shadows')           # set lamp mode flags
light = scene.objects.new(l)
light.setLocation(0, 100, 0)

makeSpheres(objects, scene)    # add spheres to the scene

for i in range(start_frame, end_frame, frame_skip):
    filename = 'pos%04d.dat' % i
    f = open(path + filename)
    mdata = f.readlines()
    f.close()

    # Each line holds one sphere's comma-separated x, y, z position.
    for line in range(objects):
        xyz = mdata[line].split(',')
        slist[line].LocX = float(xyz[0])
        slist[line].LocY = float(xyz[1])
        slist[line].LocZ = float(xyz[2])

    context = scene.getRenderingContext()
    context.extensions = True
    context.threads = 8
    context.renderPath = r_path
    context.sizePreset(Render.FULL)
    context.imageType = Render.PNG
    context.sFrame = i
    context.eFrame = i
    context.renderAnim()
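The per-frame file parsing inside the render loop can be pulled out into a helper, which also makes it easy to test outside Blender. A sketch, assuming the one-comma-separated-x,y,z-line-per-sphere format used in the script (the function name is mine):

```python
def read_frame(filepath):
    """Read one posNNNN.dat file: one 'x,y,z' line per sphere."""
    coords = []
    f = open(filepath)
    for line in f:
        line = line.strip()
        if not line:
            continue  # tolerate blank lines at end of file
        x, y, z = [float(v) for v in line.split(',')]
        coords.append((x, y, z))
    f.close()
    return coords
```

The loop body would then become `for ob, (x, y, z) in zip(slist, read_frame(path + filename)): ...`.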

Object generation is very slow: 50k spheres take about 2-3 minutes, so 500k would take far too long.

I created an empty test.blend file and run this script with:
blender.exe -b test.blend -P render.py

For 20k spheres I use 1.5 GB of RAM, and usage increases linearly, so for 500k I would need more than 30 GB. I can run 64-bit, but my machine only has 12 GB of RAM.
I have tried both the INTERNAL and YAFRAY renderers; there is no change in memory usage.
partsx and partsy don’t make any difference.
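That extrapolation is easy to sanity-check. A quick back-of-the-envelope, assuming perfectly linear scaling from the observed 20k-spheres/1.5 GB data point:

```python
gb_per_sphere = 1.5 / 20000          # observed: 20k spheres use 1.5 GB
projected_gb = gb_per_sphere * 500000
print(projected_gb)                  # ~37.5 GB, well over the 12 GB available
```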

Thanks for the help