Pretty n-body simulation

Hi all,
First of all, I have never used Blender before.

I have created an n-body simulation in VPython which simulates the motion of multiple stars moving under the influence of gravity. It produces acceptable results in terms of the physics, but the output is extremely low quality (it just consists of black spheres on a white background). I was hoping to produce something like the screenshots below, where the red/orange coloured stars are static and only the white coloured stars are moving.



Is it possible to use Blender to produce something like this?
If it is possible, how do I feed the n-body code into Blender? Do I need to provide a CSV with the positions of the stars at each timestep so that Blender can render/animate the particles? I was thinking a CSV may become unmanageable if there are thousands of particles.

My main focus would be to have the smoke/lighting effects.

It would be nice to be able to interact with the simulation by zooming in and rotating the view, which VPython does for me automatically, but this is not the top priority.

Thanks in advance!

A few months ago I created an n-body star cluster simulation entirely within Blender. This video shows about 500 stars in two small clusters contracting to their local centers and then colliding. I doubt that I did it the “right” way, and the code is a sorry mess, but it does work.
From a high level this is what I did:

  1. When the script is run, it creates a list of x, y, z star locations and radii.
  2. It creates a single object called "galaxy".
  3. It iterates through the list, creating vertices, edges and faces in the shape of a cube for each star and assigning a material, using the bmesh module.
  4. It renders the frame.
  5. Blender moves to the next frame, which calls a handler function triggered by the frame change.
  6. That handler looks at the locations (and radii, assuming uniform density) and calculates force vectors for each star against every other star. Once completed (N*N), the force vectors are converted to velocities and new locations.
  7. The old galaxy object is deleted, a new one is created, and new verts/edges/faces are built using bmesh.
  8. A new frame is rendered; GO TO STEP 5.

In your case you seem to have all the location data calculated beforehand, so you wouldn't need all the force calculations. If you need some code snippets, show a bit of your CSV data, say 5 lines.
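To make that a bit more concrete, here is a rough sketch of the Blender side of steps 5-8, assuming the positions have already been computed and loaded into a `star_data` dictionary keyed by frame number. The names, cube sizes and the loading step are placeholders, not my actual code:

```python
import bpy
import bmesh
import mathutils

# star_data[frame] = [(x, y, z, radius), ...]
# Fill this from your CSV / pickle before rendering; left empty here.
star_data = {}

def rebuild_galaxy(scene, *args):
    """Delete the old 'galaxy' object and rebuild it for the current frame."""
    frame = scene.frame_current
    if frame not in star_data:
        return

    # throw away the previous frame's object, if there is one
    old = bpy.data.objects.get("galaxy")
    if old is not None:
        bpy.data.objects.remove(old, do_unlink=True)

    # one small cube per star, placed at the star's location
    bm = bmesh.new()
    for x, y, z, r in star_data[frame]:
        bmesh.ops.create_cube(
            bm, size=2 * r,
            matrix=mathutils.Matrix.Translation((x, y, z)))

    mesh = bpy.data.meshes.new("galaxy")
    bm.to_mesh(mesh)
    bm.free()

    obj = bpy.data.objects.new("galaxy", mesh)
    # obj.data.materials.append(...) would go here
    bpy.context.collection.objects.link(obj)  # bpy.context.scene.objects.link(obj) on 2.7x

# run the script once; Blender then calls the handler before every frame change
bpy.app.handlers.frame_change_pre.append(rebuild_galaxy)
```

With that in place you just set the frame range and render the animation, and the galaxy object gets rebuilt before each frame.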

I spent hardly any time on materials and compositing; it looks like this:

Hi Photox,
That looks exactly like what I would like to achieve. You have given a very detailed answer, but unfortunately I know zero about Blender. I will install it tomorrow and see if I can make sense of your method.

As for the CSV, I don't actually have one (I could create one easily enough, I think). I suggested a CSV as I wasn't sure how to get my Python code into Blender, if that is even possible. If I can plug my code straight into Blender, that's even better. My Python code uses a multi-core module (Parallel Python) to speed up the simulation, and I am also trying to implement an N*log(N) method, so I would like to retain that functionality if possible.

Very interested to hear more about your method, especially regarding the materials if you are willing to share.

Much appreciated!

The materials and rendering stuff is really pretty simple, and you're welcome to what I have. CSV seems like a reasonable choice, although you could just pickle a dictionary too, using the frame number as the key. The big question in my mind is how many stars (and how many frames) we are talking about, and whether to load the whole shebang into memory or to load just the locations needed per frame, in which case you would need one CSV / dictionary per frame.

In general 24 frames per second is the norm, so for 20 seconds you need 480 frames, and thus 480 sets of locations. If you only have, say, 1,000 stars you could load up everything; if you want 10,000, Blender might start to choke and you would need to split them up. If you can create a big pickled dictionary, I can probably modify my code to use that data instead of my somewhat fudged and naive algorithm. One issue I ran into is that because my stars are moved on a per-frame basis (not functionally, but sort of sampled), they could potentially get too close to one another and create huge forces, so I had to enforce a distance limit between them based on their radii.
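For what it's worth, that distance limit is just a clamp inside the pairwise loop, roughly like this (a simplified sketch rather than my actual code; G, the masses and the star layout are placeholders):

```python
import math

G = 1.0  # gravitational constant in simulation units (placeholder)

def pairwise_forces(stars):
    """stars is a list of dicts with 'pos' (x, y, z), 'mass' and 'radius'.
    Returns one (fx, fy, fz) force vector per star, clamping the separation
    so near-collisions between frames don't blow up into huge forces."""
    forces = [[0.0, 0.0, 0.0] for _ in stars]
    for i, a in enumerate(stars):
        for j, b in enumerate(stars):
            if i == j:
                continue
            dx = b['pos'][0] - a['pos'][0]
            dy = b['pos'][1] - a['pos'][1]
            dz = b['pos'][2] - a['pos'][2]
            dist = math.sqrt(dx * dx + dy * dy + dz * dz)
            # never treat two stars as closer than the sum of their radii
            dist = max(dist, a['radius'] + b['radius'])
            f = G * a['mass'] * b['mass'] / (dist * dist)
            forces[i][0] += f * dx / dist
            forces[i][1] += f * dy / dist
            forces[i][2] += f * dz / dist
    return forces
```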

So, long story short: if you are able to pickle a small dictionary, say 50 stars for 100 frames, I can see if I can get a template working.
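Something along these lines on your simulation side would be plenty for a test. The random positions are just a stand-in for your real VPython output, and the file name is arbitrary:

```python
import pickle
import random

num_frames = 100
num_stars = 50

# data[frame] = [(x, y, z, radius), ...] for every star at that frame
data = {}
for frame in range(1, num_frames + 1):
    data[frame] = [(random.uniform(-10.0, 10.0),
                    random.uniform(-10.0, 10.0),
                    random.uniform(-10.0, 10.0),
                    0.1)
                   for _ in range(num_stars)]

with open("star_positions.p", "wb") as f:
    pickle.dump(data, f)
```

On the Blender side it is then just a matter of loading the file back with pickle.load() and indexing the dictionary by the current frame number.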

Would love to see your materials and rendering files. Looks like your knowledge far exceeds mine - I had to google “pickle dictionaries” too. I have only learnt the bare essentials of Python to enable me to do the physics calculations.

Good point about the number of stars, which I forgot to mention in my first post. Initially I was planning to start off with about 2,000 stars using the N*N method, and if that goes well I was thinking of bumping it up to about 50,000 or 100,000 using the N*log(N) Barnes-Hut algorithm. That may make the CSV impractical. If you think the pickle method is more efficient for larger numbers of stars, I can study up on that.

In my current simulations I was using 30 frames per second running for about 5 minutes, so about 9,000 frames for 4,000 stars.

I have just had a look at what pickling is, and it looks like something I may be able to do. My understanding is that it would create a .p file containing a dictionary where, in this case, the unique key would be the timestep (1, 2, 3, 4, …) and the values would be the positions of all the stars at that timestep. Is that right? If so, I can try to whip something up for 100 stars as a test and provide that to you. Would this method still be practical if we used 50,000 stars?

Thanks for the help

Hi,
I want to work on an n-body simulation, and the video @Photox made is exactly what I want to achieve, but unfortunately I don't know Python :frowning:. How can I achieve this in Blender using modifiers or maybe particle systems?