Hello Folks
I’ve registered here to share some of my work with Blender with people who have a similar interest in science or other topics, and out of curiosity. I’m working on a 360° full-dome, time-evolving fly-through of one of our simulations at the MPIA Heidelberg, showing planet formation processes within a turbulent gas/dust disk around a newborn star. Easy, right?
Of all the software I tried, Blender turned out to be the only free option capable of doing 360° (tiled) full-dome rendering in adequate time, and since it comes with Python it is easy to import ASCII data, even 2.5 million particles (XYZ data loaded as vertices into one single object). Time-evolving our data was a bit tricky, though, and I would say it is still not optimal.
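For reference, here is roughly the kind of import script I use — a minimal sketch only, assuming the Blender 2.7x Python API and a plain three-column ASCII file (x y z per line); file paths and names are placeholders:

```python
# Minimal sketch: read an ASCII particle file and put all points into one
# mesh object as vertices (Blender 2.7x API assumed).
import bpy

def load_particles(filepath, name="dust_particles"):
    verts = []
    with open(filepath) as f:
        for line in f:
            parts = line.split()
            if len(parts) < 3:
                continue  # skip blank/comment lines
            verts.append((float(parts[0]), float(parts[1]), float(parts[2])))

    mesh = bpy.data.meshes.new(name)
    mesh.from_pydata(verts, [], [])  # vertices only, no edges or faces
    mesh.update()

    obj = bpy.data.objects.new(name, mesh)
    bpy.context.scene.objects.link(obj)  # 2.7x; 2.8+ uses collection.objects.link
    return obj

# load_particles("/path/to/timestep_000.dat")
```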
I would now like to get some feedback and ideas from you to improve the result both visually and technically. Here is a screenshot:
You can find the videos, blend file and raw data here:
https://drive.google.com/folderview?id=0B9PTIPb0zvzOUDUxWmdrN1Q4Q2s&usp=sharing
Please note: this is not final! We are currently using 11 dummy time steps of our simulation as keyframes; in the end we will use a lot more time steps (and intermediate time steps).
I would appreciate hints on the following problems:
- We are using a shearing-box domain with periodic boundary conditions, so particles leaving the simulation domain re-enter it on the opposite side. Because of that, I told Blender to “dump” leaving vertices at a position outside the render distance and bring them back in the next keyframe (a sketch of this workaround is below). You can therefore see particles popping in or out at each keyframe; see the linked files for an example of this issue. Does anyone have a good idea how to compensate for this? I hope the issue will resolve itself once we use more time steps as source data, since fewer particles will then pop in/out per keyframe and the viewer will not notice. But a proper solution would be better, and one could then use less input data and still get the same result. Best would be a way to somehow fade vertices in and out.
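For clarity, this is roughly how I detect the wrap-around and “park” those vertices at the moment — a NumPy sketch under the assumption of a cubic box; the box length and parking position are placeholders, and in the real script the per-step positions end up as keyframed vertex data:

```python
# Sketch of the current workaround. A particle is considered to have wrapped
# if its displacement between two snapshots exceeds half the box length
# along any periodic axis (minimum-image criterion).
import numpy as np

L = 1.0                            # periodic box length (placeholder)
PARK = np.array([1e6, 1e6, 1e6])   # position far outside the render distance

def park_wrapped(pos_prev, pos_next):
    """Return a copy of pos_next where wrapped particles are parked out of view."""
    disp = pos_next - pos_prev
    wrapped = np.any(np.abs(disp) > 0.5 * L, axis=1)
    out = pos_next.copy()
    out[wrapped] = PARK            # these vertices pop out for this keyframe
    return out, wrapped
```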
- I would prefer to use Cycles instead of Blender Render, but Cycles does not support halo materials. Any idea how to get a similar or better result with Cycles?
- Moreover, I would like to show the gas flow alongside the already implemented dust particle data. The gas data are stored on a 3-D grid. I would like to generate smooth volumes from these data sets (a rough export sketch follows below) and then let Blender evolve them. Two questions: first, is it already possible to render a volume from within the volume? And second, any idea how to build the volume mesh and let Blender morph from one volume state to the next? Do you need an example? A more complex solution I could think of is to use a separate volume mesh for each keyframe and fade it out into the next keyframe while the next volume fades in. You know what I mean? But in my head that would not look good as long as we don’t use every frame as a keyframe and produce gigabytes of data.
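This is the direction I was thinking of for getting the grid into Blender: exporting each gas-density snapshot to Blender’s simple .bvox voxel format and using it with a Voxel Data texture on a bounding cube (Blender Internal). The sketch below is only my understanding of that format (four int32 header values nx, ny, nz, nframes, then float32 densities normalised to 0..1, x varying fastest) — please correct me if that is wrong:

```python
# Sketch: write one gas-density snapshot to a .bvox file for a Voxel Data
# texture. Format assumption: header of four int32 (nx, ny, nz, nframes),
# then float32 values in 0..1 with x varying fastest.
import numpy as np

def write_bvox(filepath, density):
    """density: 3-D numpy array of shape (nx, ny, nz)."""
    nx, ny, nz = density.shape
    d = density.astype(np.float32)
    d = (d - d.min()) / (d.max() - d.min() + 1e-30)   # normalise to 0..1
    with open(filepath, "wb") as f:
        np.array([nx, ny, nz, 1], dtype=np.int32).tofile(f)  # one frame per file
        d.transpose(2, 1, 0).ravel().tofile(f)               # x fastest-varying

# write_bvox("/path/to/gas_0000.bvox", rho)
```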
Thank you for any response, and let me know if you can use my files or scripts for anything cool.
AS
PS: I would like to start writing a “Science to Blender” interface, but only with support and more devs, since I can’t do it all myself. Anyone else interested in working on such a project?