# Blender - 3D Point Data useful??

Hi Zeffii!
Great work!
I just wanted to let you know that I could write the data in another format!
I actually know the particle trajectories!
So I could give you a file like :
Track 1:length 100 (in time)
t: 100 values
x: 100 values
y: 100 values
z:
vx:
vy:
vz:
ax:
ay:
az:
Track2: length 55 (in time)
t: 55 values
x: 55 values
y: 55 values

Do you want such a thing?

@mgibert, it would be very cool to have that extra data to accompany the process; it will come in handy if I get utterly fed up :)
First I want to go through the mental process of figuring it out, a thought experiment if you like. I think this will help me put together a process to automatically remove redundant edge-loops from a complex mesh.

So the LPT tracks individual particles? I read through those files on your site; are you calculating trajectories yourself?

We actually track hundreds of particles simultaneously, thanks to a few codes developed in the group!
My latest achievement is being able to track the full motion (translation and rotation) of a bigger particle, together with some small tracers around it that give us access to the surrounding fluid velocity field. This is so far the best experimental way to study the complex coupling between this big inertial particle and the turbulent flow carrying it.
These latest measurements are actually my real motivation to come back to Blender: I want to make a beautiful picture that can go onto the cover of a nice scientific magazine!
So far here is what I have about these measurements:
With matlab…
1-Initial tracks
2-Finding the ball
3-Reconstructing the big particle trajectory (translation and rotation)

My first attempt with blender:

It could be better! I am open to ideas!
Enjoy the data!

```python
import bpy
import time
from mathutils import Vector
# purposely coded verbosely in places. relax :)

# set this to the folder that contains the data file
data_directory = '/path/to/data/'

def printTimeStamp():
    # simple divider + timestamp
    print("\n" + "="*19 + time.ctime() + "="*19)

'''
def ShowBoundingBox(mode_switch):
    bpy.data.objects["LPT_REP"].show_bounds = mode_switch
    return
'''

def CreateMesh(num_param, data_set):
    # debug prints, for flow control
    debug_string = "num_param = " + str(num_param)
    debug_string += " & data_set length = " + str(len(data_set))
    print("Reaching CreateMesh with data: " + debug_string)

    # make new mesh, add vertices using coordinates
    Verts = []
    for coordinates in data_set:
        xfloat = float(coordinates[0])
        yfloat = float(coordinates[1])
        zfloat = float(coordinates[2])
        unique_vertex = Vector((xfloat, yfloat, zfloat))
        Verts.append(unique_vertex)

    test_mesh = bpy.data.meshes.new("LPT_DATA")
    test_mesh.from_pydata(Verts, [], [])
    test_mesh.update()

    new_object = bpy.data.objects.new("LPT_REP", test_mesh)
    new_object.data = test_mesh

    scene = bpy.context.scene
    scene.objects.link(new_object)  # link, or the object never shows up
    new_object.select = True

# looks like i should refactor CreateMesh and CreateWireBounds somehow
def CreateWireBounds(box_coordinates):
    print("making wire bounding_box object")
    print(box_coordinates)

    # hardcode the 12 edges of a box
    Edges = [[0,1],[1,2],[2,3],[3,0],
             [4,5],[5,6],[6,7],[7,4],
             [0,4],[1,5],[2,6],[3,7]]

    b_mesh = bpy.data.meshes.new("WIRE_BOX_DATA")
    b_mesh.from_pydata(box_coordinates, Edges, [])
    b_mesh.update()

    box_object = bpy.data.objects.new("WIRE_BOX", b_mesh)
    box_object.data = b_mesh

    scene = bpy.context.scene
    scene.objects.link(box_object)  # link, or the object never shows up
    box_object.select = True

def ConstructBoundingBox():
    # ShowBoundingBox(True)  # optional for checking.

    bpy.context.scene.update()  # necessary sometimes
    box_Vector = bpy.data.objects["LPT_REP"].dimensions
    print(box_Vector)

    # determine bounding box
    box_coordinates = []
    bounding_box = bpy.data.objects["LPT_REP"].bound_box
    for vtex in bounding_box:
        v_coordinate = Vector((vtex[0], vtex[1], vtex[2]))
        box_coordinates.append(v_coordinate)

    # make bounding box 3dGrid
    CreateWireBounds(box_coordinates)
    return

def InitFunction():
    datafile = 'data_test.txt'

    num_lines = 0
    line_checking_list = []  # list to check for consistency
    line_data_list = []      # list to append the various data values onto

    dataset = open(data_directory + datafile)
    for line in dataset:
        items = line.split(",")
        line_checking_list.append(len(items))
        line_data_list.append(items)
        num_lines += 1
    dataset.close()  # to be polite.

    # detect anomalies first, before getting hopes up.
    set_check = set(line_checking_list)
    if len(set_check) == 1:  # means no variation
        printTimeStamp()
        num_parameters = list(set_check)[0]  # gives parameters per line
        CreateMesh(num_parameters, line_data_list)
    else:
        print("There exists variance in the data, won't proceed")
        print("At least one line contains unexpected data")
        return

    ConstructBoundingBox()
    return

InitFunction()
```

OK, I have all the geometry needed now. The next post should be a lot cooler (and less wireframe!)

@mgibert, do you have a version of that file with particle time-stamps for each datapoint?

Hi Zeffii,

Here is the format:
It is just one column that reads that way:

Nb of tracks in the file - ntrk
(ntrk times){
Trk index (from 0 to ntrk)
Nb of frames (how long is the track) - nbfr
First timestamp at which the track starts
(nbfr times){
x
y
z
vx
vy
vz
ax
ay
az
}
}

If you do not like it I can write it in some other way!
Enjoy!

PS: These are not the same data!

Sure!
Freshly created for you:
TracksTest2.txt

The format is therefore:
Nb of tracks in the file - ntrk
(ntrk times){
Trk index (from 0 to ntrk)
Nb of frames (how long is the track) - nbfr
(nbfr times){
timestamp
x
y
z
vx
vy
vz
ax
ay
az
}
}

enjoy!
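In case it helps anyone reading along, this nested one-column layout can be parsed in a few lines of Python. This is only a sketch I put together (the function name and dict keys are mine, not from the thread), assuming the file really is one numeric value per line:

```python
def read_tracks(lines):
    # Parse: ntrk, then per track: index, nbfr, then nbfr frames of
    # (timestamp, x, y, z, vx, vy, vz, ax, ay, az).
    vals = iter(float(v) for v in lines)
    ntrk = int(next(vals))
    tracks = []
    for _ in range(ntrk):
        idx = int(next(vals))
        nbfr = int(next(vals))
        frames = [[next(vals) for _ in range(10)] for _ in range(nbfr)]
        tracks.append({"index": idx, "frames": frames})
    return tracks
```

It returns one dict per track, with the 10 values per frame kept in file order.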

I just realized that I answered your post twice, in two different ways!! And moreover neither of them is really what you want!
I am going crazy!
What you really want is this: test_data_trkflag_timestamp_1.txt

It is like the old format (that you played with) except that I added a track_flag and a time_stamp. Per line, you’ll find:
trk_flag,timestamp,x,y,z,vx,vy,vz,ax,ay,az
Enjoy!
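Since every line is now self-describing, splitting the rows into per-track lists is a one-pass job. A minimal sketch (`group_by_track` is a made-up name, not code from the thread):

```python
def group_by_track(lines):
    # Each line: trk_flag,timestamp,x,y,z,vx,vy,vz,ax,ay,az
    tracks = {}
    for line in lines:
        parts = [float(p) for p in line.strip().split(",")]
        trk = int(parts[0])
        tracks.setdefault(trk, []).append(parts[1:])
    return tracks
```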

Yes! That’s going to open up more possibilities. Thank you, this is great fun.

By the looks of this it should be possible to reconcile some of the ‘broken’ tracks, by comparing the trajectory towards the end of shorter tracks to that of the initial few positions of other shorter tracks in the vicinity. Maybe matching, combining and ‘projecting/Lerping’ gaps.

I think I'll perhaps algorithmically exclude tracks that appear to be mere fragments with no relation to other tracks. Anyway, we’ll see.
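The matching idea could be sketched roughly like this: project the last point of each track forward along its velocity over the time gap, and flag any other track that starts near the predicted position. The helper below is hypothetical and the thresholds are guesses, not tuned values:

```python
def stitch_candidates(tracks, max_gap=5.0, max_dist=0.1):
    # tracks: list of lists of (t, x, y, z, vx, vy, vz) tuples.
    # Returns (i, j) pairs where track j plausibly continues track i.
    pairs = []
    for i, a in enumerate(tracks):
        t_end = a[-1]
        for j, b in enumerate(tracks):
            if i == j:
                continue
            t_start = b[0]
            dt = t_start[0] - t_end[0]
            if not (0 < dt <= max_gap):
                continue
            # predicted position = last position + velocity * time gap
            pred = [t_end[k] + t_end[k + 3] * dt for k in (1, 2, 3)]
            dist = sum((pred[k] - t_start[1 + k]) ** 2 for k in range(3)) ** 0.5
            if dist <= max_dist:
                pairs.append((i, j))
    return pairs
```

A real stitcher would also compare velocities and lerp the gap, but this is the core test.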

HA! (still no cool render, sorry) but this file is crammed with interesting information. Some beautiful motion! Time to do some trilinear interpolation.
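For reference, trilinear interpolation on a regular unit-spaced grid is just three rounds of linear blending; a generic sketch (not code from this thread):

```python
def trilerp(grid, x, y, z):
    # grid[i][j][k] holds values at integer lattice points;
    # interpolate at fractional (x, y, z) inside one cell.
    i, j, k = int(x), int(y), int(z)
    fx, fy, fz = x - i, y - j, z - k

    def g(di, dj, dk):
        return grid[i + di][j + dj][k + dk]

    # blend along x, then y, then z
    c00 = g(0, 0, 0) * (1 - fx) + g(1, 0, 0) * fx
    c10 = g(0, 1, 0) * (1 - fx) + g(1, 1, 0) * fx
    c01 = g(0, 0, 1) * (1 - fx) + g(1, 0, 1) * fx
    c11 = g(0, 1, 1) * (1 - fx) + g(1, 1, 1) * fx
    c0 = c00 * (1 - fy) + c10 * fy
    c1 = c01 * (1 - fy) + c11 * fy
    return c0 * (1 - fz) + c1 * fz
```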

did you say you also have particle size?

By the looks of this it should be possible to reconcile some of the ‘broken’ tracks, by comparing the trajectory towards the end of shorter tracks to that of the initial few positions of other shorter tracks in the vicinity. Maybe matching, combining and ‘projecting/Lerping’ gaps.

That is true, and we have a nice algorithm to do that. I did not run it on that particular set of data.

did you say you also have particle size?

In the middle of these tracks, which represent the trajectories of particles of 100 µm diameter, there is one big guy, as we call it (about 10 mm in diameter). I’ll send you his trajectory in a few days.

It seems that you enjoy those data!!!
Have fun.

Here’s a version that allows us to look at a subset of the data between some arbitrary lines.

```python
import bpy
import time
from mathutils import Vector
# purposely coded verbosely in places. relax :)

# setup reading location, and global variables
data_directory = '/path/to/data/'  # set this to the folder containing the data file
datafile = 'test_data_trkflag_timestamp_1.txt'
lpt_name = "LPT_REP"  # name of object for scene

# optional, use these to look at a subset of the data
skip_value = 2       # default 1, is full set
use_tokens = True    # True = set a cutoff point, False = entire dataset
start_token = 70000  # start importing from this line, 0 for start
break_token = 92000  # don't import beyond this line or until dataset ends.

def printTimeStamp():
    # divider + timestamp
    print("\n" + "="*19 + time.ctime() + "="*19)

'''
def ShowBoundingBox(mode_switch):
    bpy.data.objects[lpt_name].show_bounds = mode_switch
    return
'''

'''
def printVectorList(vec_list):
    for item in vec_list:
        print(item)
'''

def CreateMesh(num_param, data_set):
    # debug prints, for flow control
    debug_string = "num_param = " + str(num_param)
    debug_string += " & data_set length = " + str(len(data_set))
    print("Reaching CreateMesh with data: " + debug_string)

    # make new mesh, add vertices using coordinates
    Verts = []
    for data_segment in data_set:
        xfloat = float(data_segment[2])
        yfloat = float(data_segment[3])
        zfloat = float(data_segment[4])
        unique_vertex = Vector((xfloat, yfloat, zfloat))
        Verts.append(unique_vertex)

    test_mesh = bpy.data.meshes.new("LPT_DATA")
    test_mesh.from_pydata(Verts, [], [])
    test_mesh.update()

    new_object = bpy.data.objects.new(lpt_name, test_mesh)
    new_object.data = test_mesh

    scene = bpy.context.scene
    scene.objects.link(new_object)  # link, or the object never shows up
    new_object.select = True

# looks like i could refactor CreateMesh and CreateWireBounds
def CreateWireBounds(box_coordinates):
    print("making wire bounding_box object")
    # printVectorList(box_coordinates)

    # hardcode the 12 edges of a box
    Edges = [[0,1],[1,2],[2,3],[3,0],
             [4,5],[5,6],[6,7],[7,4],
             [0,4],[1,5],[2,6],[3,7]]

    b_mesh = bpy.data.meshes.new("WIRE_BOX_DATA")
    b_mesh.from_pydata(box_coordinates, Edges, [])
    b_mesh.update()

    box_object = bpy.data.objects.new("WIRE_BOX", b_mesh)
    box_object.data = b_mesh

    scene = bpy.context.scene
    scene.objects.link(box_object)  # link, or the object never shows up
    box_object.select = True

def ConstructBoundingBox():
    # ShowBoundingBox(True)  # uses builtin method, optional.

    bpy.context.scene.update()  # necessary sometimes
    box_Vector = bpy.data.objects[lpt_name].dimensions
    print(box_Vector)

    # determine bounding box
    '''
    box_coordinates = []
    bounding_box = bpy.data.objects[lpt_name].bound_box
    for vtex in bounding_box:
        v_coordinate = Vector((vtex[0], vtex[1], vtex[2]))
        box_coordinates.append(v_coordinate)
    '''
    box_coordinates = bpy.data.objects[lpt_name].bound_box
    # make bounding box 3dGrid
    CreateWireBounds(box_coordinates)
    return

def MakeHistogram(pnum, ldata):
    print("Entering MakeHistogram function:")
    print("--Total data points stored: ", len(ldata))
    # print number of unique tracks
    # find longest track
    # find shortest track
    # print histogram of tracks & lengths
    return

def InitFunction():

    def isBeyondToken(line_num):
        if use_tokens == False: return True
        if line_num <= break_token: return True
        else: return False

    num_lines = 0
    line_checking_list = []  # list to check for consistency
    line_data_list = []      # list to append the various data values onto

    dataset = open(data_directory + datafile)

    for line in dataset:
        if not isBeyondToken(num_lines): break  # set maximum read point
        if num_lines % skip_value == 0:
            if (num_lines >= start_token) or use_tokens == False:
                items = line.split(",")
                line_checking_list.append(len(items))
                line_data_list.append(items)
        num_lines += 1
    dataset.close()  # to be polite.

    # detect anomalies first, before getting hopes up.
    set_check = set(line_checking_list)
    if len(set_check) == 1:  # means no variation
        printTimeStamp()
        print("Number of lines in original dataset: ", num_lines)
        print("Skipping every", skip_value, "lines")
        num_parameters = list(set_check)[0]  # gives parameters per line
        MakeHistogram(num_parameters, line_data_list)
        CreateMesh(num_parameters, line_data_list)
        ConstructBoundingBox()  # manually for grid drawing
    else:
        print("There exists variance in the data, won't proceed")
        print("At least one line contains unexpected data")
        return

    return

InitFunction()
```

@mgibert, could you shed some light on the time intervals here, are they 1 ms?

could you shed some light on the time intervals here, are they 1 ms?

In this particular movie, the acquisition frequency was 2900 Hz, which gives you a time interval of about 0.35 ms.
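The arithmetic, for anyone following along (the 25 fps mapping below is my own assumption, not something from the thread):

```python
acquisition_hz = 2900
dt_ms = 1000.0 / acquisition_hz  # time between samples, in milliseconds
# if one sample is mapped to one frame of a 25 fps animation,
# playback is slowed down by this factor:
slowdown = acquisition_hz / 25.0
print(round(dt_ms, 3))   # 0.345
print(round(slowdown))   # 116
```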

Here is some more work on it. It inserts icospheres and, per track, gives them loc(x, y, z) keyframes.
http://digitalaphasia.com/code/Python/Physics/csv_to_animation.py

What it doesn’t do yet:

• combine interrupted tracks (not impossible, but it will take me a little time to fine-tune)
• hide non-moving spheres (easy, I think…)

I tried this with particles on a path, but couldn’t find any working examples of controlling position vs. time that way.

• this means that if you import all the csv data, it will take at least 50 minutes on a home computer simply to make the 1.5 million keyframes.

Hi Zeffii!
I fixed the non-moving particles issue:

```python
def MakeHistogram(pnum, ldata):
    print("Entering MakeHistogram function:")
    print("--Total data points stored: ", len(ldata))

    multi_list, temp_list = [], []

    # get first track num, this is a one off necessity
    tracknum = ldata[0][0]

    # start digging through the data
    for i in range(len(ldata)):
        tracknum_line = ldata[i][0]
        if not tracknum_line == tracknum:
            # f = [0] * 11
            f = ldata[i-1][:]  # copy, so the real last data point stays intact
            f[2] = 100000
            f[3] = 100000
            f[4] = 100000
            temp_list.append(f)
            temp_list[0][2] = 100000
            temp_list[0][3] = 100000
            temp_list[0][4] = 100000
            multi_list.append(temp_list)  # no more items for this track
            temp_list = []  # re-init storage list

        temp_list.append(ldata[i])
        tracknum = tracknum_line

        # if last line, then add temp_list to multi_list
        if i == len(ldata)-1:
            multi_list.append(temp_list)

    return multi_list
```

It is not very nice because I am not a python pro! But it works! Basically, they are sent out of the camera view.

Now it would be great to give them a nice color/material… I did that in the first code I posted here, but I am not so sure that this is the best way to do it… Do you have an idea?

Great job!

Yep that works, but I was hoping not to have to move the particle out of the visible space.

My approach to the animation using keyframes feels like overkill, especially if it takes so long to set the keyframes. There has to be a neater way; it might be easier/faster to keyframe the position on the path (1 value) rather than the 3-value location (x, y, z). Maybe keyframing a material colour + opacity is not a crazy idea :) — I have not yet tried. I’ll probably tackle joining tracks first, as it’s a slightly more interesting problem (for me).

As far as I can tell, these are the options:

• keyframe the spheres out of camera view
• move objects to a non visible layer when not in motion
• animate opacity
```python
import bpy
import time
from mathutils import Vector
# purposely coded verbosely in places. relax :)

'''
about the data being processed here:
we have 11 elements per line. Per particle snapshot they inform us about:
Element[0]      = track num
Element[1]      = timestamp = num * 0.35ms (from the start)
Element[2,3,4]  = Position Px, Py, Pz
Element[5,6,7]  = Velocity Vx, Vy, Vz
Element[8,9,10] = Acceleration Ax, Ay, Az

# while testing

break_token
322 is the first track
330 is first 2 tracks
649 is track 1, 2 and 3
671 = 1,2,3,4
'''

# setup reading location, and global variables
data_directory = '/path/to/data/'  # set this to the folder containing the data file
datafile = 'test_data_trkflag_timestamp_1.txt'
# lpt_name = "LPT_REP"  # name of object for scene

# optional, use these to look at a subset of the data
skip_value = 1       # default 1, is full set
use_tokens = True    # True = set a cutoff point, False = entire dataset
start_token = 0      # start importing from this line, 0 for start
break_token = 27671  # don't import beyond this line or until dataset ends.
last_frame = 0.0

# spline setup
w = 1         # weight
minimtl = 22  # default 1, minimum track segments present before importing

def printTimeStamp():
    # divider + timestamp
    print("\n" + "="*19 + time.ctime() + "="*19)

def printListed(vec_list):
    iter = 0
    for item in vec_list:
        print("track: ", iter, "="*12, "has", len(item), "data points")
        for instance in item:
            print(instance[0:5])
        iter += 1

'''
def CreateMesh(num_param, data_set):
    # debug prints, for flow control
    debug_string = "num_param = " + str(num_param)
    debug_string += " & data_set length = " + str(len(data_set))
    print("Reaching CreateMesh with data: " + debug_string)

    # make new mesh, add vertices using coordinates
    Verts = []
    for data_segment in data_set:
        xfloat = float(data_segment[2])
        yfloat = float(data_segment[3])
        zfloat = float(data_segment[4])
        unique_vertex = Vector((xfloat, yfloat, zfloat))
        Verts.append(unique_vertex)

    test_mesh = bpy.data.meshes.new("LPT_DATA")
    test_mesh.from_pydata(Verts, [], [])
    test_mesh.update()

    new_object = bpy.data.objects.new(lpt_name, test_mesh)
    new_object.data = test_mesh

    scene = bpy.context.scene
    new_object.select = True
'''

def GenerateObjects(multi_list):
    # printListed(multi_list)
    maxlen = len(str(len(multi_list)))

    def MakePolyline(pname, objname, rawList):
        cList = []
        for elem in rawList:
            xfloat = float(elem[2])
            yfloat = float(elem[3])
            zfloat = float(elem[4])
            unique_vertex = Vector((xfloat, yfloat, zfloat))
            cList.append(unique_vertex)

        # create path data and add it to a scene object, then link to scene.
        curvedata = bpy.data.curves.new(name=pname, type='CURVE')
        objectdata = bpy.data.objects.new(objname, curvedata)
        objectdata.location = (0, 0, 0)  # object origin
        bpy.context.scene.objects.link(objectdata)

        # trajectory settings
        curvedata.dimensions = '3D'
        curvedata.use_path = True
        curvedata.path_duration = 100

        # takes first and last timestamp for track.
        first_vis_frame = float(rawList[0][1])
        last_vis_frame = float(rawList[-1][1])

        # animate path
        curvedata.eval_time = 0
        curvedata.keyframe_insert(data_path="eval_time", frame=first_vis_frame)
        curvedata.eval_time = 100
        curvedata.keyframe_insert(data_path="eval_time", frame=last_vis_frame)

        # set to linear, hax!
        action_name = pname + "Action"
        this_keyframe_collection = bpy.data.actions[action_name].fcurves[0]
        for keyframe in this_keyframe_collection.keyframe_points:
            keyframe.interpolation = 'LINEAR'

        # because we have so much exact data 'POLY' suffices.
        polyline = curvedata.splines.new('POLY')
        polyline.use_cyclic_u = False
        polyline.use_endpoint_u = True

        polyline.points.add(len(cList)-1)  # a new spline starts with one point
        for num in range(len(cList)):
            x, y, z = cList[num]
            polyline.points[num].co = (x, y, z, w)

        # insert icosphere, attach to curve object
        bpy.ops.mesh.primitive_ico_sphere_add()  # becomes the active object
        particle = bpy.context.object

        pconstraint = particle.constraints.new('FOLLOW_PATH')
        pconstraint.target = objectdata
        pconstraint.use_fixed_location = False
        pconstraint.forward_axis = 'FORWARD_Z'  # ?
        pconstraint.up_axis = 'UP_Y'  # ?

        # deal with very last frame some other way.
        time_and_state_settings = ((0, True),
                                   (first_vis_frame-1, True),
                                   (first_vis_frame, False),
                                   (last_vis_frame, False),
                                   (last_vis_frame+1, True),
                                   (last_frame, True))

        for time_val in time_and_state_settings:
            current_frame = time_val[0]
            bpy.ops.anim.change_frame(frame=current_frame)
            bpy.context.active_object.hide = time_val[1]
            bpy.context.active_object.keyframe_insert(data_path="hide",
                                                      index=-1,
                                                      frame=current_frame)

        # gasp !
        return

    iter = 0
    for track in multi_list:
        curvenum = str(iter).zfill(maxlen)
        pname = "mycurve_" + curvenum
        objname = "CurveName_" + curvenum

        # to adjust the minimum track length to import, modify minimtl
        # in the global variables above.
        if len(track) > minimtl:
            MakePolyline(pname, objname, track)
        iter += 1

    print("making:", iter, "tracks")
    return

def MakeHistogram(pnum, ldata):
    print("Entering MakeHistogram function:")
    print("--Total data points stored: ", len(ldata))
    multi_list, temp_list = [], []

    # get first track num, this is a one off necessity
    tracknum = ldata[0][0]

    # start digging through the data
    for i in range(len(ldata)):
        tracknum_line = ldata[i][0]
        if not tracknum_line == tracknum:
            multi_list.append(temp_list)  # no more items for this track
            temp_list = []  # re-init storage list

        temp_list.append(ldata[i])
        tracknum = tracknum_line

        # if last line, then add temp_list to multi_list
        if i == len(ldata)-1:
            multi_list.append(temp_list)

    return multi_list

def InitFunction():

    global last_frame

    def isBeyondToken(line_num):
        if use_tokens == False: return True
        if line_num <= break_token: return True
        else:
            return False

    num_lines = 0
    line_checking_list = []  # list to check for consistency
    line_data_list = []      # list to append the various data values onto

    dataset = open(data_directory + datafile)

    # should be rewritten, it's not very pretty
    for line in dataset:
        if not isBeyondToken(num_lines):
            break  # stop importing if beyond
        if num_lines % skip_value == 0:
            if (num_lines >= start_token) or use_tokens == False:
                items = line.split(",")
                line_checking_list.append(len(items))
                line_data_list.append(items)
                print(items[1])
                if float(items[1]) > last_frame:
                    last_frame = float(items[1])
                    print("current last frame:", last_frame)

        num_lines += 1
    dataset.close()  # to be polite.

    # use break token to set the end of the animation
    print(last_frame)
    bpy.context.scene.frame_end = int(last_frame)

    # detect anomalies first, before getting hopes up.
    set_check = set(line_checking_list)
    if len(set_check) == 1:  # means no variation
        printTimeStamp()
        print("Number of lines in original dataset: ", num_lines)
        if skip_value == 1:
            print("not skipping any lines")
        else:
            print("Skipping every", skip_value, "lines")
        num_parameters = list(set_check)[0]  # gives parameters per line
        multi_list = MakeHistogram(num_parameters, line_data_list)

        GenerateObjects(multi_list)

    else:
        print("There exists variance in the data, won't proceed")
        print("At least one line contains unexpected data")
        return

    return

InitFunction()
```

So, that creates curves from the datapoints and hides the icosphere from view when it isn’t moving. Interesting, but headwrecking. (This version performs curve hiding, i.e. it hides the paths.)