Blender - 3D Point Data useful??

Anyone tried to use laser scan point data in Blender yet?

I have been playing with point data of buildings and machines at work in Blender and I am having some success.
So far I have made use of a basic script that takes a TXT file with XYZ data and loads all the point data into Blender - perfect!
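(For anyone curious, a minimal sketch of that kind of importer, assuming the Blender 2.5 bpy API and a plain space-separated “x y z” file; the path is a placeholder.)

import bpy

# Minimal sketch: load "x y z" lines from a text file into a mesh as loose
# vertices. Assumes the Blender 2.5 bpy API; path and format are placeholders.
def import_xyz(filepath):
    verts = []
    with open(filepath) as f:
        for line in f:
            parts = line.split()
            if len(parts) >= 3:
                verts.append((float(parts[0]), float(parts[1]), float(parts[2])))

    mesh = bpy.data.meshes.new("PointCloud")
    mesh.from_pydata(verts, [], [])   # vertices only, no edges or faces
    mesh.update()

    obj = bpy.data.objects.new("PointCloud", mesh)
    bpy.context.scene.objects.link(obj)
    return obj

import_xyz('/path/to/points.xyz')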

What I want to know is: can a script be made that also uses extra point data (RGB values) for each point, and is there a way that Blender can render that information - preferably without using the Halo effect?
I don’t want the points to scale in size as they move towards the camera (the Halo problem).
Also, I only want to use the points, not a mesh.

Some great effects could be achieved with this technique.

Ideas anyone?

Thought I would post up some sample files for people to make use of.
The data describes a curved surface section of a simple sphere.

I have 2 files - one is a basic .XYZ points text file; the other uses extra columns to give the RGB colour levels of each point.

click here for sample image

click here for sample point files: http://picasaweb.google.com/cuznerdexter/PointCloudUpdatesRenderingPoints#5243637564939588210

The sample point files have not been made visible yet.

Meshlab (http://meshlab.sourceforge.net/) contains a couple of algorithms that can construct a mesh from a given point cloud.

If you want to implement a simple algorithm yourself, look into Delaunay triangulation (http://en.wikipedia.org/wiki/Delaunay_triangulation).

After you have a mesh, it should be easy to render.
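For what it’s worth, a rough sketch of the Delaunay route, assuming SciPy is available (Blender’s bundled Python may not include it) and a roughly height-field-like cloud that can be triangulated on its XY projection:

import numpy as np
from scipy.spatial import Delaunay

# Sketch only: triangulate the XY projection and reuse the triangles as faces.
# The file name is a placeholder; columns are assumed to be x, y, z.
points = np.loadtxt('points.xyz')     # N x 3 array
tri = Delaunay(points[:, :2])         # 2D Delaunay on the XY plane
faces = tri.simplices.tolist()        # each entry is 3 vertex indices (.vertices in older SciPy)
# points.tolist() and faces can then be handed to Blender, e.g. via Mesh.from_pydata().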

I have already got Meshlab (and many other tools) and have used them to create meshes for point cloud projects.

This is not what I need this time.
I am not interested in “meshing” the surface. Instead, I am looking at importing the raw cloud into Blender as points only.
In our software at work, each of these points has a unique RGB value.

The reason for all this is so I can use Blender to animate scenes using point data instead of “meshes”.

I know Blender can import the raw points, but they are not visible in renders without using “Halos” - and those are no good to me.

I am trying to learn Python (and C++), hoping that I or somebody else can make something useful that imports the points with their RGB values and overrides the renderer in Blender so the points are rendered as points, not as halos.

I believe that if this were possible in Blender (and I am sure that it is!), some amazing effects could be achieved.

If you dupliverted (http://wiki.blender.org/index.php/Manual/DupliVerts) a dummy object to the point cloud, then you could easily render it. For a quick test you could use a cube for this purpose, so that a tiny cube would appear at the location of each point of the point cloud.

One possible way to handle colors would be to convert the dupliverted data to real geometry and use vertex colors.
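For reference, a quick sketch of that setup as a script (assuming the 2.5 bpy API; the object names “PointCloud” and “Cube” are placeholders):

import bpy

# Sketch: parent a small dummy object to the point cloud and let DupliVerts
# place a copy of it at every vertex.
cloud = bpy.data.objects['PointCloud']
dummy = bpy.data.objects['Cube']

dummy.parent = cloud          # the dummy becomes a child of the cloud
cloud.dupli_type = 'VERTS'    # duplicate the child at each vertex

Making the duplicates real (Ctrl+Shift+A) would then be the route to the vertex colour idea.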

I thought about that idea a while ago, but was not sure if it would work.
Now I have tried it, and it confirmed my concerns.
I created six points in an empty object and then dupliverted a cube primitive to the points. Instead of giving me six solid cubes on the points, I got six clusters of points with NO faces!
So I ended up with a point cloud containing even more points than before, and it was still unusable.

Back to the ideas board, for me.

cuzner,

If you render with an ortho camera, are the Halo points different sizes?

You might want to make a mesh in Meshlab just so you have the topology to make Dupliverts work - you don’t need to render the mesh itself.

What about particles?
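A rough sketch of the particle route (assuming the 2.5 bpy API; object names are placeholders) - one static particle per vertex, rendered as a dummy object instead of halos:

import bpy

cloud = bpy.data.objects['PointCloud']
dummy = bpy.data.objects['Cube']

cloud.modifiers.new(name="PointParticles", type='PARTICLE_SYSTEM')
settings = cloud.particle_systems[0].settings
settings.count = len(cloud.data.vertices)       # one particle per vertex
settings.emit_from = 'VERT'                     # emit from the vertices
settings.use_emit_random = False                # keep the vertex order
settings.frame_start = 1                        # all particles appear at frame 1...
settings.frame_end = 1
settings.lifetime = 250                         # ...and stay for the whole shot
settings.physics_type = 'NO'                    # don't let them move
settings.render_type = 'OBJECT'                 # render the dummy instead of halos
settings.dupli_object = dummy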

RS

Hello all!
I would like to know if this subject has been solved…?

I would like to do a similar thing:
I have the coordinates (xyz) of different points (or particles, if you want), and one more value called Color for each of them.
I would like to “plot” a sphere at each of these xyz positions, with a color somehow proportional to the value of the Color parameter I just mentioned.

Then I would like to make a movie out of this because I have plenty of these “frames”, but that should be OK once I have the first step.

Thanks a lot!!!
Best regards,
Mat

Hi All,

cuzner, would you mind posting your sample scripts? (The link above doesn’t work.)

Also, like users mgibert and cuzner, I’m trying to plot 3D points (with xyz coordinates), the only difference being that I would like to place a straight line (or tube, cylinder, stick, whatever you want to call it) between pairs of points. Each line gets a unique color (so the xyz coordinates for each pair, as well as each pair’s color value, get read from a txt file). Does anyone have any scripts that could do this? I am a newbie and not Python savvy!
Thanks
D
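For D’s stick idea, a rough sketch (assuming the 2.5 bpy API and one “x1,y1,z1,x2,y2,z2,r,g,b” line per pair with colours in 0-1; the path and format are placeholders):

import bpy
from mathutils import Vector

# Sketch: a coloured cylinder between each pair of points.
def add_stick(p1, p2, rgb, radius=0.02):
    p1, p2 = Vector(p1), Vector(p2)
    direction = p2 - p1
    bpy.ops.mesh.primitive_cylinder_add(radius=radius,
                                        depth=direction.length,
                                        location=(p1 + p2) / 2.0)
    obj = bpy.context.object
    # rotate the cylinder's local Z axis onto the p1 -> p2 direction
    obj.rotation_mode = 'QUATERNION'
    obj.rotation_quaternion = direction.to_track_quat('Z', 'Y')
    mat = bpy.data.materials.new("stick")
    mat.diffuse_color = rgb
    obj.data.materials.append(mat)

with open('/path/to/pairs.txt') as f:
    for line in f:
        values = [float(v) for v in line.split(',')]
        if len(values) == 9:
            add_stick(values[0:3], values[3:6], values[6:9])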

Hi!

Has this been resolved? I’m facing the same issue trying to render the point cloud from a Photosynth export.

  1. The PLY is imported and I get a vertex cloud, but the color is lost.
  2. Using a test model with vertex colors and dupliverts or particles, I cannot project the vertex colors onto the particles/dupliverts.

I also tried a UV map with a mesh model, but the particles or dupliverts do not pick up the UV map colors.

Any ideas? Thanks!

You can’t texture a vertex; you have to have a face (polygon) to texture. Try using this script http://blenderartists.org/forum/showthread.php?t=144504 to skin your point cloud - it creates the faces you can texture. You can also use UV mapping after you have skinned the cloud to make it a mesh. A vertex is just a 3D location, it is not an actual surface, so you can’t texture it.
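Once the cloud has been skinned, a rough sketch of pushing a per-point RGB value into a vertex colour layer (this assumes the 2.63+ mesh API and that your importer already gives you a colour per vertex index):

import bpy

obj = bpy.context.object
mesh = obj.data
point_colors = {0: (1.0, 0.0, 0.0)}   # placeholder: vertex index -> (r, g, b)

# write the per-vertex colours into every face loop that touches that vertex
vcol = mesh.vertex_colors.new(name="PointRGB")
for poly in mesh.polygons:
    for loop_index in poly.loop_indices:
        vert_index = mesh.loops[loop_index].vertex_index
        if vert_index in point_colors:
            vcol.data[loop_index].color = point_colors[vert_index]

A material with “Vertex Color Paint” enabled should then show those colours in the render.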

Hi Guys,

I have been using this code to plot my particles:
http://jmsoler.free.fr/didacticiel/blender/tutor/cpl_modelmshapes.htm

However, it no longer works with Blender 2.5 (the bpy API vs. the old Blender module)…

But it seems crazy that there is no native function for this, something like sphere(Radius, [x, y, z])!

Anyone has a solution for blender 2.5?
Thanks a lot in advance,
Mat

There is a primitive sphere in 2.5.

What’s the problem?
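To spell that out, a small 2.5 sketch that adds a coloured sphere at each point (the data, the sphere size, and the simple blue-to-red mapping are just assumptions):

import bpy

# Sketch: one small UV sphere per point, coloured by a scalar value in [0, 1].
points = [(0.0, 0.0, 0.0, 0.1), (1.0, 0.5, 0.2, 0.9)]   # x, y, z, value (example data)

for i, (x, y, z, value) in enumerate(points):
    bpy.ops.mesh.primitive_uv_sphere_add(size=0.05, location=(x, y, z))
    obj = bpy.context.object
    mat = bpy.data.materials.new("point_%d" % i)
    mat.diffuse_color = (value, 0.0, 1.0 - value)   # blue (low) to red (high)
    obj.data.materials.append(mat)

Adding thousands of spheres through bpy.ops is slow, though; sharing one sphere mesh between many objects, or using dupliverts, scales much better.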

If you have any suggestions to make this plot nicer, I’ll be very happy to hear them!
Thanks in advance!

Hey! I’m into this kind of stuff. If you want modifications, can you describe what you’re interested in?
red = fast
dark blue = slow ?

Yes indeed, blue is slow and red is fast.
As for the improvements, I do not really know… I am looking for ideas…
For example, I was thinking that time could be “rendered” by some alpha effect… The older positions are nearly invisible, while the latest is fully opaque.
But other ideas are welcome!
Thanks!
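One possible sketch of that alpha-with-age idea (assuming the 2.5 bpy API and that each time step is a separate object named “step_000”, “step_001”, … with its own material):

import bpy

# Sketch: each time step appears at its own frame, fully opaque, then fades
# to nearly invisible over the following frames.
fade_frames = 25

step_objects = [o for o in bpy.data.objects if o.name.startswith('step_')]
for step, obj in enumerate(sorted(step_objects, key=lambda o: o.name)):
    mat = obj.active_material
    mat.use_transparency = True
    mat.transparency_method = 'Z_TRANSPARENCY'
    appear = step + 1                                # frame at which this step shows up
    mat.alpha = 0.0
    mat.keyframe_insert('alpha', frame=appear - 1)   # hidden until its step
    mat.alpha = 1.0
    mat.keyframe_insert('alpha', frame=appear)
    mat.alpha = 0.05
    mat.keyframe_insert('alpha', frame=appear + fade_frames)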

Zoom in a bit, set a depth of field and use some focus/blur techniques to give a sense of space.
Consider a slight motion blur effect as a function of speed and direction (Vector?).

At that point you can probably ditch the background, because it isn’t part of the data set anyway. (and probably only there to give a sense of depth?)

I would love to do more data visualization; if you have any sample sets, that would be great :)

Hi Zeffii,
It’s my pleasure to share this with you!
You can get a sample data file HERE.
What it contains is described on my website.

If you like data visualisation, you can also check out the little OpenGL app that we wrote with a student to visualise this data (only available for Mac OS X):
http://www.eulerview.gibert.biz/

Keep me in the loop! I’ll be very interested to see how you represent those data!
Thanks!

It might be a week or thereabouts before I get to it - thanks, mgibert! Interesting occupation there at the Max Planck Institute!

@mgibert

Basic vertex loading; more after dinner. I’m thinking about ways to detect which particles are simply the same particle at a different moment in time. At present it’s just one mesh, for debugging.


import bpy
import time
from mathutils import Vector
# purposely coded verbosely in places. relax :)

def CreateMesh(num_param, data_set):
    # debug prints, for flow control
    debug_string = "num_param = " + str(num_param)
    debug_string += " & data_set length = " + str(len(data_set))
    print("Reaching CreateMesh with data: " + debug_string)
    
    # make new mesh, add vertices using coordinates
    Verts = []
    for coordinates in data_set:
        xfloat = float(coordinates[0])
        yfloat = float(coordinates[1])
        zfloat = float(coordinates[2])
        unique_vertex = Vector((xfloat, yfloat, zfloat))
        Verts.append(unique_vertex)
    
    test_mesh = bpy.data.meshes.new("LPT_DATA")
    test_mesh.from_pydata(Verts, [], [])
    test_mesh.update()
    
    new_object = bpy.data.objects.new("LPT_REP", test_mesh)
    new_object.data = test_mesh
    
    scene = bpy.context.scene
    scene.objects.link(new_object)
    new_object.select = True
        
    # determine bounding box
    # make bounding box 10% bigger
    # make bounding box 3dGrid


def InitFunction():
    # setup reading location
    data_directory = '/home/zeffii/Downloads/Physics/'
    datafile = 'data_test.txt'
    
    num_lines = 0
    line_checking_list = [] # list to check for consistency
    line_data_list = [] # list to append the various data values onto
    
    dataset = open(data_directory + datafile)
    for line in dataset:
        items = line.split(",")
        line_checking_list.append(len(items))
        line_data_list.append(items)
        num_lines += 1
    dataset.close() # to be polite.
    
    # detect anomalies first, before getting hopes up.
    set_check = set(line_checking_list)
    if len(set_check) == 1:
        print("
" + "="*17 + time.ctime() + "="*17) # simple divider + timestamp
        # we know the set is of size one and parameters per line
        num_parameters = list(set_check)[0] 
        CreateMesh(num_parameters, line_data_list)
    else: 
        print("There exists variance in the data, won't proceed")
        print("At least one line contains unexpected data")

InitFunction()