Why Blender needs per-vertex color support

I’m pissed off… sorry.

I’m working on integrating photogrammetry tools into Blender. (Photogrammetry = getting 3D data from images.)

Basically, a 3D scanner from images.

Everything works fine, but the main issue is: Blender has no vertex colors for single vertices… Concretely, this means these point clouds need to be cleaned up by deleting “noise” vertices. I’d like to do this in Blender, but without color on those points it’s almost impossible to do…

This single “issue” renders Blender pretty much completely useless for anything point cloud related, which is IMO a shame (together with normals being recalculated every time you enter edit mode).

I’m far from the first to be confronted with this issue - I have seen forum entries dating back to 2006…

So why the hell don’t we have color for single vertices??? Is this a very difficult thing to do?

Nowadays, with lidar scanner data widely used in VFX, I don’t think point clouds are anything that exotic…

I’m really about to ditch Blender for this kind of thing… Which I think is too bad, since we could have nice 3D models from photos within a few weeks (for OS X at least atm).

I really don’t know how to go about things like this… Is there any use in contacting the BF about this? What do you think? Should we somehow make some noise about this? Or should I just leave Blender for such things?

Someone correct me if I’m wrong, but doesn’t the newly revamped developer/bugtracker page allow suggestions as well? I’d imagine having this type of point cloud functionality would be immensely useful, especially with the upcoming Gooseberry project.

good luck… and fingers crossed:)

Here’s a link to a submission on the bugtracker regarding what you’re talking about, but it was closed since it’s considered a feature request rather than a bug


and a patch


hope this "point"s you in the right direction.

Thanks for the heads up.

I have not found any place for suggestions on developer.blender - those pages are all meant to be used by developers/module members…

The patch is just for generating point clouds…

So why the hell don’t we have color for single vertices???

The “vertex colors” are actually properties of face corners (loops), not vertices, so that adjacent faces can have different colors at a shared vertex without splitting up the mesh. As a consequence, there cannot be a “vertex color” without a face.
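To make that distinction concrete, here is a minimal sketch in plain Python (no bpy - the data layout and function name are hypothetical stand-ins, not Blender’s actual structures) of what collapsing per-face-corner colors down to true per-vertex colors looks like, and why the conversion is destructive:

```python
from collections import defaultdict

def corner_colors_to_vertex_colors(faces, corner_colors):
    """Average per-face-corner (loop) colors down to one color per vertex.

    faces:         tuples of vertex indices, e.g. [(0, 1, 2), (0, 2, 3)]
    corner_colors: one (r, g, b) per corner, flattened in face order
    Returns {vertex_index: (r, g, b)}.
    """
    sums = defaultdict(lambda: [0.0, 0.0, 0.0])
    counts = defaultdict(int)
    corner = 0
    for face in faces:
        for v in face:
            r, g, b = corner_colors[corner]
            sums[v][0] += r
            sums[v][1] += g
            sums[v][2] += b
            counts[v] += 1
            corner += 1
    # Corners that disagreed about a shared vertex get averaged away:
    # this is the lossy part of the per-corner -> per-vertex conversion.
    return {v: tuple(s / counts[v] for s in rgb) for v, rgb in sums.items()}
```

Any vertex whose corners disagree ends up with a blended color, which is exactly why converting between the two representations is ambiguous and destructive.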

It bothers me as well, because nobody else (that I am aware of) does vertex colors this way, and it makes converting from one representation to the other ambiguous and destructive. Also, while the data is stored as an 8-bit-per-channel color, querying it through the API yields floating-point values (AFAIR Cycles uses those, so it’s four times the data at no benefit). It would be useful if the vertex colors were float-precision, not byte-precision.
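As a small illustration of the byte-vs-float point (assuming a 0..1 channel range and ordinary rounding - this is not Blender’s actual code path):

```python
def quantize(f):
    """Map a float color channel in [0, 1] to the stored 8-bit value."""
    return max(0, min(255, int(round(f * 255))))

def dequantize(b):
    """Map a stored byte back to the float the API hands out."""
    return b / 255.0
```

A float channel survives the round trip only to the nearest 1/255, so handing out 32-bit floats that were quantized to 8 bits really is four bytes spent on one byte of information.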

The CustomData layer system actually allows for arbitrary per-vertex, per-face and per-edge properties, it’s more a matter of how the tools use that data. Ideally, all per-vertex data (weights and colors) could be interpreted and used as color or weight.

Is this a very difficult thing to do?

It would require changing pretty much all tools to handle both cases (per-face and per-vertex colors).

EDIT: By the way, Blender currently has the opposite issue with normals, which are per-vertex. If you had per-face normals, you could have sharp (or semi-sharp) edges without splitting the mesh.

Bastien is working on loop normals:

Currently Blender makes no pretense of being able to handle point clouds usefully. We don’t have it because no developer has chosen to sit down and make it work really well - adding a color per vertex is trivial, but then you want tools to manipulate it, APIs, and viewport display… There’s nothing especially hard about it; it’s just not a priority for any devs, AFAIK.

A tip: use vertex groups. Make three groups - red, green, and blue - and you can store the RGB color per vertex as a weight in each group.

For a second I thought this would work - to display them in color! :wink: But you can only activate one vertex group at a time, right? So it could at least be used to store the data.

I really need a way to display the color in order to clean those point clouds.

Edit: Plus, vertex group weights don’t get displayed either on vertex-only meshes (no faces).

Completely useless for imported point cloud data, though…

You mean Andi’s comment or Vertex Color per Vertex?

For the time being, can you cut your model into smaller pieces, add faces, then do your clean-up?
There is a function in Blender to add faces to a point cloud, but it’s not very powerful!

happy bl

You can assign a vertex to many vertex groups.
And I think it is possible to write a shader that will render the vertices by color from the vgroups.
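A rough sketch of that workaround in plain Python (the group names and data layout are hypothetical - this is not bpy’s actual vertex group API):

```python
def rgb_to_groups(points_rgb):
    """Split per-point (r, g, b) colors (floats in 0..1) into three
    weight maps, one per hypothetical vertex group 'R', 'G', 'B'."""
    groups = {"R": {}, "G": {}, "B": {}}
    for i, (r, g, b) in enumerate(points_rgb):
        groups["R"][i] = r
        groups["G"][i] = g
        groups["B"][i] = b
    return groups

def groups_to_rgb(groups, index):
    """Recombine the three weights into a displayable color - the job
    a custom shader would have to do per vertex."""
    return (groups["R"][index], groups["G"][index], groups["B"][index])
```

Storage-wise this works (a vertex can belong to many groups), but display is the sticking point the thread keeps running into: something still has to recombine the three weights into one color on screen.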

I meant adjusting vertex group colors after the fact, considering that you’d still be throwing away all of the captured color data.

This isn’t really practical. The point clouds you are dealing with when working with dense stereo photogrammetry or lidar can easily contain hundreds of millions of points. As a concrete example, the last set I scanned for the show I’m currently on took about half a day and generated in the neighborhood of 5 billion points of raw data. That was reduced to about 300 million points which were used to generate LOD meshes with 2-40 million faces that were sent to the downstream vendors for camera tracking and set extensions. That was by no means a large set, we collected about the same amount of data at another location in the morning and probably twice as much the day before.
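For anyone curious what that kind of reduction looks like, a common first step is a voxel-grid downsample. Here is a minimal pure-Python sketch (an illustration only, not taken from any of the tools mentioned here):

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Collapse all points falling in the same cubic voxel to their centroid.

    points: iterable of (x, y, z) tuples
    Returns one representative point per occupied voxel, so the output
    size depends on the scanned volume, not the raw point count.
    """
    bins = defaultdict(list)
    for p in points:
        key = tuple(int(c // voxel_size) for c in p)
        bins[key].append(p)
    out = []
    for pts in bins.values():
        n = len(pts)
        out.append(tuple(sum(p[i] for p in pts) / n for i in range(3)))
    return out
```

Real pipelines add spatial indexing and out-of-core chunking, but the principle is the same: billions of raw samples collapse to however many voxels the set actually occupies.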

As noted earlier, point cloud support has been brought up before, both here and on the devel list, but none of the developers seem interested. I think that is a little unfortunate considering how common lidar is in VFX and the benefits it can bring to things like camera tracking and modeling workflows. I do, however, have to disagree with bashi’s wish to turn Blender into a point cloud editor. Doing that well requires a lot of specialized features that I’m not sure would fit well into the rest of Blender’s framework. The last thing Blender needs is yet another half-finished toolset. That being said, the specific features he is asking for - the ability to have unconnected vertices (particles) that can store and render arbitrary attributes such as color or normals - are much less invasive and would go a long way towards making point clouds useful in Blender. Again, think survey reference for camera tracking or modeling.

bashi, have you looked at CloudCompare? Currently it is the best open-source option for working with and editing point clouds. The tools for manually editing point clouds the way you need are currently not great; however, the developers seem open to suggestions, and this was something I was just commenting on in their forum [1]. I can also recommend commercial tools if you want, although expect five-figure price tags.

[1] http://www.danielgm.net/cc/forum/viewtopic.php?f=14&t=191

I’ve heard that some of these data sets can be quite large.
My only concern is: can Blender work with billions of verts in the first place?


For this to work efficiently it would make sense to define a new object type. It could still define per-vertex customdata and vertex groups and use shape keys, but there would be no need for normals (though a float[3] could still be added as a customdata layer).

Ideally it could, but it wouldn’t need to. This kind of data is almost always going to get preprocessed in other applications, so there is plenty of room for splitting things up into manageable chunks. GPUs are also pretty good at drawing point clouds. One of my colleagues mocked up the most basic OpenGL point cloud viewer you could imagine: basically just load points from disk and draw them to screen, no clever caching, no decimation, no fancy tricks. It was easily able to push a few hundred million points on a card with only 1 GB of VRAM before memory filled up and navigation slowed to a crawl. Being able to work with point clouds of that scale is definitely useful.

A lot of applications use normals to shade the point cloud based on the scene lighting so you can apply a solid color and focus on the geometry without regard to any per point color info that may exist. This is very useful in much the same way that Solid mode fills a different need than Texture mode for meshes. CloudCompare also has an implementation of Eye-Dome Lighting [1] that achieves much the same effect but doesn’t require normals. That would probably be fine for working in the viewport but wouldn’t be much help if you wanted to render the point cloud and have the shading match the lamps in your scene.

[1] http://www.kitware.com/source/home/post/9

@jedfrechette - interesting to know normals are used, so they could be an optional display option (like color).

Rather than new object types, a large update to the particle system is in the works, using Alembic - so perhaps point clouds can be static particles. That may be a better fit than having to add a new object type, and the ability to edit particles will help other areas too.

I haven’t been following this closely but have been looking forward to the results. I think you’re right: static particles are the way to go. Certainly that’s how I’ve seen people handle it most successfully in other applications. Here’s an example with partio+Maya+Arnold [1], and I’ve done much the same thing in Houdini.

I thought I would throw up this comparison I did a while ago of a few different point rendering methods. Normals-based shading probably gives the cleanest representation, but it is nice to have other options since, depending on the source, there is no guarantee you will get normals.

[1] http://blog.faro.com/2013/07/25/luma-pictures-and-the-focus3d-laser-scanner/