Mesh cleaning script


(noidtluom) #1

I have this little snippet which cleans all the objects in a file:


import bpy

selected_objects = bpy.context.selected_objects
for selected_object in selected_objects:
    bpy.context.scene.objects.active = selected_object
    bpy.ops.object.editmode_toggle()
    bpy.ops.mesh.remove_doubles()
    bpy.ops.mesh.tris_convert_to_quads()
    bpy.ops.mesh.normals_make_consistent()
    bpy.ops.object.editmode_toggle()

Unfortunately it is quite slow for a file with, say, several thousand objects. Is there a way to parallelise, or make this faster?


(Oscalon) #2

I tried it out, but it doesn’t seem to actually remove any double verts; it always reports “Removed 0”.

EDIT: Seems to work in some files but not others. No clue why.


(FreeAccess) #3

Hello,

To convert FBX files containing many objects, I use a script that runs in a few seconds:
- convert tri to quad
- [Addon HardOPs] ==> Operation / Clean Mesh

This may be useful . . . :yes:



(TeaCrab) #4

Remove Doubles works on a distance basis: if two verts are farther apart than the threshold given to the ‘remove doubles’ operator, they won’t be merged, and nothing is removed.
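That distance rule can be sketched in plain Python. The `would_merge` helper below is illustrative only (not Blender API); the operator’s actual knob is the `threshold` parameter of `bpy.ops.mesh.remove_doubles()`:

```python
import math

def would_merge(v1, v2, merge_distance=0.0001):
    """Mimic the rule Remove Doubles uses: two verts are merged
    only if their distance is within the threshold."""
    return math.dist(v1, v2) <= merge_distance

# Verts 0.00005 apart merge at the default threshold of 0.0001...
print(would_merge((0, 0, 0), (0.00005, 0, 0)))  # True
# ...but verts 0.5 apart do not, so the operator reports "Removed 0".
print(would_merge((0, 0, 0), (0.5, 0, 0)))  # False
# A larger threshold (e.g. for scenes modelled at a large scale) catches them:
print(would_merge((0, 0, 0), (0.5, 0, 0), merge_distance=1.0))  # True
```

This is likely why the script works in some files and not others: a file modelled at a larger scale needs a larger threshold.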

Converting tris to quads using the automatic algorithm might not be a good idea unless you know the result is going to be what you want.

It also might depend on the scene size.


(-HENDRIX-) #5

It’s worth noting that calling operators often forces a full scene update after each call.

If you write the equivalent code with low-level data access, you can bypass those updates and do a single update at the end.
Basically, you would have to rewrite each operator using “normal” data access. This inevitably means writing much more code, but it can be worth the effort if you need to do such a task often.
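A low-level rewrite along these lines could use the `bmesh` module, which edits mesh data directly with no operators and no edit-mode toggles. This is only a sketch, runnable inside Blender only, assuming the 2.7x-era API (the `bmesh.ops` names below exist in 2.79); the threshold values are guesses at the operator defaults:

```python
import bpy
import bmesh

def clean_mesh(mesh, dist=0.0001):
    # Build a BMesh from the mesh datablock -- no mode switches needed.
    bm = bmesh.new()
    bm.from_mesh(mesh)
    # Rough equivalent of bpy.ops.mesh.remove_doubles():
    bmesh.ops.remove_doubles(bm, verts=bm.verts, dist=dist)
    # Rough equivalent of bpy.ops.mesh.tris_convert_to_quads()
    # (0.698 rad is roughly the operator's 40-degree default):
    bmesh.ops.join_triangles(bm, faces=bm.faces,
                             angle_face_threshold=0.698,
                             angle_shape_threshold=0.698)
    # Rough equivalent of bpy.ops.mesh.normals_make_consistent():
    bmesh.ops.recalc_face_normals(bm, faces=bm.faces)
    # Write the result back and free the BMesh.
    bm.to_mesh(mesh)
    bm.free()

for ob in bpy.context.selected_objects:
    if ob.type == 'MESH':
        clean_mesh(ob.data)

# One scene update at the end, instead of one per operator call.
bpy.context.scene.update()
```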


(dudecon) #6

True. Though there’s some low hanging fruit to harvest first.
The code as-written operates on each object regardless of whether it is even a mesh at all.
It also operates on every instance of each block of mesh data, so if you have lots of duplicates (meshes that share the same data) it will waste a bunch of time cleaning the same data over and over.
And finally, the failure to remove some doubles is because the “remove doubles” operation only operates on selected vertices. So we need to select all vertices first.
My version, incorporating these changes, is as follows:

import bpy
checked = set()
selected_objects = bpy.context.selected_objects
for selected_object in selected_objects:
    if selected_object.type != 'MESH':
        continue
    meshdata = selected_object.data
    if meshdata in checked:
        continue
    checked.add(meshdata)
    bpy.context.scene.objects.active = selected_object
    bpy.ops.object.editmode_toggle()
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.mesh.remove_doubles()
    bpy.ops.mesh.tris_convert_to_quads()
    bpy.ops.mesh.normals_make_consistent()
    bpy.ops.object.editmode_toggle()

If this still runs too slowly for you, consider -HENDRIX-’s suggestion to rewrite the ops calls with low-level data access. The remove_doubles operator is probably the main culprit, as it prints its report to the console, which takes forever.
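If the console printing really is the bottleneck, one lighter workaround than rewriting the operators is to silence stdout around the ops calls. A sketch, assuming a POSIX-style file descriptor 1 (the operator prints come from C code, so redirecting `sys.stdout` alone is not enough):

```python
import contextlib
import os
import sys

@contextlib.contextmanager
def suppress_stdout():
    """Temporarily redirect the C-level stdout (fd 1) to os.devnull,
    silencing operator report prints, then restore it."""
    sys.stdout.flush()
    saved_fd = os.dup(1)
    devnull = os.open(os.devnull, os.O_WRONLY)
    os.dup2(devnull, 1)
    try:
        yield
    finally:
        sys.stdout.flush()
        os.dup2(saved_fd, 1)
        os.close(devnull)
        os.close(saved_fd)

# Usage inside the loop above:
# with suppress_stdout():
#     bpy.ops.mesh.remove_doubles()
```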