Transforming UV coordinates gets slower

I use a Python script to transform the UV coordinates of an object, and it works great; however, it gets a little slower with each iteration. The first run takes about 100 ms to execute, and this increases by a few milliseconds each time I run it. Before long it’s taking seconds to update.

The code that updates the UVs is as follows; as you can see, I’m timing only the section that updates the UV coordinates.

    objectData = object.data
    object.data = None   # detach the mesh data from the object while editing

    start = time.time()

    # Transform every UV coordinate in the active UV layer.
    for uvData in objectData.uv_layers.active.data:
        uv = uvData.uv.to_3d()   # pad to 3D (z = 0) so the matrix can be applied
        uv.y = 1.0 - uv.y        # flip V before the transform
        uv = uvMatrix @ uv
        uv.y = 1.0 - uv.y        # flip V back
        uvData.uv = uv.to_2d()

    print("D: ", time.time() - start)

    object.data = objectData   # re-attach the mesh data

The program does the following steps in a loop (a rough code sketch follows the list):

  • Copy the object.
  • Update the UVs.
  • Render.
  • Delete the object and its associated mesh.
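
For reference, a rough sketch of what that outer loop might look like in bpy; the names source_obj, transform_uvs, and the output path are placeholders of mine, not from the actual script:

    import bpy

    source_obj = bpy.data.objects["Template"]   # hypothetical source object

    for i in range(1000):
        # Copy the object and its mesh, and link the copy into the scene.
        copy = source_obj.copy()
        copy.data = source_obj.data.copy()
        bpy.context.collection.objects.link(copy)

        # Update the UVs (the timed snippet above, wrapped in a function).
        transform_uvs(copy, uvMatrix)   # hypothetical wrapper

        # Render a still image.
        bpy.context.scene.render.filepath = f"//render_{i}.png"
        bpy.ops.render.render(write_still=True)

        # Delete the object and its associated mesh.
        mesh = copy.data
        bpy.data.objects.remove(copy)
        bpy.data.meshes.remove(mesh)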

I have confirmed that the number of UVs that need updating each iteration is the same.
I have printed the sizes of all the collections in bpy.data (and of those in the scene), and they do not increase, so I don’t think it is leaking objects, meshes, cameras, etc.
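
For completeness, the check itself can be done generically, something like:

    import bpy

    # Print the size of every collection hanging off bpy.data
    # (objects, meshes, images, materials, ...).
    for attr in dir(bpy.data):
        value = getattr(bpy.data, attr)
        if isinstance(value, bpy.types.bpy_prop_collection):
            print(attr, len(value))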

As far as I can tell, everything is the same each iteration; it just gets consistently slower, and I don’t understand why.

I should have said: my Blender version is 3.1.2, and I’m using it to generate large numbers of relatively small images for use in machine learning.

I don’t think anything in the provided snippet is the cause of the increased time.

Each copied object is automatically added to bpy.data.objects, and copied mesh data to bpy.data.meshes.
If you’re not explicitly deleting the copies, it’s possible the number of objects you’re processing simply increases each time.

Note that even if copies don’t appear in the scene, they can still reside in bpy.data.objects.
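
A quick way to check for that, as a sketch, is to look for datablocks with zero users:

    import bpy

    # Datablocks with no users stay in bpy.data (until a save or purge)
    # even though they are not linked into any scene.
    orphan_objects = [o.name for o in bpy.data.objects if o.users == 0]
    orphan_meshes = [m.name for m in bpy.data.meshes if m.users == 0]
    print("orphan objects:", orphan_objects)
    print("orphan meshes:", orphan_meshes)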


That’s what I was expecting: a collection of some kind getting bigger. But I print bpy.data.objects and bpy.data.meshes, and they have the same number of items in them each iteration.

I have timed every line in the script, and it’s that for loop that gradually takes longer. I do wonder if objectData.uv_layers.active.data is actually a map lookup that is gradually getting slower, rather than the loop itself.
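
Splitting the timing should confirm or rule that out; roughly:

    import time

    # Time the RNA lookup and the bare iteration separately,
    # to see which part is actually growing.
    t0 = time.time()
    uv_data = objectData.uv_layers.active.data
    t1 = time.time()
    for uvData in uv_data:
        pass
    t2 = time.time()
    print("lookup:", t1 - t0, "iterate:", t2 - t1)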

I will try to build Blender locally and see if I can break in that loop to see what it’s actually doing.

Thanks for the help.

UV loops are custom data layers. The underlying data is stored in C arrays, and the RNA/Python-facing loops themselves are done using C iterators.

Blender does use hashing for ID types like bpy.types.Object. Each copy creates a new ID, which Blender must ensure has a unique entry in the ghash, and that check requires looping over the existing IDs. Over many copies this is probably closer to an O(N²) operation, where N is the number of ID instances in the blend file in memory.
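
If that is the cause, it should show up in isolation; a rough micro-benchmark sketch (the object name is a placeholder):

    import bpy
    import time

    src = bpy.data.objects["Template"]   # hypothetical object name

    # Copy in batches without freeing anything; if ensuring ID uniqueness
    # is O(N), each batch should take longer than the previous one.
    for batch in range(5):
        start = time.time()
        for _ in range(1000):
            src.copy()
        print("batch", batch, "took", time.time() - start)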

It might be that copying data is tagging something dirty and the RNA access on the UVs is triggering an update, but I’m not sure without looking.

Reading, modifying, and writing UV coordinates can be made faster by using NumPy. Blender’s property collections support bulk reads and writes through NumPy arrays via the foreach_get and foreach_set methods.
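
A minimal sketch of that approach, reusing objectData and uvMatrix from the snippet above and assuming uvMatrix is a 3x3 mathutils.Matrix:

    import numpy as np

    uv_data = objectData.uv_layers.active.data
    count = len(uv_data)

    # Pull all UVs out in one C-side call instead of per-item RNA access.
    flat = np.empty(count * 2, dtype=np.float32)
    uv_data.foreach_get("uv", flat)
    uvs = flat.reshape(count, 2)

    # Same maths as the loop above: flip V, apply the matrix, flip back.
    # to_3d() pads with z = 0, so the third component here is 0 as well.
    uvs[:, 1] = 1.0 - uvs[:, 1]
    m = np.array(uvMatrix)                     # 3x3 matrix as an ndarray
    uvs3 = np.c_[uvs, np.zeros(count)] @ m.T   # row vectors, hence m.T
    uvs = uvs3[:, :2]
    uvs[:, 1] = 1.0 - uvs[:, 1]

    # Push everything back in one call.
    uv_data.foreach_set("uv", uvs.astype(np.float32).ravel())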