I’m writing an addon to operate our OEM vehicle ingestion workflow.
The workflow attaches lots of custom data via collection properties on Collection, Scene, and Mesh datablocks (i.e. tons of pointer / collection properties).
I wish I could share a scene, but the data is under NDA and cannot be shared.
I wrote out a 14.3 GB crash dump from running this under Visual Studio in debug, but I'm not sure it would be of use on its own.
Sadly the Blender codebase looks very foreign to me given my lack of C++ knowledge (I primarily use Python and C#), and my attempts to debug this have hit a wall.
Maybe this screencap of the call stack will give someone some insights?
The issue seems to occur when adding the material information to the mesh data (not to the material slot, just to my own custom collection property).
My custom collection property has three members:
[PointerProperty(type=bpy.types.Collection), PointerProperty(type=bpy.types.Collection), PointerProperty(type=bpy.types.Material)]
And each object may have 0 or more entries in this collection (most have < 3).
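For reference, a minimal sketch of how such a property group would be declared; class and attribute names here are placeholders, not my actual addon code:

```python
import bpy

# Placeholder reconstruction of the collection property described above:
# two Collection pointers and one Material pointer per entry.
class MaterialLink(bpy.types.PropertyGroup):
    group_a: bpy.props.PointerProperty(type=bpy.types.Collection)
    group_b: bpy.props.PointerProperty(type=bpy.types.Collection)
    material: bpy.props.PointerProperty(type=bpy.types.Material)

def register():
    bpy.utils.register_class(MaterialLink)
    # Zero or more entries per mesh, as described above
    bpy.types.Mesh.material_links = bpy.props.CollectionProperty(type=MaterialLink)
```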
The crash occurs when I'm very nearly done with my dataset. We're talking ~20 meshes left out of 2,000 to 5,000 meshes (10,000 originally, but I consolidate duplicates found in the CAD).
I've tried multiple datasets and dozens of permutations of the order the groups are processed in, so it does not seem like the geometry itself is at fault (i.e. there is no "bad" mesh lurking in the data). The only constant is that at some point I get a stack overflow in blender.exe, either while building the data or, if processing completes, when the file tries to save to disk (regardless of whether the data is set by script or by hand).
The other data (like that on my 500+ bpy.types.Collection datablocks) is, in my opinion, far more expansive and dense, but for some reason all of that goes through fine.
I know that without the addon or the files I'm using (which I can't share) this is more than tricky to diagnose. What I'm looking for is others with similar experiences, and perhaps some guidance on how to proceed. It feels like an issue in the codebase that likely affects others working with larger datasets, given how predictable it is regardless of my input and across so many Blender versions.
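In case it helps anyone try to reproduce this without my NDA data, here is a rough self-contained sketch that builds synthetic meshes with a placeholder version of the property group and then saves the file. All names and counts are assumptions (my real data is different), and it must be run inside Blender:

```python
import bpy

# Placeholder property group mirroring the layout described above
class MaterialLink(bpy.types.PropertyGroup):
    group_a: bpy.props.PointerProperty(type=bpy.types.Collection)
    group_b: bpy.props.PointerProperty(type=bpy.types.Collection)
    material: bpy.props.PointerProperty(type=bpy.types.Material)

bpy.utils.register_class(MaterialLink)
bpy.types.Mesh.material_links = bpy.props.CollectionProperty(type=MaterialLink)

# Synthetic stand-ins for the real CAD data
col_a = bpy.data.collections.new("GroupA")
col_b = bpy.data.collections.new("GroupB")
bpy.context.scene.collection.children.link(col_a)
bpy.context.scene.collection.children.link(col_b)
mat = bpy.data.materials.new("SyntheticMat")

# A few thousand meshes, each with 0..3 pointer-heavy entries,
# roughly matching the distribution described above
for i in range(3000):
    mesh = bpy.data.meshes.new(f"mesh_{i}")
    obj = bpy.data.objects.new(f"obj_{i}", mesh)
    col_a.objects.link(obj)
    for _ in range(i % 4):
        entry = mesh.material_links.add()
        entry.group_a = col_a
        entry.group_b = col_b
        entry.material = mat

# Saving is one of the points where the overflow hits for me
bpy.ops.wm.save_as_mainfile(filepath=bpy.app.tempdir + "repro.blend")
```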
Maybe the C++ loops over objects seen in the call stack recurse too deeply for the stack size once this much pointer data and this many properties are involved?
The overall size of the .blend file is only about 3 GB, assuming it saves to disk without crashing (i.e. partially processed), so we're not talking about that much geometric data really.
Thank you for any insights you can provide!
Windows 10 x64
Blender 3.1, 3.2, 3.3, 3.4alpha