Blender and large sets of data

And how can he do that? With a 2.8 build?

Cheers.

Did he mention what version he uses?
Is the new depsgraph no longer available in the 2.8 build via the --enable-new-depsgraph command-line option?

It seems like I am not up to date.

Yes, this is how the blender importer works too. You can export an entire scene and load it back up. All the meshes are separate and each one has an Alembic cache modifier on it.
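For reference, a quick way to see those per-object cache modifiers from Blender's Python console (the .abc path is just a placeholder):

```python
import bpy

# Import the whole scene from an Alembic file (example path).
bpy.ops.wm.alembic_import(filepath="/caches/scene_export.abc")

# Every imported mesh is a separate object with a Mesh Sequence Cache
# modifier pointing back at the .abc file.
for obj in bpy.data.objects:
    for mod in obj.modifiers:
        if mod.type == 'MESH_SEQUENCE_CACHE' and mod.cache_file:
            print(obj.name, "->", mod.cache_file.filepath, mod.object_path)
```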

Yeah, if I’m ever having an issue with a scene I try exporting each object separately to see which one is causing the problem.
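Something like this rough bpy sketch can automate the per-object export (the output folder is just an example path):

```python
import bpy
import os

# Export every mesh object to its own .abc to narrow down which one
# breaks the import.
out_dir = "/tmp/abc_per_object"
os.makedirs(out_dir, exist_ok=True)

for obj in bpy.context.scene.objects:
    if obj.type != 'MESH':
        continue
    bpy.ops.object.select_all(action='DESELECT')
    obj.select_set(True)
    bpy.ops.wm.alembic_export(
        filepath=os.path.join(out_dir, obj.name + ".abc"),
        selected=True,  # export only the one selected object
    )
```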

If it is not animating, can you combine the objects into a few big objects? I think thousands of objects are harder for Blender to deal with than a few high-poly objects.
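A minimal sketch of that joining step with bpy, assuming all the static mesh objects can safely be merged into one:

```python
import bpy

# Join all mesh objects into a single object so Blender has one heavy
# mesh instead of thousands of small ones.
meshes = [o for o in bpy.context.scene.objects if o.type == 'MESH']
if meshes:
    bpy.ops.object.select_all(action='DESELECT')
    for o in meshes:
        o.select_set(True)
    bpy.context.view_layer.objects.active = meshes[0]
    bpy.ops.object.join()
```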

Reviving this to ask how to import an 11 GB .abc file into Blender 4.0 / 4.0.1? It freezes after 14%.
I am 300% sure there’s some sort of hack for importing .abc files, because the 4.0 import dialog is LONGER than the one in 4.1.
Question: how do you properly import an 11 GB Alembic file?

I would quite honestly suggest that … by asking to “import” an eleven-gigabyte file … you are simply asking too much of any software. You probably need to find a way to simplify the problem upstream.

What does this 11 gig file consist of?

Is it possible to split it up into smaller parts, sections, objects, etc.?

Nope. Not in Houdini. Or Maya. Or Clarisse.
So, no. Not “any software”.

It’s a liquid sim, with foam and spray.

I wonder if there are any other formats than Alembic for this.

OpenVDB?

Alembic files are usually very large. Though I’ve never worked with OpenVDB.

PS: No, Blender does not currently handle large data sets at any point along the chain.

Hopefully it will change in the future.

Another option might be to cache it out in smaller sections and stitch it together in Blender based on frame start and end of each file.
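One possible way to do that stitching, assuming hypothetical chunk files and frame ranges, is to import each chunk and keyframe its visibility to its own frame range:

```python
import bpy

# Hypothetical chunk files and frame ranges - adjust to whatever the
# sim package actually wrote out.
chunks = [
    ("/caches/sim_f0001_f0100.abc", 1, 100),
    ("/caches/sim_f0101_f0200.abc", 101, 200),
]

for path, start, end in chunks:
    before = set(bpy.data.objects)
    bpy.ops.wm.alembic_import(filepath=path)
    imported = set(bpy.data.objects) - before

    # Keyframe visibility so each chunk only shows up in its own range.
    for obj in imported:
        for prop in ("hide_viewport", "hide_render"):
            setattr(obj, prop, True)
            obj.keyframe_insert(data_path=prop, frame=start - 1)
            setattr(obj, prop, False)
            obj.keyframe_insert(data_path=prop, frame=start)
            setattr(obj, prop, True)
            obj.keyframe_insert(data_path=prop, frame=end + 1)
```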

Not too long ago I imported a ~15 GB physics sim into Blender, but since these were rigid bodies and I had already split them up into groups in Houdini, they came in as 3 separate files.
I had very few problems importing them (repeatedly, multiple versions for test renderings). Blender’s UI froze on import and it took a couple of minutes to get all of them in, but I had (only) 2 crashes out of 15+ attempts.
This was on a slightly older version, 3.3 or 3.4.
I have a fairly decent machine with 64 GB of RAM.
While I have no reason to assume that Alembic import regressed in newer versions, you could try an older version…

If it’s a liquid sim, then that means mesh data that changes topology and position every frame, with no instances that can be pushed around using just transforms - it’ll be very heavy either way.
While VDBs can cache mesh data, I’ve never heard anybody anywhere talking about using them for this kind of use case. Is VDB optimized for this?

This is actually the ideal use case for a USD-type workflow, where the data is referenced from disk at render time, but Blender doesn’t do that yet.
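For illustration, this is roughly what that deferred-from-disk idea looks like with the USD Python API (pxr); the file and prim names are placeholders, and Blender itself can’t consume the cache this way yet:

```python
from pxr import Usd

# Build a tiny master stage that points at the heavy cache without loading it.
stage = Usd.Stage.CreateNew("shot_master.usda")
sim = stage.DefinePrim("/World/LiquidSim", "Xform")

# A payload keeps the multi-gigabyte cache on disk until a consumer
# (typically the renderer) explicitly loads it.
sim.GetPayloads().AddPayload("liquid_sim_cache.usd")

stage.GetRootLayer().Save()
```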

That is a good idea and probably the best solution.
Isn’t it possible to write an Alembic cache not only as one big file, but as a frame sequence?
Shouldn’t that alone spread the load, instead of front-loading…?
I have no idea; what I do know is that it makes it easy to control the number of frames Blender loads by pointing it to a folder with the cache files and only keeping a chunk of them in there.
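A minimal sketch of that idea, assuming the cache was written as one .abc per frame (all paths are made up):

```python
import bpy

# Point Blender at the first file of a per-frame cache sequence
# (e.g. sim.0001.abc, sim.0002.abc, ...); only the files present in the
# folder ever get read.
bpy.ops.wm.alembic_import(
    filepath="/caches/per_frame/sim.0001.abc",
    is_sequence=True,
)

# The same flag is visible on the cache datablock after import.
for cf in bpy.data.cache_files:
    print(cf.filepath, "is_sequence:", cf.is_sequence)
```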
