Can Blender get a package/bundle file format?

File sizes on a project often get very large (>1GB), and every edit made to a file, even a change to the render settings, saves the entire file every time.

It makes granular version control difficult. Version control add-ons often save the entire .blend file, so a 1GB file with 100 checkpoints becomes a 100GB project. Even with version control that works on objects inside a .blend, saving the .blend saves every checkpoint again.

Some modern apps use package/bundle formats, where a bundle bit is set on a folder so that it behaves like a single file.

This would be a folder structure resembling the layout seen when switching the Outliner to ‘Blend File’, except that small items like scenes, cameras and lights could be grouped together:

The file would have an extension like .blendpkg.

Inside would be ordinary .blend files, but kept separate. Each mesh in a project could go in its own .blend file (e.g. project.blendpkg/Meshes/sculpt.blend), so if a project has a large sculpt object, it is saved in its own .blend file. Project settings such as render settings would go in another one. Every time the render settings are changed, only the settings are saved, not the sculpt.
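As a rough illustration (only the sculpt path above comes from the example; the other names are made up), the layout might look something like this:

```
project.blendpkg/
├── Meshes/
│   └── sculpt.blend           # large sculpt object, only rewritten when the sculpt changes
├── Settings/
│   └── render_settings.blend  # render and project settings, tiny and cheap to resave
└── Scene/
    └── scene.blend            # scenes, cameras, lights and other small items grouped together
```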

This enables a few things, such as asynchronous loading and saving: projects would open instantly and large objects would appear once they finish loading. It also enables granular version control: inside a project, a mesh could have multiple versions that can be reverted to in the same scene, and those versions wouldn’t be saved again when the main project is saved.

The format could optionally and transparently maintain a USD scene description inside the package. There is a desire for universal scene and material formats to get away from importing/exporting between every single program, but there is always the risk that a common scene description will be missing some data.

A package format alleviates that problem because the USD part doesn’t have to cover everything: all of the data would still be saved in the separate .blend files, and the USD would only describe as much as was implemented. It would still be enough to avoid an import/export step, because the USD would live inside the .blendpkg, other apps could read and write it directly, and the changes could be synced back to the .blend files inside Blender.

It could also maintain MaterialX (or game engine shader) equivalents when a .blend is saved. Even if the main Blender materials weren’t matched exactly, they would be a much better starting point than setting up shaders manually.

Addons would be able to save custom data into the .blendpkg when the file is saved.
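As a sketch of what such a hook could approximate today (the folder and file names below are hypothetical, and this writes a plain sidecar folder next to the .blend rather than into a real .blendpkg, which doesn’t exist):

```python
import json
import os

import bpy
from bpy.app.handlers import persistent


@persistent
def write_sidecar(*_args):
    if not bpy.data.filepath:
        return  # file has never been saved, nowhere to put the sidecar
    sidecar = bpy.data.filepath + ".pkgdata"  # hypothetical stand-in for a .blendpkg folder
    os.makedirs(sidecar, exist_ok=True)
    scene = bpy.context.scene
    # Only the render settings are written here, so changing them never rewrites heavy mesh data.
    settings = {
        "engine": scene.render.engine,
        "resolution": [scene.render.resolution_x, scene.render.resolution_y],
        "fps": scene.render.fps,
    }
    with open(os.path.join(sidecar, "render_settings.json"), "w") as f:
        json.dump(settings, f, indent=2)


def register():
    bpy.app.handlers.save_post.append(write_sidecar)
```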

It would also allow compressing individual elements, such as shape keys and meshes, to save a lot of space without having to compress everything.

This kind of file format could be made by an addon but it would be best as an internal file format so that it’s well supported across different Blender versions.

You can already do all this with linking. There’s no need to save everything in every individual file, just link it.

Linked objects are difficult to work with. Even basic things like posing a linked armature aren’t possible:

It’s not possible to see the edits of a linked object in place. The project has to be closed, the linked object edited and saved, and then the project reopened.

If a character is in a scene and has to pick up a jar, how would a linked armature be modified to do this? Would this animation be done in the character file and then linked in the scene?

Select the linked armature, Make Library Override, and animate?
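For scripting it, something along these lines should be equivalent (a minimal sketch; the armature name is a placeholder, and this assumes the object was linked directly rather than through a collection instance):

```python
import bpy

# Select the linked armature and make a library override of it,
# the scripted equivalent of the Make Library Override step mentioned above.
arm = bpy.data.objects["RIG_character"]   # placeholder name for the linked armature
arm.select_set(True)
bpy.context.view_layer.objects.active = arm

bpy.ops.object.make_override_library()
```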

2 Likes

Select the linked armature, Make Library Override, and animate?

Thanks, that works for animation and keeps the file size small, so this can be version controlled. Is there a way to do this for materials? All the buttons are greyed out.

If I switch the linked object’s material slot to Object instead of Data, I’m able to reassign and edit it, though this doesn’t feel like the way it’s supposed to be done. Shift-clicking the link button says it can’t be overridden, and I can’t add new materials.
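In script terms, the workaround I’m describing is roughly this (a sketch only; the names are placeholders and I’m not sure this is the intended path):

```python
import bpy

# Point the material slot at the object instead of the shared mesh data,
# so a local material can be assigned without touching the linked data-block.
obj = bpy.data.objects["Character"]                   # placeholder: the overridden/linked object
mat = bpy.data.materials.new("LocalOverrideMaterial")

slot = obj.material_slots[0]
slot.link = 'OBJECT'        # store the assignment on the object, not on the mesh data
slot.material = mat
```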

If I could assign new materials to library overrides, this would cover a lot of what I wanted from a package that can be version controlled. The goal is mainly to avoid saving the largest assets over and over, while keeping copies of changes in the small override files.

There’s an error when trying to weight paint on a linked mesh:

This was an attempt to use a linked mesh to test different rigs against it.

The materials are just shader networks, so already very small.

My first question is: why are your Blender files over 1GB? Are you embedding all the texture/image files, and if so, why?

Or are you baking large simulation data and storing that in the file? Again, why? That sort of thing is usually done last, so there’s only one instance/copy of it. If it’s the same simulation data that for some reason you want to use across multiple files, then save it out as a cache file and link it in.

Do you know that you can save Blender files as compressed data? It’s both a preference setting and a save option.
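For reference, both switches are also scriptable (a minimal sketch):

```python
import bpy

# The preference setting: compress on every save by default.
bpy.context.preferences.filepaths.use_file_compression = True

# The per-save option: equivalent to ticking "Compress" in the save dialog.
bpy.ops.wm.save_mainfile(compress=True)
```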

1 Like

I’m not an expert with linking; in fact I’ve only started trying to use it more in the past month. But based on what I’ve experienced, I’m wondering if you’re trying to use linking in extended ways that aren’t optimal (and possibly not intended).

For example, I don’t link a mesh into a scene, then link materials, then a rig, then combine them all together and start trying to edit the mesh, the materials, and the rig.

All of those edits are done in the original files.

The materials are just shader networks, so already very small.
My first question is: why are your Blender files over 1GB? Are you embedding all the texture/image files, and if so, why?

It’s shape keys and vertex groups that take up the space; textures are external.

Do you know that you can save Blender files as compressed data? It’s both a preference setting and a save option.

I have tested compression and it helps, but the file sizes are still large: one file is 750MB uncompressed and 480MB compressed. It’s a big saving, but the file is still way too big for version control, where I’d like to have files under 50MB.

I read a bug report about Blender’s compression where it didn’t save as much as it could because of the order data is written in, or something along those lines.

The main things I change a lot are rigs and weight paints, materials, lights, object placement etc. The meshes almost never change after a certain point.

An ideal setup would be a Blender file with the mesh, shape keys and vertex groups; this can be 400MB+.
That file would then be linked into a scene file where I can add materials, lights, rigs, animations etc. The scene file would be under 50MB, and the whole file could be version controlled.
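Something like this is what I have in mind for the linking step (a rough sketch; the path and names are placeholders):

```python
import bpy

# The heavy mesh file stays untouched; the small scene file just links it in.
asset_path = "//assets/character_mesh.blend"   # placeholder path, relative to the scene file

with bpy.data.libraries.load(asset_path, link=True) as (data_from, data_to):
    data_to.objects = list(data_from.objects)  # link every object from the asset file

for obj in data_to.objects:
    if obj is not None:
        bpy.context.scene.collection.objects.link(obj)
```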

I’d like to use version control like these:

The first makes full copies of the .blend files, so it’s too big. The second saves checkpoints inside the .blend, which could use less space, but saving the .blend saves all the checkpoints of every part.

Only the small parts that change often need to be saved and version controlled.

Hmm, without any subdivision modifiers etc., just how many polygons are we talking about here?

For example, here’s a character I’m currently working on. It’s not done yet, but even so it has a number of objects, a full animation control rig, a metarig, a mass of vertex groups on every object, etc. The one thing it currently doesn’t have is shape keys.

The above .blend file is 7MB. I handle folder/file/version control myself, using a naming system etc. that I developed. You can see the videos I made about that here: https://www.youtube.com/watch?v=4dgB-6K6_UE and https://www.youtube.com/watch?v=NCnmDKewrxc

2 Likes

While I think the library override system in Blender could be more flexible and user-friendly, we’ve been using it at work alongside the Asset Browser and so far it’s been working pretty well. Using linked assets, the heaviest .blend file (a whole town) is around 200MB. Here’s a simple graphic of how we handle things; I hope it’s clear enough and helps somehow.

As for updating just a section of a scene, it’s easy when you have everything linked, because you can right-click on the object (character, prop, etc.) in the Asset Browser editor, select Open Blend File, do whatever edit you need, save the asset file and then reload it in your main file (automatic reloading would be greatly appreciated, though).
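Until automatic reloading exists, the reload step can at least be scripted (a small sketch, assuming you want to refresh every linked file at once):

```python
import bpy

# Refresh every linked library in the current file,
# instead of reloading them one by one from the Outliner.
for lib in bpy.data.libraries:
    lib.reload()
```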

6 Likes

Hello,

It’s pretty weird that you’re getting these numbers. I just checked a production-ready character (a semi-realistic, cartoony character), rigged, with materials, shape keys etc… my .blend is about 8.7MB compressed, 32MB uncompressed…

Do you have any clue why yours are so huge?

About what you’re asking (a bundled .blend): it’s tricky, as the .blend file is a dump of memory, for faster saving/loading… It’s unclear how much work it would take to have everything split, and it’s likely to make loading much slower…
The idea has been brought to the table on a few occasions, but I wouldn’t hold my breath…

Anyway, if your animation project is organised the standard way, that is one file per asset, each of which is linked into some shot files, your concern should be more about renders than production files.

For instance, on that project: Boon & Pimento : Cartoon series made with blender
assets take 7.6GB (~2,700 files: .blend and textures)
shots take 32GB (~22,500 files: .blend, movie/storyboard, sound files), and this also includes some versioning on shots, so one shot could have anywhere from 2 to 40 versions…

That’s basically around 40GB of production files, and from memory the renders were more like ~2 terabytes of data, and it’s likely that we did some cleaning halfway through the project…
On top of that it was pretty lightweight, as we didn’t do layer/pass rendering, only one beauty pass straight from Blender. However, we kept several versions of each shot until an episode was considered done…

Hope that helps to put things in perspective!

Have fun!

3 Likes

Around 500k (20 meshes of ~25k polygons each). The following scene gives a good idea; it’s 500MB including textures with compression, but 318MB without textures and with no compression:

Using compression sometimes reduces the size, but not always, due to this issue:

Most of my files are 300-500MB with external textures. The larger ones, around 800MB, usually have multiple objects that would be better split apart and linked.

Even 300MB I find difficult to version control: 10 versions = 3GB.

That’s great, thanks for sharing this. So does the workflow go like this:

  • Model the meshes first, like PR_CAR_V1. Are materials applied here too, or just basic textures?
  • Rig that version of the mesh in a separate file. I guess this has to be a copy of the mesh file, as weight painting doesn’t seem to work on linked meshes. This produces PR_CAR_RIG_V1.
  • Copy the latest rigged mesh to the production folder without a version number.
  • Link these into layout files.
  • Copy the layout files to add animations. Is this using an armature override? If the rigging or materials are found to be wrong here, they would be corrected in the working folder as a new version, then the production folder updated and relinked. Do animation overrides still work if a different file is linked, or does it have to be the same file name?

Most of my files are 300-500MB, similar to the file linked above, just with more polygons. Ones larger than this have multiple objects that should be in separate files and linked.

Translating to a different format (like USD) would be slower, as it has to iterate over the data and convert it all on every save. I’d expect that using .blend files would have much less overhead, since a whole data block could be copied in one go to another file in the same format.

The main data would still be a memory dump, with a placeholder/pointer wherever there is a big block of data, and that block would be dumped into a separate file.

Using linking would give much of the same benefit as long as materials and rigging can be done this way. A single file just makes it easier to avoid the overrides and linking process.

The USD workflows that companies have been adopting sound like the way to go in the long term. It would be nice to model a mesh in Blender, save a USD, open it in Substance, save the textures and materials into the USD and just reload it in Blender. Then rig the mesh, add some animations, save the USD and open it in a game engine. No intermediate formats.
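On the Blender side, at least, both ends of that round trip already exist as operators (a minimal sketch; the path is a placeholder and how much survives the trip depends on each application’s USD support):

```python
import bpy

# Export the current scene to USD, then bring the (possibly edited) USD back in.
usd_path = "/tmp/character.usdc"   # placeholder path

bpy.ops.wm.usd_export(filepath=usd_path)
# ... edit the USD in another application ...
bpy.ops.wm.usd_import(filepath=usd_path)
```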

Yes, materials are made in the asset file.

Correct, the rigs are made on a copy of the modeling file.

Yes, we use library overrides (including rigs) for everything. You can make as many changes as you need to your original files, and any other file that links them will be updated without issues; as long as you keep the names the same (collections, objects, bones, materials), there’s no problem. Even if you need to replace an asset with another one, you can just use Relocate and it will work.

2 Likes

This sounds like the ideal solution in theory, but in my experience it’s still a long way from being the case, unless you can develop your own system to use USD between programs.

We tried this at the studio too, but unfortunately every program is developing its own version of USD, and even if it’s based on the original open source version provided by Sony, there’s no guarantee that one program’s USD implementation will match another’s. So in reality it’s pretty much like using different formats.

2 Likes

OK, I think there’s some confusion going on here.

So I downloaded that Einar file, and yes, as a ZIP it’s a 550MB archive, which is the same overall size once uncompressed, given that all the data in that ZIP is already compressed, including the .blend file.

So leaving the .blend file saved with its standard application compression (which you may as well), it’s 87MB. So 10 versions of that isn’t even 1GB.

Given that we are dealing with content creation here, on an overall scale, that’s still a pretty small amount of data.

2 Likes

Well keeping file sizes down is definitely a benefit of linking.

I would not call it the goal.

That said, there is no way you can manage data if you are making huge files that have to be versioned.

On the other hand there is almost no way to do important complex work if you don’t.

The question is where those larger files live on the chain.

And then there is the question of how to manage and edit linked files with overrides.

Then how to manage the entire pipeline so that it is nimble and functional.

So there is a lot to learn and study on this subject.

Some good tips here.

But also lots of information here:

2 Likes

USD is not the Holy Grail here, I’m sorry to say.

Someone reported it was developed by Sony?

All mentions of it online since the beginning credit Pixar.

Pixar often develops in-house tools to help production.

That’s fine, but a tool needs to be developed for the public at large to be useful. Their RenderMan product, for example, is worth pointing out: it took years before it was broadly adopted.

Arnold is another example.

Often it is the “elephant in the room” factor that kills these initiatives before they even start.

It’s all well and good for a group of studio leads and pundits to sit in a room and expound the virtues of interoperability.

But it always falls apart once the idea leaves the ivory tower.

Because the broader development community is made up of free thinking individuals who have been motivated by creativity.

Creative approaches and unique solutions to the same problems.

And that is why no two programs process the data in the same way.

And it is the intrinsic reason a public solution to this private problem will never be successful.

That is the elephant in the room.

Probably the only reason it is open source and promoted as a public initiative is for legal reasons that have nothing to do with the practical purpose of being adopted in a realistic universal way.

It will remain the province of the tightly knit group who developed it for their own mutual benefit.

But considering what I just pointed out, that is also a wasted effort.

My opinion of course.

Your mileage may vary.

I am not a fan of these automatic interop attempts, since even Autodesk failed to get them to work between apps they themselves were developing.

Been there, tried that.

MotionBuilder and Softimage were so fundamentally different that even Autodesk could not make their interoperability tools actually work.

So I switched to Maya.

I was kind of forced to. I was happy with Softimage.

But I wanted to build a MotionBuilder pipeline between that and my main DCC so that I could share rigs and animations.

It is probably worth pointing out that even the “send to Maya” and “send to MotionBuilder” interop tools were far from perfect. They were easy to break, and some features did not work at all.

Complaints about this got me invited to the MotionBuilder beta.

But nothing was ever fixed and the beta was eventually closed.

So that is my experience.

So to me it is a bit ironic that the reason you need something like USD is also the reason it won’t actually work universally.

EDIT Again!

Not done yet. It is also worth pointing out that Tangent studios went under not only, but largely, because they thought it would be a good idea to build a USD pipeline between Maya and Houdini.

Months of smart developers working on the problem and no solution before the studio was shut down and eventually went bankrupt.

4 Likes

Materials currently cannot be overridden; that’s the main limitation. I don’t know if it is being actively worked on, but if you check the code blog you’ll see a recent post about overrides and their future improvements.

3 Likes

Right, my bad. USD was released by Pixar; I often mix them up because Sony developed and released the Alembic format. :man_facepalming:

2 Likes