Can Blender get a package/bundle file format?

Just jumping in quickly here, wanting to keep it brief and simple, not just for the original question but for others reading the thread who might be confused about how this is done in an animation pipeline.

The answer to how to modify a library linked character rig for a specific action in a particular scene is to use links and constraints along with animation baking.

A phrase I was taught early on in 3D studio animation was this: “Never hack the rig.” If you try to do that as an animator on a studio production with a linked rig, it will most likely cause big problems, unknowns and instabilities.

Basically the library linked rig needs to stay a closed, locked-in system and not be messed with except by the designated animation TD. In many cases the rig can be modified and improved during production in a general way, and this is a big advantage of the library linked system. It also gives the option of an overlapping schedule, where animation production can start with earlier, simple versions of the rigs while the final render rigs are still being worked on and improved concurrently. And of course it allows for lighter, optimised versions of the rigs during animation that can be swapped out for heavier, more detailed ones at render time.

From an animator’s standpoint, modifying the rig for a particular scene normally means working with linked and constrained objects on top of the rig’s own internal locked and closed hierarchy. Later this animation can be baked so it exists purely as keys, and the constraints and links removed. So another good phrase to keep in mind when animating a library linked rig in a scene is “Never hard parent. Only use constraints.”

When I have worked in character animation productions we used constraints and baking all of the time. It’s hugely useful as well for animating walk cycles along paths and combining these sorts of actions into bigger actions. I always found this method very useful and even vital with quadruped walks and runs. It’s a part of the process I see least discussed though.

Anyway that was much more than I was meaning to write. Time to go.

5 Likes

Because it is not that much of a hit on file size, if you must make material modifications on linked assets in the final scene, the workflow we use is as follows:

Make sure the mesh (object level) has an override, not just the collection. At this point you will have access to the material drop-down to choose alternate materials for the material slots.

At this point you have two options. One is to simply create empty materials on another object and then replace the originals with these (which can have the exact same names, since the linked material info cannot be touched). No need to give the originals a fake user; they can’t go away.

Then simply copy the nodes to the empty materials from the originals.

Or simply append the materials from the asset and replace them.

They will have exactly the same names.

1 Like

That example is smaller: no additional outfits, a fairly low resolution mesh (MetaHuman uses 10x the resolution), only around 30 shape keys, and these can number over 100. Even sculpt projects can be larger, and some projects with textures painted in Blender will have the textures packed internally during production.

It’s not the initial size that causes issues though; it’s fine if it’s 1GB for one version. It’s the difficulty of saving deltas for version control.

Right now people typically use v1.blend, v2.blend, v3.blend etc., but this isn’t proper version control, as there’s no record of what the changes are. What changes were made in v2? We don’t know.

When version control is used in code, there can be a project that is large overall but when a change is committed, it’s very small and recorded with a note. The change record would look like:

  • 11:00-11/12/24 - changed shape of ears
  • 12:00-11/12/24 - adjusted skin material
  • 13:00-11/12/24 - fixed arm mesh
  • 14:00-11/12/24 - added UV map for body

There can easily be 10 commits in a day. Keeping a copy of the entire file for every change isn’t feasible and shouldn’t be needed. Most changes to data properties are very small.
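To make that last point concrete, here is a minimal sketch in plain Python (using made-up, text-serialized scene data, not an actual .blend diff) showing how a delta-based record captures only the changed lines, not the whole file:

```python
import difflib

# Two hypothetical versions of a text-serialized scene description.
v1 = [
    "mesh body verts=48212",
    "shapekey ears strength=0.50",
    "material skin roughness=0.40",
]
v2 = [
    "mesh body verts=48212",
    "shapekey ears strength=0.65",  # only the ear shape changed
    "material skin roughness=0.40",
]

# A unified diff records just the changed lines.
delta = list(difflib.unified_diff(v1, v2, lineterm=""))
changed = [
    line for line in delta
    if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))
]

print(len(changed))  # 2: one removed line, one added line
```

Binary .blend files don’t diff this cleanly, which is part of the argument for a more structured, version-control-friendly package format.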

Thanks for that link, that is very useful.

They could probably make it work with different interoperability tiers, where the baseline tiers would have stricter rules. The baseline should work like FBX and glTF and support meshes, curves, scene positions, light positions and animations. The basics.

Then higher tiers would have rules for shaders, particles, hair, cloth, volumes.

Substance wouldn’t have to support anything above the baseline in order to do painting but would have to support higher tiers for shaders/materials.

Every endpoint (game engine, renderer etc) doesn’t have to use the same shaders but it should be possible for every app in the pipeline to write the required shader graph for the endpoint, even if they need to use proxy graphs at intermediate stages for previews.

These initiatives have rarely gone well in the past but FBX and GLTF have been pretty reliable. They could never be source master files and I expect USD can’t either but it can be a master production file and some companies have fully adopted USD in their productions already.

@Richard_Culver described overriding materials on a linked mesh. This seems to work to an extent (and also works for weight painting):

  • right-click a mesh (green icon) in the linked object in the outliner and choose Library Override > Make > Selected
  • this unlocks the material drop-down and it can be swapped for another material

It’s not possible to add new materials though, which is a big limitation. All the needed materials and assignments to vertex groups have to be done in the original file, but at least it’s possible to iterate on the shaders. If someone has a large sculpted mesh, the material and lighting tweaks can be made in the much smaller linked file for the render.

1 Like

There are dozens of different FBX standards, many of which are proprietary. Every 3D DCC interprets the format differently; FBX working 90% of the time as an interchange format is half a miracle and half a ton of hacky code held together by duct tape and caffeine. Don’t get me wrong, I use FBX as an interchange format, but I definitely wouldn’t say it’s “reliable”. I don’t think a fully reliable interchange format currently exists. USD might fill that gap, but I haven’t seen that yet.

3 Likes

I think it is best to think of linking the same as any tool.

For example, the knife or edge loop tools assume you know about polygon flow, and how to optimize a model or build it for articulation.

And you can spend only a matter of hours mastering these tools, but years studying the process and gaining the practical work experience of how to use them.

UV Unwrap assumes you know how to mark edges for the best UV layout.

And UV is another subject.

And on and on.

Linking is a production pipeline tool - for teams. And of course for generalists to optimize a pipeline workflow.

It therefore requires knowledge of the best practices for a production pipeline which can take a long time to understand.

And each individual or studio might have their own needs.

But the point is that a production pipeline, just like any other process in 3D is both a science and an art. And the tools assume you know what to do with them.

Or assume you will develop some method for which the tool is used to accomplish that end.

1 Like

Ahh, OK, so a dense, unoptimised mesh then. Yeah, that would start to add up.

That’s where production tracking software comes in, like Kitsu, which the Blender Studio uses. However, yes, it still saves the whole file each time.

So really it’s more a case of storing the undo history within the file and saving that with a note along with it. That could either result in single very large files or a lot of manual management in deciding which undo steps or changes to save and which ones to discard.

Now I know some file systems and storage devices support a ‘previous versions’ type system; even Windows can. Mind you, most don’t have a notes option, and even so, I’m pretty sure none of them are exactly light when it comes to data storage requirements, especially if the data can’t be easily compressed.

1 Like

So very well stated!! :+1:
You describe my many years of frustration in trying to make people in certain software communities understand why any promise of a “universal” exchange/export/import/hosting/linking format is effectively a false promise.

Take, for example, Daz and Reallusion. Both companies have, for nearly two decades, been trying to position themselves as premium providers of prefabbed characters for the major 3DCCs and game engines, with Reallusion offering a really good human IK, nonlinear character animation system and Daz offering well modeled characters with great looking joint deformations, etc.

Setting aside their licensing schemes for the moment (some of which were deal killers in themselves):

Both companies have tried various data exchange methods for exporting to the 3DCCs and game engines, mostly FBX variants along with various in-house plugins, to lure potential buyers from the major 3DCC and game engine communities into their commercial content stores.

Both have utterly failed and will continue to fail.

Even if we take into account the psychology of entrenched habits and the ecosystem comfort-bubble mentality that exists in every 3DCC community, there are just too many practical, real world variables at play. I mean, yikes! If even the mighty Autodesk could not get full interoperability between apps they owned and controlled, why would serious people allow some third-party software’s “data exchange protocol” to become the single point of failure in their critical pipeline?

1 Like

The main thing is to get away from the idea of data exchange and import/export, which tries to make everything compatible in a single, monolithically maintained import/export addon. These exchange plugins being maintained by a few people or individuals is also one of their downfalls.

A tiered package format would alleviate these issues. USDZ is a package format, it’s a zip file of a folder structure (rename .usdz to .zip and extract to folder).
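That package structure is easy to demonstrate with any zip library. A minimal sketch in Python (the file names are invented, and a real .usdz additionally requires uncompressed, 64-byte-aligned entries, which this sketch ignores):

```python
import os
import tempfile
import zipfile

# Build a stand-in "package": a zip archive with a simple folder
# structure, mimicking how a .usdz bundles a scene file plus assets.
workdir = tempfile.mkdtemp()
pkg = os.path.join(workdir, "asset.usdz")

with zipfile.ZipFile(pkg, "w") as z:
    z.writestr("scene.usda", "#usda 1.0\n")        # root scene description
    z.writestr("textures/skin.png", b"\x89PNG...")  # placeholder texture bytes

# "Rename .usdz to .zip and extract": any zip tool can read the contents.
with zipfile.ZipFile(pkg) as z:
    print(z.namelist())  # ['scene.usda', 'textures/skin.png']
```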

Higher tier data can be subject to varying support but baseline level objects (meshes, scene layout, textures) are easy to keep 100% supported in every app.

Each object type can have its own ‘addon’ for formatting. It wouldn’t be necessary for someone to write a conversion for the entire USD format. Someone writes one for meshes and it’s done forever; no matter what changes happen to the format, the baseline objects won’t change.

There’s a formatter for the scene type, light types, shader graphs and rigs, and these can all be maintained by separate people. A top-level formatter then picks the separate addon for each type.

The separate objects are linked together in the package. The package can be version controlled, even with git, which allows branches. Productions can explore different ideas in branches and pull assets from other branches.

This is too important an issue if Blender is to be used more widely alongside other apps. Someone in the other thread said they switched to Houdini Solaris and USD due to issues with Blender’s linking:

A file format is a difficult issue to solve if one person has to maintain it. Inevitably it gets abandoned or maintained less and it’s a huge project for someone else to take on. It has to be split apart into smaller parts and maintained by a large group of people.

The workflow could be something like this:

  • create mesh in .blend, save it
  • author mesh to USD as separate file
  • create rig project in .blend and use the mesh.usd as the asset, save it
  • author rig and animations to USD as separate file(s)
  • create look dev .blend, use mesh.usd as the asset, build shaders, save it
  • author shader graphs to USD (there can be translators for different endpoints)
  • create scene file, position USD objects and save scene USD

An entirely separated workflow that at all stages would be compatible with other apps. Look dev can be done in Substance, and textures/materials can be authored there. Rig/animation can be done in Maya. VFX can be done in Houdini. Sculpting can be done in ZBrush. Each app still has its own formats for the source objects, but the pipeline format is shared, can be version controlled, and would scale across large teams.
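The seven steps above imply a file layout where each stage keeps its own source file next to the USD layer it authors. A tiny sketch in plain Python (all stage and file names are hypothetical):

```python
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())

# One source .blend per stage, each authoring USD layer(s) beside it.
stages = {
    "model":   ["mesh.blend", "mesh.usd"],
    "rig":     ["rig.blend", "rig.usd", "anim.usd"],
    "lookdev": ["lookdev.blend", "shaders.usd"],
    "scene":   ["scene.blend", "scene.usd"],
}

for stage, files in stages.items():
    d = root / stage
    d.mkdir()
    for name in files:
        (d / name).touch()

# The scene stage would reference the other USD layers rather than
# copy them, so each directory stays small and independently versioned.
print(sorted(p.name for p in (root / "rig").iterdir()))
```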

1 Like

I have to admit I am getting a bit lost with this conversation, or what the aims are. If it is essentially about workflow, pipeline, and optimising scenes and assets, all I can add is what I have learned directly from my own work and career.

Although on a larger studio production it is most typical to cache the scenes out into a cached and baked file sharing format, I feel that on smaller scale projects, and especially if working alone, this could prove to be an unnecessary layer of extra complication. Some of what makes most sense in a bigger production set-up might not be so necessary, or translate as well, in a smaller one. However I do think it is always important to set up a good asset and rendering pipeline.

My own background is from quite a bit of past experience in planning and overseeing animation pipelines for CG animated series and video game cut scenes as well as working as a wide ranging generalist including character and creature animation and design. This was working with 3DS Max and Maya based pipelines and also Softimage and CryEngine a bit as well. My last experience on a character animated show was as the animation and 3D technical director on a Netflix show with a Maya and Redshift based pipeline.

More recently I have been creating some quite ambitious animation and effects projects for large scale gallery exhibition using a mostly Blender based pipeline. This was mainly working in collaboration with an international gallery artist from India based in London. So far we have completed three immersive projection film and soundscape set-ups of about 40 min each, and there are several smaller projects on the go. This has involved juggling and managing a lot of assets and files and keeping things smooth and optimised. Blender’s library linking out of the box has worked out well for me, has been easy to understand, and has so far proven robust. I love how Blender can be pipelined into itself. It is ideal, I think, for small studio and solo projects, and this has a lot to do with how Blender has evolved as a more or less self-contained open source project.

For the Blender based projects we have been doing recently I set up a library linking system for assets and scenes exactly as would be done in a medium sized studio, even though I was handling all of the CGI and animation work as a solo artist. This really is how I would advise anyone to do it, whether you are working alone or in a small team; I have never known a better way. It was just as important for me that this was done as if I were setting it up for a small team, since I needed a reliable pipeline in place to let me focus on the creative side. So it is necessary to wear several hats. I didn’t use scene management software for this, at least not yet; simply the scene linking tools within Blender, with clear file paths and naming conventions for everything.

The nice thing about Blender asset files is they can contain more than just the 3D scene data so linked assets can be stored with textures etc too which can further simplify the pipeline.

The most important thing with any animation and CGI pipeline is to keep it super clean, tidy and clear to understand. It is as important to do this if you are working alone as if you were setting it up for a small team, since when you are into the thick of the creative side you don’t want to be distracted by scene set-up issues. I think it really is worth the effort setting up a proper asset and scene linking pipeline, since you will save so much time and stress further down the line, and it will also be easier to troubleshoot and fix things across multiple scenes and files. This is how I have always seen the file sizes kept manageable for scenes and assets. For starting out, perhaps first try just using the library tools and functionality within Blender with a clear file system. If this recent work grows bigger and there is the need to take on more people, we would have to switch to a management software system of some kind, but up till now, working alone, I wanted to keep it basic and simple and not have that extra layer of complication.

In a studio the asset linking would be managed through a server. We did not have the budget for that, so I used external drives with a backup system and plan. We were working in a small basement studio. It will also work if you are working on small to medium sized animation projects from home. But if you can afford to buy and run a dedicated server and backup machine, then go for it.

Of the scene sharing and caching options in Blender, the only one I have direct working experience of in an actual production setting is Alembic. Right now Alembic seems to be working well in Blender, at least from when I have used it. The last time was during Covid times, when I was using Blender for some of the effects work on Maya scenes for an animated TV series, as I often found Blender faster and more forgiving for this, especially working under crazy time pressure. Dynamic paint for footprints in sand and snow etc. and particle flows. I was able to reliably export whole cached multiple character scenes out of Maya into Blender, then add extra elements and effects work in Blender and export it all back into Maya. I could even produce effects masks and shadow passes this way. Why I ended up doing those in Blender is too convoluted to go into, but anyway, it worked. The one thing to watch out for when going between Maya and Blender is that the scale setting is different, so it has to be typed in manually. I can’t remember what it was now, but it’s something to watch for.

Anyway that is all I can think to add, and much more than I expected to write once again. Perhaps I am wide of the mark here or outdated with some things, but a lot of the conversation is making my head spin!

4 Likes

I can only speak from my own experience, but so far I haven’t seen a USD implementation that allows for such a seamless pipeline. As I said before, in my own experiments for work what I found is that each program has a slightly different implementation of the format, or at least that’s what it looks like to me, because even the demo files provided on the OpenUSD.org website will load slightly differently in different programs.

The only thing similar to what you describe that I can think of is Nvidia Omniverse, and that’s only because they developed a program on top of USD that serves as the “assembly line” where the scenes get built, but they also had to develop the specific USD importers/exporters for Maya, Blender (I think it’s a custom build), Max, Unreal, etc.
So to use Omniverse you have to stick with the programs/versions they support.

So yeah, sounds great in theory but it’s not there yet.

EDIT: Just want to make extra clear that I’m speaking only from my own understanding, and it is very likely that I’m just missing something important about the whole USD workflow (Because it’s also more complex than it seems at first).

3 Likes

From my read of the original post, the thread author was seeking a better format for sharing large Blender files with other Blender teams, with more reasonable file sizes.

As far as sharing Blender scenes and data outside the Blender ecosystem goes, I do not really see a wide demand for that.

2 Likes

Hey thanks for the clarification. I wasn’t sure anymore.

And yes as for sharing whole Blender scenes and data outside the Blender ecosystem I would agree I cannot imagine much of a demand for that either.

Although if scene elements do need to be shared, then, as said, Alembic has worked well for me in recent years. I could accurately export most core essential animated scene elements back and forth between Maya and Blender this way. But really you are going to want to commit to one core ecosystem for building scenes in an animation project. If studios are sharing scenes between them, you would expect them to use and share the same software ecosystem. Otherwise it’s just asking for so much trouble.

1 Like

There might be more reasons the conversation got confusing.

But essentially as I see it, it was simple.

The request originated out of a need to have a package bundling solution to reduce file size for versioning.

If it was mentioned in the original post, I missed it. But there did not seem to be any mention of packaging the data for use outside of the Blender ecosystem.

From that, the conversation went into a few logical directions.

First - why are your file sizes getting out of hand?

Why are you versioning scenes with a lot of local assets and large footprints?

Linking is currently the method for building large scenes out of libraries with a small footprint for versioning.

The request was for some kind of Blender file format that would also be structured similar to how linking already works.

Why not linking?

Because it has limitations.

But you can’t talk about linking and its limitations without discussing production pipelines and understanding how to use linking within the context of limitations and most importantly without a discussion about pipeline best practices.

USD was first mentioned as a format within an addon that would combine various methods (not linking) and be used to transfer data within a Blender ecosystem, as I understood the first post.

But it later spun off into its own topic.

Put simply.

Linking and libraries are the current imperfect but workable solution to the original request.

Also there are other threads that are currently discussing these same topics.

2 Likes

Maybe. But, if you consider how Unreal 5 has changed production pipelines, I might disagree.

We are currently using a lot of Unreal scenes and going back and forth between Blender and Unreal. And many times we found ourselves having to rebuild scenes in one or the other.

I am not saying I am expecting a solution to ever emerge, but I would argue there would be a huge demand for it, if a UFO were to crash that had a production studio onboard and Area 51 could reverse engineer its universal production pipeline tech. :laughing:

Putting aside the off chance of alien technology emerging, some way, I imagine we will be stuck with the tried and true grunt work of import/export and scene and shader rebuilding between apps.

3 Likes

Yes, thanks for getting back. This is what I thought!
I was finding the whole thing a bit confusing, possibly because it might have been becoming a bit untethered from a real world production situation and drifting into hypotheticals. But yes, to reinvent something it needs to be seen and discussed in the full wider context and understanding of how and where it would be used; in this case within a studio or team pipeline, the day to day production needs, and the most efficient way to manage, share and store data.

I have always simply had to get stuff made the best way it could be done with what we had and where we were. That’s just how I think, and I know it is for you too of course; you are running your own studio. I think too, as you pointed out, that unless somebody has had some working studio experience and been a part of the team production process, the day to day pipeline issues and workings can be difficult to grasp, or seem vague and unclear.

I sort of skimmed through and got pulled in by the comment about wanting to know how to edit library linked rigs. I was thinking, well, this is just basic day to day stuff. But I guess it just does not get talked about so much, as with constraints and baking in the animation process, which has always been a major part of my workflow and that of most other animators I worked with. I thought perhaps I could help out with something like that, which is why I mentioned my background. Although all of us will have gaps in our knowledge, and especially in this rapidly moving field, everything I post I try to relate from direct working experience, either with Blender or any other app or process.

Anyway all the best for the new year everyone. Not sure why I got pulled in and posted so much but there you go. First Blender Artist post of 2025. I hope it’s of help to somebody at least. It’s made with good intent and great enthusiasm for Blender.

All the best.

As for a universal scene system as you mention with Unreal yes it would be a great thing for sure. I need to get back into real time and after what I saw in Venice recently I would love to get into immersive VR.

3 Likes

Putting aside the off chance of alien technology emerging, some way, I imagine we will be stuck with the tried and true grunt work of import/export and scene and shader rebuilding between apps.

I am not sure that is such a bad thing entirely.

Competing ecosystems tend to encourage diversity of thought and healthy competition for the consumer’s choice, as well as fomenting the creation of super useful, specialized, single-purpose tools to fill gaps, like EmberGen for smoke/fluids/pyro or Marvelous Designer for cloth simulation.

Whatever your area of specialty and interest, we have so many options today, even compared to the 2010s, especially in Blender with pro addons.

I am not really convinced we will ever see any “universal solution” to 3D/CG production in the traditional sense.

However, when generative AI becomes controllable enough it will introduce a whole new paradigm, but that is another subject already being debated in other threads on BA.

2 Likes

Really good points.

2 Likes

Yeah. You and I are on the same page here.

I have mused about what the actual motive is behind something like USD.

I will not say it is some hidden agenda or anything like that. But I would not entirely rule it out either.

It would not be the first time that a large company with money to burn throws money at a problem, only to have it end as an experiment that just didn’t bear fruit.

In fact there is a culture of this approach. So wishful theories and benevolence can not be ruled out either.

But there are also some legal/scientific practicalities that make open source initiatives make sense when shared between companies.

For one, it levels the legal playing field so that no single company will claim rights, fight for credit or attribution, or get caught up in use license litigation etc.

And in so doing all can participate freely and put resources into the initiative that they know will be mutually beneficial to all parties. So there is an open exchange of ideas and resources without ties to private IP.

So the other aspect of this is, because it is open source, now anyone can join in.

And who is to say where the nuggets of contributions to the main code will come from?

However, while the code might benefit, and while Autodesk and Pixar might benefit from a studio pipeline tool, it is very unlikely that a true interchange format that is useful to “all” will come out of it.

And as mentioned, each company is adopting their own versions for their tools.

And so here we are back to square one.

1 Like

That’s what I find: the things people are doing day to day rarely get mentioned as best practises.

It’s the same with software, there are many places that show how to write in a language but it’s rare to see best practises for architecture.

What is the workflow you use when handling linking? Do you have a similar setup to the diagram posted earlier? Are you doing version control on assets?

What structure do you use in each file? It’s not clear what should be linked. When a link is added, it browses the entire .blend file. Does everything go in a collection, and then you link the collection?

Omniverse looks good. At 2:00, it mentions the app connectors where separate apps can edit parts of a scene without import/export and it does updates in place:

Yes, version control was the main thing. Some form of linking should work.

Interoperability with other apps would be a benefit from having the pipeline split into smaller parts.

The scene layout is the biggest issue. Individual assets can be managed separately but trying to move entire scenes between apps is not easy.

With a package format, it could internally maintain a scene USD alongside the .blend, and this could be used by other apps. Other apps can update it, and there’s no explicit import/export; it’s instant, hot-reloading. It’s a huge productivity boost to be able to edit things in place.

According to the site, it evolved from Pixar’s struggles with maintaining a scene layout across their pipeline.

https://openusd.org/release/intro.html

The speed, scalability, and universal pipeline access of TidScene pose-caches were a success, but also put Pixar back into a place where we had multiple, competing systems for creating composed scene description, with different semantics, API’s, and places in the pipeline where they could be used. The mandate for the USD project, initiated in 2012, was to marry the (recently redesigned and improved) composition engine and low-level data model from Presto with the lazy-access, time-sampled data model and lightweight scenegraph from TidScene. USD delivers an all-new scenegraph that sits on top of the very same composition engine that Presto uses, and has introduced parallel computation into all levels of the scene description and composition core.

It also allows using different renderers on the same scene. Every renderer needs a scene to draw and RenderMan uses its own rib file format, shader files, texture files etc.

That setup needs one app to be the scene manager, and every other app then has to conform to it. With USD, this role becomes independent, which allows it to scale more easily.

The following video is interesting and talks about Blender’s linking and USD. It mentions building a material library and linking it into the mesh file, then linking that into the scene; the rig and animation would sit in between.

At 30:00, it shows a large scene in Unreal loading in Blender and rendering in Cycles.

At 38:00, it mentions VariantSets. In version control, it’s possible to make branches that have different implementations and switch between them. It’s not so much version 1, 2 as A, B, and each has a version Av1, Bv1, e.g. a horse with a saddle or without. Same asset, different variants. There can also be LOD versions.
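For reference, a VariantSet in USD’s text format looks roughly like this (a hand-written .usda sketch; the set name “saddle” and the prim names are made up for illustration):

```usda
#usda 1.0

def Xform "Horse" (
    prepend variantSets = "saddle"
    variants = {
        string saddle = "with"
    }
)
{
    def Mesh "Body" {}

    variantSet "saddle" = {
        "with" {
            def Mesh "Saddle" {}
        }
        "without" {
        }
    }
}
```

Switching the `saddle` variant from "with" to "without" changes which children get composed, without duplicating the rest of the asset.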

How do two people edit a Blender scene and merge the changes? It should be possible to do this the same way a team working on Blender development can work on multiple separate features and merge them.

On something like the Barbershop animation, someone can work on the environment layout, and someone else on character positioning, in the same master scene. If the environment artist moves the chair, the character artist has to adapt to the change, but these changes don’t have to block each other; they can be separate overrides of the master scene, merged when they are synced.

1 Like

Cool!

Some things to look into.

By the way I get the reason for USD.

The question is why it was turned into an open initiative, whatever reasons might be publicly stated.

As I said, not nefarious. But I think the real reasons are not discussed. Or at best simply unrealistic.