They’re looking for more information on that bug report; you should add this as a comment there.
Make a new blend file and append from that scene, bringing in one collection or object at a time, so you can debug it and see which collection or object is the culprit.
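If it helps, here is a minimal sketch of that bisection approach using Blender's Python API; the file path and the choice of which collection to pull in are placeholders you would adapt:

```python
import bpy

SOURCE = "/path/to/heavy_scene.blend"  # placeholder: path to the problematic file

# First, list the collections available in the source file without loading anything
with bpy.data.libraries.load(SOURCE, link=False) as (data_from, data_to):
    available = list(data_from.collections)
print(available)

# Then append one suspect collection per test run and hook it into the current scene
with bpy.data.libraries.load(SOURCE, link=False) as (data_from, data_to):
    data_to.collections = [available[0]]  # pick a different entry each run

for coll in data_to.collections:
    bpy.context.scene.collection.children.link(coll)
```

Doing the same thing from the File > Append menu works just as well; the script only makes it easier to repeat.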
Just for reference, this is something that can be done in the CAD software I use. You can fully edit parts from an assembly file. So it is possible to do.
I think it would be a very useful feature in Blender if it could do that.
What improvements do you expect to work with a 3 GB file? In Blender, that’s absurdly large. You must have hundreds of millions of polygons and dozens of 4K+ textures in there. I don’t think the onus is on the developers here. Blender isn’t designed to handle gargantuan files; that’s what Clarisse is for.
You don’t need hundreds of millions of polygons to get to that number, when people can download HDRIs that are almost 3 GB and textures that are over 1 GB (some people don’t care about optimization).
I’ve got a 2.5 GB blend exported out of a CAD program, and it’s really slow in Blender but extremely fast in the original program. It’s purely objects and some basic materials, just tons of instances. It’s not even that many polygons in total (3.2 million faces), but 600,000 objects.
There are things that Blender does that are incredibly unoptimized for huge scenes and that just need to be found and fixed. They’re getting there, though. It used to take 30 minutes to open this file in 2.8, and it’s down to about 5 in 3.2. It currently takes like 8-10 seconds to select things and maybe 5 for viewport changes. UI changes, like in the Preferences, take 5-10 seconds to react, even just expanding a collapsed item.
I’m working on building the latest master to try out that Outliner fix to see if that helps with speed on this file.
But textures are not in the blend file. Unless you pack them in, of course, but that’s the exception.
There are some structural issues within Blender that are difficult to just “find and fix”, particularly with large numbers of objects.
It’s not a bug that a Toyota Corolla is slower than a Ferrari.
Blender isn’t ever going to be the uber tool that is as fluid as ZBrush, as performant as Houdini, and as capable of complex assemblies as SolidWorks. It’s just Blender, and that’s fine.
I don’t get mad at Photoshop for not being Illustrator. I don’t get mad at AutoCAD for not being SketchUp. Every tool will have its strengths and limitations, and working within them is the job of the artist.
Yes, it would be lovely if I could purchase huge files made in other software, load in a billion polygons and gigatexels of data, and have it be smooth as silk. It would also be lovely if my alarm clock cooked me breakfast and made the bed when I got up. But it’s just an alarm clock. It does its job, and expecting it to do more than it is capable of is a failure of my expectations, not of my alarm clock.
The point is you need a workaround, and here Blender could benefit from real proxies. For example, I once had a huge city with tons of textures, same for the cars and trees in the scene. I exported it as a proxy scene of about 3 GB, and from that point I focused on the animation in the foreground. My scene was fast, and only at render time did the background need to be loaded, which was also zero problem and fast.
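For what it’s worth, you can get part of the way there today with a linked collection instanced through an empty. A rough sketch (the file path and collection name are made up):

```python
import bpy

LIB = "/path/to/city_assets.blend"   # placeholder: library file holding the heavy background
NAME = "City_Background"             # placeholder: collection name inside that file

# Link (not append) the heavy collection so its data stays in the library file
with bpy.data.libraries.load(LIB, link=True) as (data_from, data_to):
    data_to.collections = [NAME]

# Instance the linked collection through a single empty in the working scene
inst = bpy.data.objects.new(NAME + "_instance", None)
inst.instance_type = 'COLLECTION'
inst.instance_collection = data_to.collections[0]
bpy.context.scene.collection.objects.link(inst)
```

Inside the library file you can also set the heavy objects’ Viewport Display to Bounds, so the working scene only draws boxes while final renders still use the full meshes. It’s not a true deferred load like a Redshift or Octane proxy, since the linked data is still read when the file opens, but it keeps the working scene responsive.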
That is the reason why people in Blender use Redshift or Octane, etc. It’s what real production can demand. Blender currently has no way to load objects only at render time. It does a good job with linked scenes and objects, but not as good as it could.
Hope that this gets on the list
Lots of exaggeration in your post, and I’m not sure if I should be taking it seriously or not, but it’s all code and it can be fixed; whether it’s worth it (time and return on the effort) is a decision the Blender devs have to make.
Given how old Blender is, I am certain there are O(n) and O(n^2) implementations that were chosen for low object counts and ease of development in the past, and that can be adjusted to quicker algorithms. Every version of Blender has gotten better with this problem; the devs are aware of it, and I think it will continue to improve in the future. I also don’t think it’s all structural, because Blender can load a 2.5 GB file in 5 minutes. That tells me the data structures are pretty well organized; it’s the algorithms being used to evaluate them that need some work.
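As a toy illustration of the general point (this is not Blender’s actual code), here is how a per-object linear name lookup turns quadratic at this object count, and how a hash map removes that cost:

```python
import time

names = [f"Object.{i:06d}" for i in range(600_000)]
probes = names[-200:]   # only 200 lookups; doing all 600k this way would take minutes

# O(n) per lookup -> O(n^2) overall if you do it once per object
t = time.perf_counter()
for probe in probes:
    names.index(probe)
linear = time.perf_counter() - t

# O(1) per lookup after building a hash map once
lookup = {name: i for i, name in enumerate(names)}
t = time.perf_counter()
for probe in probes:
    lookup[probe]
hashed = time.perf_counter() - t

print(f"linear scan: {linear:.2f}s   hash map: {hashed:.6f}s   (200 lookups)")
```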
Well, I can now collapse and expand a collection with 660,000 objects in it very quickly, but there’s no apparent effect on viewport response or the other UI issues.
There’s probably another performance gain to be had in the alt-clicking of the various show/hide/select icons on 660,000 selected objects in the Outliner. It ran for like an hour before I killed it.
These are the kinds of structural limitations I was speaking of, not literal file structures. Those kinds of algorithmic structures are pretty difficult to just swap out for better code. Possible, yes, but as you said, how many man-hours do we have available to throw at refactoring core code?
Baked smoke in VDB format can be pretty heavy. I have one scene where one frame of smoke is over 300 MB.
Multiresolution modifier can increase file sizes a lot too.
When I worked for a company that does 3D printing (I’m talking figurines for AAA games), we tried to use Blender to boolean all the meshes of a model into one giant mesh for printing. The detailed model we got from the game studio (the high-poly version made in ZBrush), imported into Blender, ended up as a blend file over 10 GB. The final product for print was much more optimized, but we needed a lot of horsepower to properly do the mesh optimization, booleans, and cleanup. It wasn’t easy to do this work in Blender.
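For anyone curious, the batch union itself is simple enough to script, something along these lines (a sketch, assuming the parts are selected and the merge target is the active object); the pain came from the density of the input meshes, not the scripting:

```python
import bpy

target = bpy.context.active_object
parts = [o for o in bpy.context.selected_objects
         if o.type == 'MESH' and o is not target]

for part in parts:
    mod = target.modifiers.new(name="union", type='BOOLEAN')
    mod.operation = 'UNION'
    mod.object = part
    bpy.ops.object.modifier_apply(modifier=mod.name)  # target must stay the active object
    bpy.data.objects.remove(part, do_unlink=True)     # the part is now baked into the target
```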
3 GB is large?
It has to be done sooner or later.
Even regular users nowadays tend to have more and more objects in their scenes.
Sure, but not absurdly large.
And it is being done, it just takes time. In the interim, optimizing your scene will take a lot less time than waiting for the devs, and it will be much more effective than whining about how Blender can’t do things that other programs with 10x the funding can do.
There are circumstances where GB+ file sizes are unavoidable, sure, but I don’t feel like they’re as common as this thread makes them seem. My non-archived Blender projects folder has 212 files in it, for a grand total of… 12.72 GB. I have files with dozens of 4K textures and multiple hi-poly rigged characters that are 15 MB each. My largest file is 78 MB, with an extremely complicated shading setup that uses 20+ 4K image textures, literal thousands of nodes, and dozens of unique hi-poly objects.
I would have to go out of my way to hit a gigabyte. Sure, I optimize my files pretty heavily, etc, but still…
I get that everyone’s use cases are different, and I don’t mean this as any kind of criticism or “no, you’re wrong”. I just don’t understand what the average Blender user is doing to get a file size of 3 GB.
Also, that’s not just me- looking at the official Blender demo files (which are extremely complex, generally), they’re all about 20-60 MB as far as I know. Mr Elephant is 66 MB. Classroom is 67 MB. Barcelona Pavillion is 23 MB, and so on. And those are three of the larger, more complex, demo files out there.
3 GB of mesh data is quite large.
Of course, 3 GB of movie data is regular, and 3 GB of .exr textures or a fluid/VDB bake is normal too.
At least for Blender, a 3 GB file containing only mesh data is huge.
That’s logical; Blender isn’t CAD software. I had a similar issue when importing models from CATIA.
Maybe 3ds Max can handle that, but try other DCCs, like C4D and probably Maya too: they always struggle with these imports.
I’m not versed in CAD, but I think the fact that they deal internally with NURBS and not regular geometry causes a lot of issues when you convert these into polygons. CAD software is made to manage instances and complex meshes. It also has a very different toolset than movie/game-oriented DCCs.
Blender 2.8 had several issues with edit mode (it was slower than 2.79), and it took a tremendous amount of work to bring it back to something good.
To me, even if every addition is great in itself, optimizations are always good to have… But I can’t see why object-mode performance should be a priority over all the other improvements in Blender?
I could claim that having a shading workflow similar to Substance is more important, or more work on everything nodes (particles, loops, sims), or a better compositor; Cycles and Eevee are lacking important features too…
There is always important stuff missing for some people. How do you define a priority here?
This thread was brought back from the dead because eobet bought an unoptimised scene on TurboSquid. Maybe it’s worth testing that scene in different DCCs and seeing how they compare.
And on top of that, with some optimization and scene cleanup it’s possible to bring it back to a usable state.
My scene averages a measly 6.2 triangles per object. I firmly believe there are just unoptimized paths having to do with instancing and object management, and that Blender will be able to handle it soon enough.
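If anyone wants to check the same kind of numbers on their own file, here is a quick sketch for Blender’s Python console (it caches shared mesh data so instanced meshes are only triangulated once, but still counts them once per object, which is the point when everything is instanced):

```python
import bpy

mesh_objects = [o for o in bpy.data.objects if o.type == 'MESH']

# Cache triangle counts per mesh datablock name so shared (instanced) meshes
# are triangulated only once, then sum them per object that uses them.
tri_cache = {}
tris = 0
for o in mesh_objects:
    if o.data.name not in tri_cache:
        o.data.calc_loop_triangles()
        tri_cache[o.data.name] = len(o.data.loop_triangles)
    tris += tri_cache[o.data.name]

print(f"{len(mesh_objects)} mesh objects, {tris} triangles, "
      f"{tris / max(len(mesh_objects), 1):.1f} triangles per object")
```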