How much hard drive space do I need to do 3D seriously?

I waste a lot of time and I mean A LOT of time trying to keep file sizes down when doing stuff with Blender.

I’m always splitting everything up into separate files and linking them together.

I use Google Drive for backup, and my internet connection is spotty, so I prefer keeping files as small as possible instead of having to upload a 100+ gigabyte file every time I hit Ctrl+S. For example, a few days ago I downloaded a 290 MB 3DS file and spent most of the day cutting it down to about 9 MB in Blender.

Library overrides have been getting on my last nerve lately. I don’t know how the hell to do them properly, especially when it comes to using multiple separately posed instances of the same rigged asset.

Google Drive has been getting on my last nerve lately also.

I’m about ready to just give up and not care about file size, backups, and duplicate assets anymore. However, I don’t know how much hard drive space I really need to be so carefree with Blender files, assets, and add-ons.

I looked at a material on BlenderKit just now and it was almost 200 MB!

How much hard drive space do I really need?

lol… I haven’t got a blend file under 500 MB

Besides that, I back everything up on physical media / DVDs, as I don’t believe in cloud storage.

Usually I mickey mouse around until my 1 TB drive starts melting, and then I back up project after project, to make some room for more crap.

I am not being of any help, but at least you can see that there are users like myself who are a tad more generous in wasting space and far more inefficient in data management.

I think that really depends on what you mean by serious 3D.

You can be a very serious, high-end polygon modeller and not use all that much space at all.

On the other hand, you can be writing out cache data files for fluid simulations and be kissing goodbye to gigabytes of storage in a day.

Or rendering an animation to multi-layered OpenEXR files and then compositing that with more elements, etc.

Then there’s stuff in between, say texturing, which could range from a 512×512 pixel seamless texture that takes up almost no space to a library of HDRI files using 100+ GB.

So while optimising one’s storage is nice, I think there need to be clear reasons for it. In the case of that 3DS file, for example, if it was, say, a tree or building that you’re going to use within the same project multiple times, and likely in many other projects time and again, then sure, spend the time to make it better.

On the other hand, if it’s a one-off object for one-time use, then as long as its size doesn’t slow everything down or prevent rendering, why waste the time?

We live in an age where multi-TB drives are normal (my system/data drive is a 2 TB SSD, my games/temp drive is a 2 TB NVMe), so a 200 MB file makes next to no difference.

Short answer: a lot. I’m currently at 8 TB across 6 drives, personally, and I’m about 25-30% full. I have one project folder that is 100+ GB just by itself. 3D projects, especially animation, need a lot of space.

I think it mostly depends on what you’ll be doing in Blender. I have a 480 GB SSD along with a 1 TB HDD and I can live with that, since I’m working with low-poly projects, which don’t require a lot of storage (at least in my case) - I’m fine.

Why not? It’s a constant, up to date backup of your files. It’s saved my butt a few times already.

I don’t use it exclusively, and keep another physical backup just in case, but it’s still a nice just-in-case redundancy to have around.

I’ve never really seen library linking as a way to save disk storage, but as a device to minimize work effort - I hate it if fixes/upgrades need to be applied across multiple files because assets have been embedded instead of linked. Keeping shot/scene files small also keeps Blender responsive. Autosave freezing Blender every couple of minutes can become a real p*** … you know what.

I can so understand that, in their current state overrides sometimes look like as much a regression as an improvement … armature proxies were such a nice and clean interface to a rigged asset, they didn’t clutter the outliner with every single bit of internal stuff never meant to be accessible during set dressing/layout/animation.
Nevertheless, if I were you I’d stick to library linking - too many advantages regarding project housekeeping, far beyond disk space concerns.

Regarding the rest, well, it’s a matter of how many resources you can get your hands on as much as what you need, isn’t it? I remember, years ago, before I had started out doing 3D at all, I was arguing with some 3D guys regarding their (in my eyes) almost crazy ideas about memory (RAM, back then) - at that time I had never worked with anything exceeding 4 GB of RAM, and file sizes beyond 100 MB were almost inconceivable.

Meanwhile I’ve come to accept that literally everything is possible, any size of storage can be put to use and/or filled up, in particular since I’ve started using shots/animations augmented by simulation.

Sadly, storage demands also increase steadily because people get sloppier and sloppier about what they do, just because they can. Doing rigs for another short at the moment, and the modelling artist, once again, keeps throwing pointlessly high-poly stuff around, character files always 100-200 MB straight away. Meshes are obviously, cluelessly grabbed from some character generator, on top of nonsensical mesh density laden with all kinds of additional features which don’t make the slightest artistic or technical sense in the context.

So, like @joseph said … a lot. As much as you can get your hands on. Currently I use 10 TB - which is nothing, basically I’m struggling around 90% use all the time (can’t afford any upgrades right now …). Despite using library linking as much as I can, btw …

I know a 3D animator who is finishing up an hour-long 4K animation. He’s over 20 TB for project files. This seems to match up pretty closely to my own animation projects- I have a ten minute short in progress that will take about 2 TB, based on current size requirements and progress. Obviously, if you’re not doing animations, you need a lot less space, but I’d say 5 TB is the minimum for animators. (Note that all these space calculations include backups, so there’s some duplication in there.)
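As a rough sanity check on those numbers, you can estimate render storage from frame count and per-frame size. A minimal sketch - the per-frame size is my own illustrative assumption (a multi-layer 4K OpenEXR frame can easily run tens of MB), not a measured figure from either project:

```python
# Rough storage estimate for rendered animation frames.
# mb_per_frame is an assumption for illustration only.

def render_storage_gb(minutes, fps=24, mb_per_frame=40):
    """Return estimated storage in GB for one rendered pass of a sequence."""
    frames = minutes * 60 * fps
    return frames * mb_per_frame / 1024  # MB -> GB

# One hour of animation at 24 fps, ~40 MB per EXR frame:
print(f"1 hour:  ~{render_storage_gb(60):.0f} GB")   # ~3.4 TB per full pass
# A ten-minute short under the same assumptions:
print(f"10 min: ~{render_storage_gb(10):.0f} GB")
```

A single pass at those assumed sizes already lands in the terabytes for an hour of footage, so once you add re-renders, extra passes, and backups, 20 TB stops sounding extravagant.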

There is something about big corporations having your intellectual property on their servers that I just cannot get along with.

Call me paranoid, but I take that stuff seriously beyond ridiculous.
Not that I have anything that is top secret or of historical importance, lol

I can understand the concern, but so long as there’s nothing in the EULA about any information stored on their servers immediately becoming the intellectual property of the corporation in question (which would only be enforceable in the US), there’s really not much they can do.

Not a fan of cloud storage either, for a number of reasons. But if I was to use it, I’d want a system where I locally encrypt the data before uploading and only I had the keys to decrypt it once downloaded.

You have uncovered my laziness towards reading kilometre-long EULAs. lol

Yeah, I don’t blame you. That’s why I look for other people who have. :stuck_out_tongue:

That’s ultimately an answer you need to find out for yourself.
Personally, I’ve got a 2 TB drive to store projects and a 1 TB hard drive to store everything else, including textures and assets that I reuse, but I don’t have a lot of them.

All my assets to be reused (textures, objects): 145 GB.

For animation there are no fixed rules, but unless you do a lot of simulation, project files aren’t that big.
On my B&P project we had 45 GB for all the production files (textures, shots, assets), and that includes many versions per shot.
Everything is linked into the shots, so a shot file is generally less than 1 MB. But there could be like 40 versions for some of them.

In animation, what takes the most space is obviously renders; when you start to do multiple versions and export different passes, you need a lot of space.

I don’t recall exactly, but I think we managed to do everything within 2 TB. But we didn’t do any render layers/passes. And we ended up exporting ~55 min of animation.

But if you think about it, if we had two projects in parallel that would have meant 4 TB already.

You also need to account for backups; in a pro environment you need three backup systems to prevent any issues.
On my personal computer I’ve got one 1 TB drive and it works.

If space is really an issue, you can save some with organisation: if you remove all the .blend1 files, you already save half the space.
Keeping only the necessary files when archiving your project will also do a lot.
You can probably remove all the .psd, .kra, and .xcf files and keep only PNG files for textures.

For animation we remove intermediate renders and keep only the deliverables.

We can also keep only the last version of each shot and asset. When doing that, the final archive will be very lightweight.

Obviously you’ll have to adapt that to your case, but maybe take a few projects, inspect the file sizes, see if you can save some space during archiving, and calculate how many of them you do over a few months. That should give you an idea of how much disk space you consume per year!
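That last step is easy to automate: measure a few finished project folders and extrapolate. A minimal sketch (the folder paths and projects-per-year figure are yours to fill in):

```python
import os

def folder_size_gb(path):
    """Recursively sum file sizes under path, in GB."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # skip unreadable or vanished files
    return total / 1024**3

def yearly_estimate_gb(project_paths, projects_per_year):
    """Average measured project size times expected yearly output."""
    if not project_paths:
        return 0.0
    avg = sum(folder_size_gb(p) for p in project_paths) / len(project_paths)
    return avg * projects_per_year
```

Point it at two or three representative projects; the average times your yearly project count gives a first-order budget, before backups.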

What software do you use to sync your physical backups?

There have been rumors of certain specific pdf files and video files on google drive and google photos being disappeared without user consent.

When I’m on Linux, I use Pika Backup. Now that I’m in Windows again for a bit, I just use File History.

How well does that actually work? I’ve used macOS and Windows equally for like 20 years, but I gave up on using any of Windows’ “features” like 17 years ago.

I’ve never had a problem out of it. Like all things Microsoft, it doesn’t exactly have the most elegant and intuitive user interface in the world, but it works well enough.

I keep my file sizes down through things such as proceduralism and by not using insanely large textures (which leads me to wonder if many here are applying high-resolution sculpts and UDIM textures with more than 100 tiles to everything, because my files have rarely gotten over 1 gigabyte in size unless fluid was involved). The results I get from the files are not PS2-style toon renders either. A hint: microdisplacement takes you a long way.

Sadly, it seems most of the big software vendors believe platters and SSDs can just be bought in bulk at Dollar Tree and, as of now, will choose not to give a crap about size efficiency. Hardware is getting better, but it has become all too common to embrace the idea of the average application being a bug-ridden and memory-inefficient resource hog that dev teams are unwilling to do anything about. I do not know if this is to force continual upgrades that keep the riff raff (ie hobbyists) out, but I am convinced we are seeing an interesting paradox where technology just keeps getting worse while simultaneously allowing you to do more.

Case in point: the big game engines. A lot of people on the Unity forums have come to accept that you will need a minimum of 100 gigabytes of space per project because of cache and whatnot (but check out the long list of fancy-pants bells and whistles that come with requiring the best storage money can buy, so it can’t be all bad, right?).
