Easiest way to grab the ED production files in one go?

So I would like to grab all the production files for ED so I can study them. They’re hosted in a folder format, which requires a downloader that will automatically grab everything within a specific folder, and I can’t really find the appropriate tool to do that. I thought FlashGot would do it, but it turns out it doesn’t. Anybody know the easiest way to do this on Linux?

I wouldn’t know about all things on Linux, but if you’re using Firefox, there’s a free add-on called ‘DownThemAll’ that works excellently. :]

I think the “easiest” way would be to buy a copy of either the ED DVD or Tony Mullen’s book and just copy the files off the disc.

Yeh I’ve got DownThemAll, though I can’t quite work out how to download subdirectories. It seems to require a renaming mask, but I don’t really see how to apply it. I just get the folder saved as an HTML file. :frowning:

It would be nice to buy the DVD, but I’m currently out of money. Actually, I owe money. If I had the cash, I think I’d rather pre-order the Peach DVD anyway.

What he said.

The “Free Download Manager” will add an option to your browser that will let you download everything on the page.

It’s a great program btw: http://www.freedownloadmanager.org/

Get the .tar.gz files.

Elephants Dream is not in the Blender e-shop anymore it seems, so if I bought it from somewhere, it wouldn’t be helping the Blender Foundation right?

Ah, I think I see what kidb means by the tar.gz files: they are in the same location and have the same name as all the folders, so they must be the folders in a compressed format. Didn’t see that! That should be easy to grab now.

Edit: yep, the tar.gz files are simply the folders in a compressed format. There are a couple of folders that don’t have tar.gz files, but those can be done by hand.

-> for folder actions get actions.tar.gz
-> for folder ducks get ducks.tar.gz
-> for folder scripts get scripts.tar.gz
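Since each folder is mirrored by a tar.gz of the same name, unpacking an archive reproduces the folder exactly. Here’s a quick local sketch of that pack/unpack round trip (the folder and file names are made up for illustration; only the tar usage itself is the point):

```shell
# Build a throwaway folder standing in for one of the ED folders
mkdir -p demo/actions
echo "pose data" > demo/actions/walk.txt

# Pack the folder the way the server-side tarballs are packed
tar -czf actions.tar.gz -C demo actions

# Unpack it somewhere else and confirm it matches the original folder
mkdir -p unpacked
tar -xzf actions.tar.gz -C unpacked
diff -r demo/actions unpacked/actions && echo "identical"
```

So once the tarballs are downloaded, a plain `tar -xzf` per file rebuilds the directory tree.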

Use the add-on called ScrapBook for Firefox; I use it to capture entire websites. You can make it capture anything you want, from JPEG, AVI, ZIP and so on, and just select how many links deep you want to go.

I don’t know what OS you’re using, but regardless… the small command-line application “wget” exists for just about every platform. It comes by default on Linux (and I think OS X), but you’ll need to download it for Windows. Then, once you do, open up a terminal (or the DOS prompt on Windows) and type this:

wget -x -r --level=15 http://URL_to_ED_Production_Files

I don’t know how deep the directories of the production files are (or what the URL to the top directory is), but just replace the 15 with however many levels you need/want. Then, of course, replace the URL with the real one and hit enter.

Whatever directory your command line is in when you start wget is where it’s going to stick all the files, so make sure to cd to whatever directory you want to store them in before you start. I haven’t used wget for a while, but I think that syntax should do the job. If not, post again.

Hey guys, I think that getting the tar.gz files and then unpacking them is the easiest way, as the content is actually duplicated in the main folders. I.e., for every folder, there is a tar.gz in the same location, and each tar.gz, when unpacked, gives you exactly what’s in the folder. Some folders, such as the textures folder, have subfolders, each with their own tarballs.

If I did a wget it would work, I believe, but it would get everything twice. This is something to take note of if you are planning on downloading the production files yourself. For example, the machine textures tarball is 1.7 GB in size. Definitely not something you want to download twice!



I will move the tar.gz files to new folders DVD1_packed and DVD2_packed, keeping the structure of the DVD, so please look in the subfolders.
I will split the textures tarball. (cat file.aa file.ab file.ac … > file.tar.gz)
Due to the success of Elephants Dream there is a limit of 5 connections per IP.
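The cat line above is just concatenation: the pieces (file.aa, file.ab, …) glued back together in order give the original tarball, byte for byte. A small local sketch of the split/rejoin round trip (the piece size and file names here are made up, the real textures tarball is of course much larger):

```shell
# Stand-in for the big textures tarball (100 KB of random bytes)
head -c 100000 /dev/urandom > textures.tar.gz

# Split it into ~40 KB pieces: textures.aa, textures.ab, textures.ac
split -b 40000 textures.tar.gz textures.

# Rejoin the pieces in order and check the result is byte-identical
cat textures.aa textures.ab textures.ac > rejoined.tar.gz
cmp textures.tar.gz rejoined.tar.gz && echo "match"
```

The only thing that matters is listing the pieces in alphabetical order; a shell glob like `cat textures.?? > rejoined.tar.gz` does that automatically.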

Start here: http://video.blendertestbuilds.de/download.blender.org/ED/

wget has a --reject option

wget -m -np -R .gz    http://myprecious

would recursively mirror the location WITHOUT the tar.gz’s

wget also has a --accept option

wget -m -np -A .gz    http://myprecious

would recursively mirror ONLY the tar.gz’s in the location

get to know your wget options:

wget --help > wget_help.txt

(works on windows, too)

You can also make a list of URLs you want to download, e.g.:
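A download.txt for the ED site might look like this, one URL per line (these particular URLs are hypothetical, combining the base URL mentioned earlier in the thread with the tarball names, just to show the format):

```
http://video.blendertestbuilds.de/download.blender.org/ED/actions.tar.gz
http://video.blendertestbuilds.de/download.blender.org/ED/ducks.tar.gz
http://video.blendertestbuilds.de/download.blender.org/ED/scripts.tar.gz
```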



and then just use:

wget -i download.txt

and admire the goods that internet sends your way …

Btw., if for any reason you feel vigilante-ish, you could theoretically DoS the abovementioned site, but that would be a nefarious practice (strongly advised NOT to be performed), and so it won’t get covered in this brief tutorial.

Oooh, wget is snazzy. Quite clever that. Maybe a bit too complicated for your everyday average Joe internet user though heh.

Quite clever that.
Um … yeah. Reading and internalizing man wget saves you shitloads of installations of xyzwbhjfsdz plugin programs …

In fact, what knowing wget does to a person is liberate him/her from being just another internet slave into a free, clear-thinking, unburdened inhabitant of cyberspace.

I think the “easiest” way would be to buy a copy of either the ED DVD or Tony Mullen’s book and just copy the files off the disc.

Believe me, I would love for this to be the case. Unfortunately, it seems that the ED DVD is no longer available at the E-Shop and, I hate to say it, but some of the ED files on the book DVD are corrupt and nobody’s quite sure why. I suspect there is some limit on the size of a single file on a data DVD (ISO 9660, the usual data-DVD filesystem, caps individual files at 4 GiB). The ED files are archived in two separate RAR files on the DVD, both pretty huge for single files, and at least one of them has problems.

So, yeah, I’d go with wget.