How to resume an FTP download at any point?

Posted 2019-02-03 18:49

Question:

I want to download a huge file from an FTP server in chunks of 50-100 MB each. At each point, I want to be able to set the starting offset and the length of the chunk. I won't have the previous chunks saved locally (i.e. I can't ask the program to "resume" the download).

What is the best way to go about this? I mostly use wget, but would something else be better?


Hi there! I'm really interested in a prebuilt/built-in function rather than using a library for this purpose. Since wget (and, I think, FTP itself) allows resuming downloads, I don't see why this would be a problem, but I can't figure it out from all the options!


Hi noinfection, I had a look at that and it wouldn't work. I don't want to keep the entire huge file at my end, just process it in chunks. FYI all: I'm having a look at "continue FTP download after reconnect", which seems interesting.
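The chunked scheme described in the question can be sketched in shell: compute each chunk's byte range and hand it to curl's `-r/--range` option. This assumes the FTP server supports restarting at an offset (the REST command); the URL, chunk size, and chunk count below are illustrative placeholders, not from the original post.

```shell
#!/bin/sh
# Sketch: fetch a huge remote file in fixed-size chunks by byte range.
# URL, chunk size, and chunk count are placeholders.
url="ftp://example.com/huge.bin"
chunk_size=$((50 * 1024 * 1024))   # 50 MB per chunk
total_chunks=3                     # however many chunks the file needs

i=0
while [ "$i" -lt "$total_chunks" ]; do
    start=$((i * chunk_size))
    end=$((start + chunk_size - 1))
    # Print rather than run, so the sketch has no network dependency;
    # remove 'echo' to perform the real transfers.
    echo curl -r "${start}-${end}" -o "chunk_${i}" "$url"
    i=$((i + 1))
done
```

Each chunk can be processed and discarded before the next one is fetched, which matches the "don't keep the previous chunks" requirement.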

Answer 1:

I'd recommend interfacing with libcurl from the language of your choice.



Answer 2:

Use wget with the -c option.

Extracted from the man page:

-c / --continue

Continue getting a partially-downloaded file. This is useful when you want to finish up a download started by a previous instance of Wget, or by another program. For instance:

               wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z

If there is a file named ls-lR.Z in the current directory, Wget will assume that it is the first portion of the remote file, and will ask the server to continue the retrieval from an offset equal to the length of the local file.
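Since wget derives the resume offset purely from the local file's length, one workaround (an assumption on my part, not from the answer) is to pre-create a sparse placeholder of the desired offset size with `truncate`; wget -c will then request the transfer from that offset. Note the first N local bytes stay as zeros, and wget offers no way to cap how much it downloads, so it runs to end of file.

```shell
#!/bin/sh
# Sketch of the offset-from-file-size behaviour described above.
# The filename matches the man-page example; the offset is a placeholder.
offset=$((50 * 1024 * 1024))     # resume 50 MB into the remote file
truncate -s "$offset" ls-lR.Z    # sparse placeholder of exactly that size

# wget -c would now ask the server to continue from byte $offset.
# Echoed here to keep the sketch offline; remove 'echo' to run it.
echo wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
```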



Answer 3:

For those who'd like to use command-line curl, here goes:

curl -u user:passwd -C - -o <partial_downloaded_file> ftp://<ftp_path>

(leave out -u user:passwd for anonymous access)
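Note that `-C -` resumes from the length of a local partial file, which the asker won't have. curl's `-r/--range` option instead requests an explicit byte span, so no previous data is needed; a minimal sketch, with placeholder offsets and host (not from the answer):

```shell
#!/bin/sh
# Fetch one bounded slice of the remote file without any local
# partial data. Offsets and host are illustrative placeholders.
start=104857600                   # begin 100 MB into the file
length=$((50 * 1024 * 1024))      # fetch a 50 MB slice
end=$((start + length - 1))

# Echoed to keep the sketch offline; drop 'echo' to run the transfer.
echo curl -u user:passwd -r "${start}-${end}" -o chunk.part "ftp://example.com/huge.bin"
```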



Tags: shell ftp wget