Wget: download a list of URLs and set a different file path for each

Published 2019-04-12 18:17

Question:

I read here http://www.thegeekstuff.com/2009/09/the-ultimate-wget-download-guide-with-15-awesome-examples/ that if I want to download multiple URLs, I can save them in a text file like this:

$ cat > download-file-list.txt
URL1
URL2
URL3
URL4

and use wget like this:

$ wget -i download-file-list.txt

However, suppose I want each URL saved to its own directory on my drive, like this:

URL1 -> Users/Downloads/Tech
URL2 -> Users/Downloads/Fashion
URL3 -> Users/Downloads/Cooking
URL4 -> Users/Downloads/News

How do I accomplish this? Is it possible to manually set the directory for each URL in the text file and have wget read that and automatically know where to save each file? Are there any other methods to achieve what I need? I'm trying to set up an automated download process using cron jobs later on.

Thanks

Answer 1:

Then you can't use that method. The best approach would be a bash, Perl, or Python script that reads a file in some format (perhaps "URL directory" pairs) and downloads each URL into the specified directory. You'll need to put some smarts outside of wget to get that behavior.
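As a sketch of that approach (the `download-dirs.txt` file name, the "URL directory" line format, and the helper names are assumptions, not part of the answer), a small Python script could read one `URL directory` pair per line and fetch each URL into its directory:

```python
import os
import urllib.request
from urllib.parse import urlparse

def target_path(url, directory):
    """Build the save path: the directory from the list plus the
    file name taken from the URL's path (default: index.html)."""
    name = os.path.basename(urlparse(url).path) or "index.html"
    return os.path.join(directory, name)

def download_all(list_path):
    """Read 'URL directory' pairs and save each URL into its directory."""
    with open(list_path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blank lines and comments
            url, directory = line.split(None, 1)
            os.makedirs(directory, exist_ok=True)
            urllib.request.urlretrieve(url, target_path(url, directory))
```

Called from cron as `python3 download.py`, with `download-dirs.txt` containing lines such as `http://example.com/article.pdf Users/Downloads/Tech`, this keeps the URL-to-directory mapping in one editable file, which matches the question's goal.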



Answer 2:

Here is a batch script that can do this, given a list like the following:

$ cat > urllist.txt
URL1 file1
URL2 file2
URL3 file3
URL4 file4

$ while read -r url file; do
    wget -c -O "$file" "$url"
done < urllist.txt

The URLs must not contain spaces (encode them as %20); the file names, however, may contain spaces.
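The same loop can be adapted to the per-directory layout from the question by storing `URL directory` pairs and letting wget's `-P` option save into the directory while keeping the file name from the URL. This is a sketch; `urllist2.txt` is an assumed file name:

```shell
#!/bin/sh
# Sketch: one "URL directory" pair per line, e.g.:
#   http://example.com/article.pdf Users/Downloads/Tech
list=urllist2.txt
[ -f "$list" ] || exit 0            # nothing to do without a list
while read -r url dir; do
    mkdir -p "$dir"                 # create the target directory if missing
    wget -c -P "$dir" "$url"        # -P: save into dir; -c: resume partial downloads
done < "$list"
```

Unlike `-O`, which names the output file explicitly, `-P` only sets the directory prefix, so the saved file keeps its original name from the URL.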



Tags: wget