I read here http://www.thegeekstuff.com/2009/09/the-ultimate-wget-download-guide-with-15-awesome-examples/ that if I want to download multiple URLs, I can save them in a text file like this:
$ cat > download-file-list.txt
URL1
URL2
URL3
URL4
And then use wget like this:
$ wget -i download-file-list.txt
However, suppose I want each URL to be saved in its own directory on my drive, like this:
URL1 -> Users/Downloads/Tech
URL2 -> Users/Downloads/Fashion
URL3 -> Users/Downloads/Cooking
URL4 -> Users/Downloads/News
How do I accomplish this? Is it possible to set the directory for each URL manually in the text file and have wget read that and automatically know where to save each file? Are there any other ways to achieve what I need? I'm trying to set up an automated download process using cron jobs later on.
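For example, I imagine a small wrapper script around wget could do it. Here's a rough sketch of what I have in mind (the file names download-list.txt and download.sh are just placeholders, and I'm assuming one "directory URL" pair per line; wget's -P/--directory-prefix option sets where the file gets saved):

$ cat download-list.txt
Users/Downloads/Tech URL1
Users/Downloads/Fashion URL2
Users/Downloads/Cooking URL3
Users/Downloads/News URL4

$ cat download.sh
#!/bin/sh
# Read "directory URL" pairs and fetch each URL into its directory.
while read -r dir url; do
    mkdir -p "$dir"           # create the directory if it doesn't exist yet
    wget -P "$dir" "$url"     # -P (--directory-prefix) sets the save location
done < download-list.txt

Then the script could be scheduled from cron, e.g. to run daily at 2am:

0 2 * * * /path/to/download.sh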
Thanks