Download Files from a Website with Python [closed]

Published 2019-09-19 15:01

Question:

I have about 300 small files that I need to download from a website. All of them are located in one directory. The files have different sizes and different extensions. I don't want to open each one in my web browser and then click 'Save As'. I want to give my list to Python and have it download and save each file in a directory. If Python can simply download the whole directory, that would be even better.

Answer 1:

This is all detailed here. I would favor using Requests as it's generally excellent, but urllib2 is in the standard library, so it doesn't require installing a new package.
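To illustrate the Requests approach for the original question (a list of ~300 URLs), here is a minimal sketch. The function names `filename_from_url` and `download_all` are my own illustrative choices, not from the answer, and the example assumes the `requests` package is installed:

```python
import os
from urllib.parse import urlsplit

import requests  # third-party: pip install requests


def filename_from_url(url):
    """Derive a local filename from the last path segment of a URL."""
    return os.path.basename(urlsplit(url).path)


def download_all(urls, dest_dir="downloads"):
    """Download every URL in `urls` into `dest_dir` (created if missing)."""
    os.makedirs(dest_dir, exist_ok=True)
    for url in urls:
        response = requests.get(url, timeout=30)
        response.raise_for_status()  # fail loudly on 404s and the like
        target = os.path.join(dest_dir, filename_from_url(url))
        with open(target, "wb") as f:
            f.write(response.content)
```

You would call it as `download_all(my_list_of_urls)`; for 300 small files a plain sequential loop like this is usually fast enough.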



Answer 2:

If you're on Python 3.3 or later, you're looking for urllib.request:

import urllib.request

url = "https://www.google.com/images/srpr/logo4w.png"
with urllib.request.urlopen(url) as response:
    with open("my_image.png", "wb") as file_out:
        file_out.write(response.read())

You should now have a file in your working directory called "my_image.png".
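Extending that stdlib approach to the asker's list of files, one option is `urllib.request.urlretrieve`, which saves a URL straight to disk. This is a sketch under my own assumptions: `download_list` is a hypothetical helper name, and it assumes every file lives under one base directory URL, as the question describes:

```python
import os
import urllib.request
from urllib.parse import urljoin


def download_list(base_url, names, dest_dir="downloads"):
    """Download each named file from base_url into dest_dir, stdlib only.

    base_url should end with a slash, e.g. "https://example.com/files/".
    """
    os.makedirs(dest_dir, exist_ok=True)
    for name in names:
        file_url = urljoin(base_url, name)  # joins base dir and filename
        urllib.request.urlretrieve(file_url, os.path.join(dest_dir, name))
```

Called as `download_list("https://example.com/files/", my_names)`, this keeps each file's original name, which matches the "save each file in a directory" requirement without any third-party installs.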