I'm trying to download some public data files. I screenscrape to get the links to the files, which all look something like this:
ftp://ftp.cdc.gov/pub/Health_Statistics/NCHS/nhanes/2001-2002/L28POC_B.xpt
I can't find any documentation on the Requests library website.
Thanks in advance!
Use urllib2. For more specifics, check out this example from docs.python.org. Here's a snippet from the tutorial that may help:
urllib2.urlopen handles FTP links.

Try using the wget library for Python. You can find the documentation for it here.
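Note that urllib2 is Python 2 only; in Python 3 the same `urlopen` call lives in `urllib.request`, and it speaks FTP as well as HTTP(S). A minimal sketch (the `download` helper name is mine, not from any library):

```python
import shutil
import urllib.request  # urllib2's urlopen lives here in Python 3


def download(url, dest_path):
    """Stream a file at `url` to disk; urlopen supports http, https, and ftp URLs."""
    with urllib.request.urlopen(url) as response, open(dest_path, "wb") as out:
        shutil.copyfileobj(response, out)  # copy in chunks, no full read into memory


# e.g.
# download("ftp://ftp.cdc.gov/pub/Health_Statistics/NCHS/nhanes/2001-2002/L28POC_B.xpt",
#          "L28POC_B.xpt")
```

Because `copyfileobj` streams in chunks, this works fine for the larger NHANES transport files without loading them fully into memory.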
urlretrieve did not work for me, and the official documentation says it might become deprecated at some point in the future.
As several folks have noted, requests doesn't support FTP, but Python has other libraries that do. If you want to keep using the requests library, there is a requests-ftp package that adds FTP capability to it. I've used this library a little, and it does work, but the docs are full of warnings about code quality. As of 0.2.0 the docs say: "This library was cowboyed together in about 4 hours of total work, has no tests, and relies on a few ugly hacks".
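If you'd rather not depend on an add-on with those caveats, the standard library's `ftplib` can fetch the same files directly. A sketch, assuming anonymous login (the default for these public CDC files); the `fetch_ftp` helper name is mine:

```python
import posixpath
from ftplib import FTP
from urllib.parse import urlparse


def fetch_ftp(url, dest_path):
    """Download one file from an FTP URL using only the standard library."""
    parts = urlparse(url)  # e.g. netloc='ftp.cdc.gov', path='/pub/.../L28POC_B.xpt'
    with FTP(parts.netloc) as ftp:
        ftp.login()  # anonymous login, as for public data servers
        with open(dest_path, "wb") as out:
            ftp.retrbinary("RETR " + parts.path, out.write)


url = "ftp://ftp.cdc.gov/pub/Health_Statistics/NCHS/nhanes/2001-2002/L28POC_B.xpt"
# fetch_ftp(url, posixpath.basename(urlparse(url).path))  # writes L28POC_B.xpt
```

`retrbinary` hands each received block to the callback (`out.write` here), so large files are streamed to disk rather than buffered in memory.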