I want to download an image accessible from this link: https://www.python.org/static/apple-touch-icon-144x144-precomposed.png into my local system. Now, I'm aware that the curl command can be used to download remote files through the terminal, so I entered the following in my terminal in order to download the image into my local system:

curl https://www.python.org/static/apple-touch-icon-144x144-precomposed.png

However, this doesn't seem to work, so obviously there is some other way to download images from the Internet using curl. What is the correct way to download images using this command?
For those who got permission denied on the save operation: curl needs write access to wherever it is saving the file, so either save it into a directory your user can write to or run the command with sudo.
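A minimal sketch of that fix, assuming the goal is simply to drop the image into a folder you own (~/Downloads is only an example; the -O option is explained further down):

cd ~/Downloads
curl -O https://www.python.org/static/apple-touch-icon-144x144-precomposed.png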
curl without any options will perform a GET request: it simply returns the data from the specified URI to standard output, and does not save the file to your local machine. When you run the command from the question, you will receive raw binary data dumped into your terminal. In order to save it, redirect the output into a file so that the raw image data is stored on disk.
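A sketch of that redirect (image.png is just an example name for the output file):

curl https://www.python.org/static/apple-touch-icon-144x144-precomposed.png > image.png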
An easier way, though, is just to use wget.
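For example (wget saves the file under its remote name in the current directory, so no output name is needed):

wget https://www.python.org/static/apple-touch-icon-144x144-precomposed.png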
For those who don't have wget or don't want to install it, curl -O (capital "O", not a zero) will do the same thing as wget. For example, my old netbook doesn't have wget, and it's a 2.68 MB install that I don't need.

If you want to keep the original file name, use the uppercase -O:
curl -O https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
If you want to save the remote file under a different name, use the lowercase -o:
curl -o myPic.png https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
To download several images at once, create a new file called files.txt and paste the URLs, one per line, then feed that file to curl, as sketched below.
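One way to do that, using xargs to invoke curl -O once per URL in files.txt (any line-by-line loop over the file would work just as well):

xargs -n 1 curl -O < files.txt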
source: https://www.abeautifulsite.net/downloading-a-list-of-urls-automatically