Is there a way to download a publicly-viewable Google Drive url via curl or wget? For example, being able to do something like:
curl -O myfile.xls https://drive.google.com/uc?export=download&id=1Wb2NfKTQr_dLoFJH0GfM0cx-t4r07IVl
Note, I'm looking to do this on a publicly-viewable file without having to sign into my Google account (or have someone else sign into their account, etc.).
If helpful, the file metadata I have is:
"kind": "drive#file",
"id": "1Wb2NfKTQr_dLoFJH0GfM0cx-t4r07IVl",
How about this method? When the file is large, Google returns a confirmation code instead of the file itself, and you need that code to download the file. When such a large file is requested using curl, you can see the code in the returned HTML as follows.
<a id="uc-download-link" class="goog-inline-block jfk-button jfk-button-action" href="/uc?export=download&confirm=ABCD&id=### file ID ###">download</a>
The confirm=ABCD query parameter is required for downloading the file. The same code is also included in the cookie, where you can see it as follows.
#HttpOnly_.drive.google.com TRUE /uc TRUE ##### download_warning_##### ABCD
In this case, "ABCD" is the code. In order to retrieve the code from the cookie and download the file, you can use the following script.
Sample script:
#!/bin/bash
fileid="### file id ###"
filename="MyFile.csv"
curl -c ./cookie -s -L "https://drive.google.com/uc?export=download&id=${fileid}" > /dev/null
curl -Lb ./cookie "https://drive.google.com/uc?export=download&confirm=$(awk '/download/ {print $NF}' ./cookie)&id=${fileid}" -o "${filename}"
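For reference, the awk call in the script simply prints the last whitespace-separated field of the cookie line that mentions "download". A quick check, using a cookie line with placeholder values modeled on the one shown above:

```shell
# Sample cookie line in Netscape cookie-file format (fields are placeholders)
cookie_line='#HttpOnly_.drive.google.com TRUE /uc TRUE 0 download_warning_12345 ABCD'

# Same extraction the script uses: last field of the line matching /download/
code=$(printf '%s\n' "$cookie_line" | awk '/download/ {print $NF}')
echo "$code"   # prints ABCD
```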
If this was not useful for you, I'm sorry.
You need to use the -L switch to make curl follow redirects, and the correct switch for setting the filename is -o (lowercase). You should also quote the URL, since it contains an &:
curl -L -o myfile.xls "https://drive.google.com/uc?export=download&id=0B4fk8L6brI_eX1U5Ui1Lb1FpVG8"
Simplest and best way (with a real Google Drive file example)
Install gdown using pip:
pip install gdown
Let's say I wish to download cnn_stories.tgz from Google Drive.
Download link: https://drive.google.com/uc?export=download&id=0BwmD_VLjROrfTHk4NFg2SndKcjQ
Please note the id URL parameter, 0BwmD_VLjROrfTHk4NFg2SndKcjQ, in the link.
That's it! Download the file using gdown
gdown --id 0BwmD_VLjROrfTHk4NFg2SndKcjQ --output cnn_stories.tgz
TLDR: gdown --id {gdrive_file_id} --output {file_name}
Command-line args:
--id: Google Drive file ID
--output: output file name
I've just checked @tanaike's answer and it works like a charm. But the solution proposed by @Martin Broadhurst and accepted by the question author doesn't, because Google shows a warning about a virus scan that needs to be processed, so a script is required.
I'd like to vote for @tanaike's answer, but don't have enough reputation to do so :)
Additionally, for those who don't know how to get the ID of a file on Google Drive, I'd like to share this pretty simple knowledge:
- go to your Google Drive
- right-click on the file and select the "Share" option
- choose public sharing for anyone who has the link, with no login required
- copy the URL:
https://drive.google.com/file/d/1FNUZiDDDDDDSSSSSSAAAAAdv42Qgzb6n8d/view?usp=sharing
- paste it into a notepad
- the ID is this part of the URL:
1FNUZiDDDDDDSSSSSSAAAAAdv42Qgzb6n8d
Enjoy!
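For scripting, the ID can also be cut out of the share URL automatically. A minimal sketch with sed, assuming the .../file/d/&lt;ID&gt;/view?... URL shape shown above:

```shell
# Share URL in the /file/d/<ID>/view format (ID is the placeholder from above)
url='https://drive.google.com/file/d/1FNUZiDDDDDDSSSSSSAAAAAdv42Qgzb6n8d/view?usp=sharing'

# Strip everything up to /d/ and everything from the next / onward
fileid=$(printf '%s\n' "$url" | sed 's|.*/d/||; s|/.*||')
echo "$fileid"   # prints 1FNUZiDDDDDDSSSSSSAAAAAdv42Qgzb6n8d
```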
Simply
wget --no-check-certificate -r 'https://docs.google.com/uc?export=download&id=FILEID' -O DESTINATION_FILENAME
As of 18 Nov 2019, to use wget to download a file from Google Drive, I used the following method. For this method, we need to know whether our file falls in the small or large category. I could not figure out the exact size that separates small from large, but I suppose it is somewhere around 100 MB. You can always try either of the two methods on your file, as one will only work for small files and the other for large ones.
Basic Steps to be followed
Step 1 Make your file shared as "Accessible to anyone having the internet". This can be done by right-clicking the file --> clicking the Share option --> clicking the Advanced option --> changing access to "Public on the web".
Step 2 Save it and click Done
Step 3 Again right click on the file and click on "Get Shareable Link". This will copy the link to clipboard.
Step 4 Copy everything after ?id= till the end and save it to a notepad file. This is your FILE_ID, which is used below.
Step 5 Follow the steps below, based on file size, after performing the common steps above.
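The ?id= extraction above can also be done directly in the shell with parameter expansion. A sketch, assuming a link of the form ...?id=FILE_ID with nothing after the ID (the link below is a made-up example of that shape):

```shell
# Shareable link of the ?id= form (the ID here is a placeholder)
link='https://drive.google.com/open?id=0B4fk8L6brI_eX1U5Ui1Lb1FpVG8'

# ${link#*id=} drops the shortest prefix ending in "id=", leaving the FILE_ID
fileid=${link#*id=}
echo "$fileid"   # prints 0B4fk8L6brI_eX1U5Ui1Lb1FpVG8
```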
Small Files
Step 1 Use the command:
wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILE_ID' -O FILE_NAME_ALONG_WITH_SUFFIX
FILE_ID should be copied from the step above, and FILE_NAME_ALONG_WITH_SUFFIX is the name under which you want the file saved on your system/server. Do not forget to include the suffix (.zip, .txt, etc.).
Step 2 Run the command. It may show a "Will not apply HSTS" error, but that's OK; your file will be downloaded.
Large Files
Step 1 Use the command
wget --no-check-certificate --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILE_ID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILE_ID" -O FILE_NAME_ALONG_WITH_SUFFIX && rm -rf /tmp/cookies.txt
Change FILE_ID in two locations and FILE_NAME_ALONG_WITH_SUFFIX once.
Step 2 Execute the command. It may give the same error as mentioned above, but that's OK.
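For reference, the inner sed in the large-file command just captures the value of confirm= from the virus-scan warning page. A quick check on a made-up fragment of that HTML (the token ABCD is a placeholder):

```shell
# A fragment of the warning page containing the confirm token
html='<a href="/uc?export=download&confirm=ABCD&id=FILE_ID">Download anyway</a>'

# Same extraction the command above uses: capture the confirm token
token=$(printf '%s\n' "$html" | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')
echo "$token"   # prints ABCD
```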
Hope it helps..
curl gdrive.sh | bash -s 0B4fk8L6brI_eX1U5Ui1Lb1FpVG8
0B4fk8L6brI_eX1U5Ui1Lb1FpVG8 is the file ID.