I'm having a hard time copying files over to my Google Compute Engine instance. I am using an Ubuntu server on Google Compute Engine. I'm doing this from my OS X terminal, and I am already authorized using gcloud.
local:$ gcloud compute copy-files /Users/Bryan/Documents/Websites/gce/index.php example-instance:/var/www/html --zone us-central1-a
Warning: Permanently added '<IP>' (RSA) to the list of known hosts.
scp: /var/www/html/index.php: Permission denied
ERROR: (gcloud.compute.copy-files) [/usr/bin/scp] exited with return code [1].
Insert root@ before the instance name:
local:$ gcloud compute copy-files /Users/Bryan/Documents/Websites/gce/index.php root@example-instance:/var/www/html --zone us-central1-a
The reason this doesn't work is that your username does not have permissions on the GCE VM instance and so cannot write to /var/www/html/.

Note that since this question is about Google Compute Engine VMs, you cannot SSH directly to a VM as root, nor can you copy files directly as root, for the same reason: gcloud compute copy-files uses scp, which relies on ssh for authentication.
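To see why the permission checks are identical, note that under the hood gcloud compute copy-files shells out to your local scp binary with the SSH key that gcloud generates for you. Roughly, it runs the equivalent of the following (a sketch; the exact flags gcloud passes may differ):

# approximately what gcloud runs on your behalf; <USERNAME> and <IP> are placeholders
scp -i ~/.ssh/google_compute_engine /Users/Bryan/Documents/Websites/gce/index.php <USERNAME>@<IP>:/var/www/html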
Possible solutions:
1. (also suggested by Faizan in the comments) this solution will require two steps every time (a complete two-step sample appears after this list):
   - use gcloud compute copy-files to transfer files/directories to a location your user can write to, e.g., /tmp or /home/$USER
   - log in to the GCE VM via gcloud compute ssh or via the SSH button on the console and copy using sudo to get proper permissions:
# note: sample command; adjust paths appropriately
sudo cp -r $HOME/html/* /var/www/html
2. this solution is one step with some prior prep work:
   - one-time setup: give your username write access to /var/www/html directly; this can be done in several ways; here's one approach (a group-based alternative is sketched after this list):
# make the HTML directory owned by current user, recursively
sudo chown -R $USER /var/www/html
now you can run the copy in one step:
gcloud compute copy-files /Users/Bryan/Documents/Websites/gce/index.php example-instance:/var/www/html --zone us-central1-a
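For solution 1, a complete two-step sample could look like this (paths and instance name are taken from the question; adjust to your own setup):

# step 1: copy to a directory your user can write to (here, the remote home directory)
gcloud compute copy-files /Users/Bryan/Documents/Websites/gce/index.php example-instance:~/ --zone us-central1-a
# step 2: on the VM, move the file into place with sudo
gcloud compute ssh example-instance --zone us-central1-a --command "sudo cp ~/index.php /var/www/html"

For solution 2, if you would rather not change the owner of /var/www/html, one common group-based alternative is the following sketch (www-data assumes the default Apache/nginx group on Ubuntu):

# one-time: add your user to the www-data group and make the directory group-writable
sudo usermod -a -G www-data $USER
sudo chgrp -R www-data /var/www/html
sudo chmod -R g+w /var/www/html
# log out and back in for the new group membership to take effect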
I use a bash script to copy from my local machine to a writable directory on the remote GCE machine, then move the files into place with sudo over ssh:

#!/bin/bash
# Push files to a user-writable temp directory on the VM,
# then move them into place with sudo.
SRC="/cygdrive/d/mysourcedir"
TEMP="~/incoming"   # the ~ is expanded on the remote side, in the remote user's home
DEST="/var/my-disk1/my/target/dir"
# You also need to set GCE_USER and GCE_INSTANCE, e.g.:
GCE_USER="your-gce-username"
GCE_INSTANCE="your-instance-name"

echo "=== Pushing data from $SRC to $DEST in two simple steps"
echo "=== 1) Copy to a writable temp directory in user home"
gcloud compute copy-files "$SRC"/*.* "${GCE_USER}@${GCE_INSTANCE}:$TEMP"
echo "=== 2) Move with 'sudo' to destination"
gcloud compute ssh "${GCE_USER}@${GCE_INSTANCE}" --command "sudo mv $TEMP/*.* $DEST"
In my case I don't want to chown the target dir as this causes other problems with other scripts ...
UPDATE: gcloud compute copy-files is deprecated. Use instead:

$ gcloud compute scp example-instance:~/REMOTE-DIR ~/LOCAL-DIR \
    --zone us-central1-a
More info:
https://cloud.google.com/sdk/gcloud/reference/compute/scp
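The same command also covers the local-to-VM direction from the original question, for example (a sketch reusing the question's paths):

$ gcloud compute scp /Users/Bryan/Documents/Websites/gce/index.php example-instance:/tmp \
    --zone us-central1-a

Whole directories can be copied by adding the --recurse flag.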
I had the same problem and didn't get it to work using the methods suggested in the other answers. What finally worked was to explicitly pass in my user when copying the file, as indicated in the official documentation. The important part is the USER@ in
gcloud compute scp [[USER@]INSTANCE:]SRC [[[USER@]INSTANCE:]SRC …] [[USER@]INSTANCE:]DEST
In my case I could initially transfer files by typing:
gcloud compute scp instance_name:~/file_to_copy /local_dir
but after I got the permission denied error, I got it working by typing instead:
gcloud compute scp my_user_name@instance_name:~/file_to_copy /local_dir
where the username in my case was the one I was logged in to Google Cloud with.
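If you are not sure which username to pass, you can check which account gcloud is authorized with; the SSH username is typically derived from the part of the account before the @ (with characters such as dots replaced by underscores):

gcloud config list account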
This worked for me:

gcloud compute scp username@instance_name:~/source_dir /home/username/destination_dir --recurse

Syntax: gcloud compute scp SOURCE DESTINATION

NOTE: run it without root.