AccessDeniedException: 403 when trying to copy file

Posted 2019-05-22 01:49

I'm trying to perform manually the steps I need to automate, both to understand how the process works and to make sure I get all the commands right. But when I run the command:

gsutil cp file_name gs://bucket_name/

I get the following error:

AccessDeniedException: 403 Insufficient OAuth2 scope to perform this operation.

It was supposed to be a very simple thing, but I can't get it right. I'm used to doing this in AWS, but I can't manage the same in Google Cloud. Does anyone know how to get past it?

5 Answers
来,给爷笑一个
#2 · 2019-05-22 02:19

Hey @Marcus Vinicius Melo,

I was the project owner and still faced the same issue. After I installed the GCS Python package, it was solved. Try pip install google-cloud-storage
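For reference, the equivalent of the gsutil copy using that Python client library looks roughly like this. This is a sketch: the bucket and file names are placeholders, and credentials (e.g. a service account with write access) must already be configured.

```python
# Sketch of "gsutil cp file_name gs://bucket_name/" using the
# google-cloud-storage client library. Names are placeholders.

def upload_file(bucket_name, source_path, dest_name):
    # Imported inside the function so the sketch can be read even
    # without the package installed (pip install google-cloud-storage).
    from google.cloud import storage

    client = storage.Client()               # uses the default credentials
    bucket = client.bucket(bucket_name)     # bucket handle, no API call yet
    blob = bucket.blob(dest_name)           # object handle inside the bucket
    blob.upload_from_filename(source_path)  # performs the actual upload

# Example call (needs valid credentials and an existing bucket):
# upload_file("bucket_name", "file_name", "file_name")
```

Note that the upload will still raise a 403 if the account behind the credentials lacks write access, which is why the scope fixes in the other answers matter.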

啃猪蹄的小仙女
#3 · 2019-05-22 02:25

This error has nothing to do with read/write permissions on the bucket; it is caused by the access scopes assigned to your Compute Engine instance. In other words, you first need to grant the instance the right to use GCP services at all; read/write permissions on a particular object are what you set up next.

Previously it was necessary to recreate the instance, but now it is enough to stop it and then run the command below, for example from Cloud Shell:

 gcloud compute instances set-service-account myinstance \
     --service-account 192893587797-compute@developer.gserviceaccount.com \
     --scopes cloud-platform

Of course, replace myinstance and 192893587797-compute@developer.gserviceaccount.com with your own instance name and service account accordingly.
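The full sequence is stop, change scopes, start. A sketch, assuming the instance is called myinstance in zone us-central1-a (both are placeholders to replace, as is the service account email):

```shell
# Scopes can only be changed while the instance is stopped.
gcloud compute instances stop myinstance --zone us-central1-a

# Attach the service account with the broad cloud-platform scope.
gcloud compute instances set-service-account myinstance \
    --zone us-central1-a \
    --service-account 192893587797-compute@developer.gserviceaccount.com \
    --scopes cloud-platform

# Start the instance again; gsutil on the VM should now be able to write.
gcloud compute instances start myinstance --zone us-central1-a
```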

疯言疯语
#4 · 2019-05-22 02:27

I talked to a friend of mine who helped me solve this; here is the solution:

My GCE VM had read-only permission for Google Cloud Storage. I found on the web that you can only set the access scopes when you create the GCE VM, and that to get around this I would need to delete this VM and create a new one. That would work, but I didn't want to lose everything I had done on my current VM.

The other solution, which worked for me, was to create a service account (Google Cloud Platform > IAM & admin > Service Accounts). You just need to give it the name you want and select 'Furnish a new private key'. Then you should be good to go. After that I could copy files from the GCE VM to Storage and do everything else I needed.
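The console steps above can also be sketched on the command line. Everything here is a placeholder: the project id my-project, the account name gcs-writer, the key file name, and the role chosen for it.

```shell
# Create the service account (name and display name are placeholders).
gcloud iam service-accounts create gcs-writer --display-name "GCS writer"

# Give it permission to write objects (role choice is an assumption;
# pick the narrowest role that fits your case).
gcloud projects add-iam-policy-binding my-project \
    --member serviceAccount:gcs-writer@my-project.iam.gserviceaccount.com \
    --role roles/storage.objectAdmin

# Download a private key ("Furnish a new private key" in the console).
gcloud iam service-accounts keys create key.json \
    --iam-account gcs-writer@my-project.iam.gserviceaccount.com

# Make gcloud/gsutil on the VM use this key instead of the default account.
gcloud auth activate-service-account --key-file key.json

# The original copy should now succeed.
gsutil cp file_name gs://bucket_name/
```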

何必那么认真
#5 · 2019-05-22 02:39

Hey @Marcus Vinicius Melo, you just need to enable the Storage Transfer API in GCP; then gsutil from a GCE instance to a bucket will work.

Ridiculous、
#6 · 2019-05-22 02:43

It looks like the account you're using for this copy doesn't have permission to write objects to the bucket_name bucket.

If you're doing this on a GCE VM and using its default service account, make sure that you selected the correct access scopes when creating the VM -- the default scopes include read-only access to GCS. You can check this by logging into the VM and using curl to query the GCE metadata server:

$ curl -H 'Metadata-Flavor: Google' "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"
[...]
https://www.googleapis.com/auth/devstorage.read_only
[...]
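The same check can be done without logging into the VM, e.g. from Cloud Shell. Instance name and zone below are placeholders:

```shell
# List the scopes attached to the instance's service account.
# If only devstorage.read_only appears, writes to GCS will be denied.
gcloud compute instances describe myinstance \
    --zone us-central1-a \
    --format="value(serviceAccounts[].scopes)"
```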