We have a deeply branched tree of keys in an Amazon S3 bucket containing a large number of files. I just discovered that while some files have two permission entries (visible in the AWS Management Console by clicking a file, then Properties -> Permissions), one for "everyone" and one for a specific user, other files have only the single entry for that user. As a result, we're having trouble downloading those files to Amazon EC2 instances using boto or curl.
What I need to do is go over all files in the bucket and inspect them. I know how to get the full list of keys for a prefix. Can I use boto to extract the permissions for a key, and is there a standard way of testing whether a grant applies to everyone or to a specific user, and what permission it carries?
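For reference, this is roughly how we list the keys today (a minimal sketch; the bucket name and prefix are placeholders):

    import boto

    conn = boto.connect_s3()
    bucket = conn.get_bucket('mybucket')  # placeholder bucket name

    # Iterate over every key under a given prefix (placeholder prefix)
    for key in bucket.list(prefix='some/prefix/'):
        print(key.name)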
Also, once I determine if a key has restrictive permissions, can I programmatically change them by adding read permissions to "everyone"?
Thx
Here is some Python code, using boto, that looks through all of the keys in a bucket. If a key does not allow "everyone" to read its contents, it adds public-read permission to that key:
import boto

# Canonical URI identifying the "AllUsers" (everyone) group in S3 ACL grants
all_users = 'http://acs.amazonaws.com/groups/global/AllUsers'

conn = boto.connect_s3()
bucket = conn.get_bucket('mybucket')

for key in bucket:
    readable = False
    # Fetch the object's ACL and look for a READ grant to the AllUsers group
    acl = key.get_acl()
    for grant in acl.acl.grants:
        if grant.permission == 'READ':
            if grant.uri == all_users:
                readable = True
    if not readable:
        # Add a public-read grant so anyone can read this object
        key.make_public()
This code has not been thoroughly tested, so try it out on a small prefix first. Be clear that the net result is to make ALL of the objects in the bucket readable by anyone. Also keep in mind that this script fetches the current ACL of every object in the bucket, so if there are millions of objects, that's millions of requests, which can take a lot of time and has some cost associated with it.
Another approach would be to just call make_public() on every key in the bucket, regardless of its current ACL.
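A minimal sketch of that blunter approach, using the same placeholder bucket name; it skips the ACL check entirely and issues one PUT ACL request per key:

    import boto

    conn = boto.connect_s3()
    bucket = conn.get_bucket('mybucket')  # placeholder bucket name

    # Unconditionally grant public-read on every object; no ACL is fetched first
    for key in bucket:
        key.make_public()

This avoids the GET ACL request per object, but it still makes one request per key and rewrites ACLs that were already public.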