upload a directory to s3 with boto

Published 2019-04-07 07:04

Question:

I am already connected to the instance and I want to upload the files that are generated from my python script directly to S3. I have tried this:

import boto
s3 = boto.connect_s3()
bucket = s3.get_bucket('alexandrabucket')
from boto.s3.key import Key
key = bucket.new_key('s0').set_contents_from_string('some content')

but this creates a single key s0 with the content "some content", whereas I want to upload the directory s0 itself to my bucket.

I had a look also to s3put but I didn't manage to get what I want.

Answer 1:

There is nothing in the boto library itself that would allow you to upload an entire directory. You could write your own code to traverse the directory using os.walk or similar and upload each individual file using boto.

There is a command line utility in boto called s3put that could handle this, or you could use the AWS CLI tool, which has many features that let you upload entire directories or even sync an S3 bucket with a local directory, or vice versa.
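The traversal described above can be sketched as follows. This is a minimal illustration, not code from the answer: the function names are made up, it assumes the legacy boto (v2) library is installed, and it assumes credentials are already configured in the environment or in ~/.boto.

```python
import os

def s3_key_for(local_path, base_dir):
    # S3 has no real directories; the "folder structure" lives in the key name.
    # Build the key from the path relative to the directory being uploaded,
    # using forward slashes regardless of the local OS separator.
    return os.path.relpath(local_path, base_dir).replace(os.sep, "/")

def upload_dir(local_dir, bucket_name):
    import boto  # requires boto to be installed and AWS credentials configured
    s3 = boto.connect_s3()
    bucket = s3.get_bucket(bucket_name)
    for root, dirs, files in os.walk(local_dir):
        for fname in files:
            local_path = os.path.join(root, fname)
            key = bucket.new_key(s3_key_for(local_path, local_dir))
            key.set_contents_from_filename(local_path)
```

For the CLI route mentioned above, something like `aws s3 sync s0 s3://alexandrabucket/s0` would copy the whole directory in one command.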



Answer 2:

The following function can be used to upload a directory to S3. Note that upload_file() is actually the boto3 API (boto's successor), so s3C below is a boto3 S3 client.

    import os
    import boto3

    # boto3 S3 client; credentials come from the environment or ~/.aws
    s3C = boto3.client('s3')

    def uploadDirectory(path, bucketname):
        for root, dirs, files in os.walk(path):
            for file in files:
                # the third argument is the key name; here it is just the bare filename
                s3C.upload_file(os.path.join(root, file), bucketname, file)

Provide the path to the directory and the bucket name as the inputs. The files are placed directly at the top level of the bucket: every key is just the bare filename, so the local directory structure is lost (and files with the same name in different subdirectories will overwrite each other). Alter the last argument of upload_file() to place them under "directories" (key prefixes).
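One way to alter that last argument, sketched below, is to use the path relative to the uploaded directory as the key, so the local tree is mirrored in the bucket. The function name is made up; this assumes boto3 is installed and credentials are configured.

```python
import os

def upload_directory_keep_tree(path, bucketname):
    # variant of uploadDirectory that preserves the subdirectory layout
    import boto3  # requires boto3 and configured AWS credentials
    s3C = boto3.client("s3")
    for root, dirs, files in os.walk(path):
        for fname in files:
            local_path = os.path.join(root, fname)
            # relative path becomes the key, so sub/dir/file.txt keeps its layout;
            # S3 keys always use forward slashes
            key = os.path.relpath(local_path, path).replace(os.sep, "/")
            s3C.upload_file(local_path, bucketname, key)
```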