How to upload a file to S3 without creating a temp file

Posted 2020-05-14 16:35

Question:

Is there any feasible way to upload a file that is generated dynamically to Amazon S3 directly, without first creating a local file and then uploading it to the S3 server? I use Python. Thanks

Answer 1:

Here is an example that downloads an image (using the requests library) and uploads it to S3, without writing to a local file:

import boto
from boto.s3.key import Key
import requests

# set up the connection and bucket (your_s3_key, your_s3_key_secret
# and bucket are placeholders for your own credentials and bucket name)
c = boto.connect_s3(your_s3_key, your_s3_key_secret)
b = c.get_bucket(bucket, validate=False)

# download the file into memory
url = "http://en.wikipedia.org/static/images/project-logos/enwiki.png"
r = requests.get(url)
if r.status_code == 200:
    # upload the bytes straight from memory
    k = Key(b)
    k.key = "image1.png"
    k.content_type = r.headers['content-type']
    k.set_contents_from_string(r.content)


Answer 2:

You could use BytesIO from the Python standard library.

from io import BytesIO

bytesIO = BytesIO()
bytesIO.write(b'whee')  # BytesIO expects bytes, not str
bytesIO.seek(0)         # rewind before handing the buffer to boto
s3_file.set_contents_from_file(bytesIO)  # s3_file is an existing boto Key


Answer 3:

The boto library's Key object has several methods you might be interested in:

  • send_file
  • set_contents_from_file
  • set_contents_from_string
  • set_contents_from_stream

For an example of using set_contents_from_string, see the Storing Data section of the boto documentation, pasted here for completeness:

>>> from boto.s3.key import Key
>>> k = Key(bucket)
>>> k.key = 'foobar'
>>> k.set_contents_from_string('This is a test of S3')


Answer 4:

I assume you're using boto. boto's Key.set_contents_from_file() will accept a StringIO object, and any code you have written to write data to a file should be easily adaptable to write to a StringIO object instead. Or, if you generate a string, you can use set_contents_from_string().
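A minimal sketch of that idea, using io.BytesIO (boto ultimately sends bytes), with my_bucket standing in for an existing boto Bucket object:

from io import BytesIO
from boto.s3.key import Key

# build the content in memory instead of in a temp file
buf = BytesIO()
buf.write(b'generated line one\n')
buf.write(b'generated line two\n')
buf.seek(0)  # rewind so boto reads from the start

k = Key(my_bucket)
k.key = 'generated.txt'
k.set_contents_from_file(buf)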



Answer 5:

import boto
import requests
# AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_HOST, AWS_BUCKET_NAME,
# FILE_FORMAT and AWS_EXPIRY are expected to come from your settings module


def upload_to_s3(url, **kwargs):
    '''
    :param url: url of the image to upload (or resize and then upload)
    :return: url of the image stored in the aws s3 bucket
    '''
    r = requests.get(url)
    if r.status_code == 200:
        # credentials stored in settings as AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
        conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, host=AWS_HOST)

        # connect to the bucket and create the key (kwargs must supply
        # folder_name and filename)
        b = conn.get_bucket(AWS_BUCKET_NAME)
        k = b.new_key("{folder_name}/{filename}".format(**kwargs))

        # upload straight from the response body; no local file is written
        k.set_contents_from_string(r.content, replace=True,
                                   headers={'Content-Type': 'application/%s' % FILE_FORMAT},
                                   policy='authenticated-read',
                                   reduced_redundancy=True)

        # TODO Change AWS_EXPIRY
        return k.generate_url(expires_in=AWS_EXPIRY, force_http=True)


Answer 6:

I had a dict object which I wanted to store as a JSON file on S3, without creating a local file. The code below worked for me:

import json
from smart_open import smart_open

with smart_open('s3://access-key:secret-key@bucket-name/file.json', 'wb') as fout:
    fout.write(json.dumps(dict_object).encode('utf8'))


Answer 7:

You can try smart_open (https://pypi.org/project/smart_open/). I used it for exactly this purpose: writing files directly to S3.
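In newer smart_open releases (5.x and up) the entry point is open, and credentials travel through a boto3 client instead of being embedded in the URL. A rough sketch under that assumption (bucket-name is a placeholder):

import json
import boto3
from smart_open import open as s3_open  # aliased to avoid shadowing the builtin

# newer smart_open takes an explicit boto3 client via transport_params
client = boto3.client('s3')
with s3_open('s3://bucket-name/file.json', 'w',
             transport_params={'client': client}) as fout:
    fout.write(json.dumps({'hello': 'world'}))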



Answer 8:

Given that encryption at rest is now a widely expected data standard, note that smart_open does not support it, as far as I know.
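If encryption at rest is a requirement, one alternative is to request it per object with boto3; a minimal sketch (my-bucket and the payload are placeholders) using SSE-S3:

import boto3

s3 = boto3.client('s3')
# ServerSideEncryption='AES256' asks S3 to encrypt the object at rest (SSE-S3)
s3.put_object(Bucket='my-bucket',
              Key='file.txt',
              Body=b'generated in memory',
              ServerSideEncryption='AES256')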



Answer 9:

Update for boto3:

import boto3

aws_session = boto3.Session('my_access_key_id', 'my_secret_access_key')
s3 = aws_session.resource('s3')
# Body accepts bytes or a file-like object, so nothing touches disk
s3.Bucket('my_bucket').put_object(Key='file_name.txt', Body=my_file)
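Here my_file can be raw bytes or a file-like object, so the payload never has to exist on disk; for example (illustrative):

import json
my_file = json.dumps({'generated': True}).encode('utf8')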


Answer 10:

I am having a similar issue and was wondering whether there was a final answer, because with my code below, "starwars.json" keeps getting saved locally; I just want to push each looped .json file to S3 without storing any file locally.

for key, value in star_wars_actors.items():
    response = requests.get('http:starwarsapi/' + value)
    data = response.json()

    with open("starwars.json", "w+") as d:
        json.dump(data, d, ensure_ascii=False, indent=4)

    s3.upload_file('starwars.json', 'test-bucket',
                   '%s/%s' % ('test', str(key) + '.json'))
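Following the earlier boto3 answer, one way to skip the local file entirely is to serialize the dict in memory and call put_object; a sketch reusing the question's own names (s3 is assumed to be a boto3 client, as implied by the upload_file call above):

import json
import boto3
import requests

s3 = boto3.client('s3')

for key, value in star_wars_actors.items():
    response = requests.get('http:starwarsapi/' + value)
    data = response.json()

    # serialize straight to bytes; starwars.json never touches disk
    body = json.dumps(data, ensure_ascii=False, indent=4).encode('utf8')
    s3.put_object(Bucket='test-bucket',
                  Key='%s/%s' % ('test', str(key) + '.json'),
                  Body=body)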