AWS Python Lambda Function - Upload File to S3

Posted 2020-07-09 11:05

Question:

I have an AWS Lambda function written in Python 2.7 in which I want to:

1) Grab an .xls file from an HTTP address.

2) Store it in a temp location.

3) Store the file in an S3 bucket.

My code is as follows:

from __future__ import print_function
import urllib
import datetime 
import boto3
from botocore.client import Config

def lambda_handler(event, context):

    """Make a variable containing the date format based on YYYYYMMDD"""
    cur_dt = datetime.datetime.today().strftime('%Y%m%d')

    """Make a variable containing the url and current date based on the variable
    cur_dt"""
    dls = "http://11.11.111.111/XL/" + cur_dt + ".xlsx"
    urllib.urlretrieve(dls, cur_dt + "test.xls")

    ACCESS_KEY_ID = 'Abcdefg'
    ACCESS_SECRET_KEY = 'hijklmnop+6dKeiAByFluK1R7rngF'
    BUCKET_NAME = 'my-bicket'
    FILE_NAME = cur_dt + "test.xls";

    data = open('/tmp/' + FILE_NAME, 'wb')

    # S3 Connect
    s3 = boto3.resource(
        's3',
        aws_access_key_id=ACCESS_KEY_ID,
        aws_secret_access_key=ACCESS_SECRET_KEY,
        config=Config(signature_version='s3v4')
    )

    # Uploaded File
    s3.Bucket(BUCKET_NAME).put(Key=FILE_NAME, Body=data, ACL='public-read')

However, when I run this function, I receive the following error:

'IOError: [Errno 30] Read-only file system'

I've spent hours trying to address this issue but I'm falling on my face. Any help would be appreciated.

Answer 1:

'IOError: [Errno 30] Read-only file system'

You seem to lack write permissions. If your Lambda has a different policy, try attaching this policy to its execution role:

arn:aws:iam::aws:policy/AWSLambdaFullAccess

It includes full access to S3 as well, in case you can't write to your bucket. If this solves your issue, you can trim the permissions back afterwards.
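If you prefer doing this from code rather than the console, here is a minimal sketch using boto3's IAM API; the role name is a placeholder you must replace with your function's actual execution role:

```python
LAMBDA_FULL_ACCESS_ARN = 'arn:aws:iam::aws:policy/AWSLambdaFullAccess'

def attach_lambda_full_access(role_name):
    """Attach the AWS-managed AWSLambdaFullAccess policy to an execution role."""
    import boto3  # imported here so the constant above is usable without boto3 installed
    iam = boto3.client('iam')
    iam.attach_role_policy(RoleName=role_name,
                           PolicyArn=LAMBDA_FULL_ACCESS_ARN)
```

Calling `attach_lambda_full_access('my-lambda-role')` requires IAM credentials that are allowed to modify the role.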



Answer 2:

I have uploaded an image to an S3 bucket this way. In the Lambda console I created a JSON test event containing the Base64 of the image to upload and the image name. The Lambda test JSON event is as follows:

            { 
                "ImageName": "Your Image Name",
                "img64":"BASE64 of Your Image"
            }

Following is the code to upload an image (or any file) to S3:

import boto3
import base64

def lambda_handler(event, context):
    s3 = boto3.resource('s3')
    bucket = s3.Bucket('YOUR-BUCKET-NAME')
    path_test = '/tmp/output'        # temp path in Lambda; only /tmp is writable
    key = event['ImageName']         # file name to use as the S3 object key
    data = event['img64']            # Base64 of the image
    img = base64.b64decode(data)     # decode the Base64 payload to raw bytes

    # Write the decoded bytes to /tmp, then upload once the file is
    # closed, so the write buffer is fully flushed to disk first.
    with open(path_test, 'wb') as f:
        f.write(img)
    bucket.upload_file(path_test, key)   # upload directly to the bucket root
    # bucket.upload_file(path_test, 'FOLDER-NAME/{}'.format(key))  # or into a folder of your bucket

    print('res---------------->', path_test)
    print('key---------------->', key)

    return {
        'status': 'True',
        'statusCode': 200,
        'body': 'Image Uploaded'
    }
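To produce the `ImageName`/`img64` fields this handler expects, you can Base64-encode a local file. A small sketch (the file path and name are placeholders):

```python
import base64

def make_test_event(path, image_name):
    """Base64-encode a local file into the event shape the handler expects."""
    with open(path, 'rb') as f:
        encoded = base64.b64encode(f.read()).decode('ascii')
    return {'ImageName': image_name, 'img64': encoded}
```

Running `make_test_event('cat.png', 'cat.png')` through `json.dumps` gives you a string you can paste into the Lambda console's test-event editor.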


Answer 3:

Change `data = open('/tmp/' + FILE_NAME, 'wb')`: replace the 'wb' (write mode, which truncates the file) with 'rb' so the file is opened for reading.

Also, I assume your IAM user has full access to S3, right?

Or maybe the problem is in the retrieval from that URL: make the download target start with "/tmp/", i.e. `urllib.urlretrieve(dls, "/tmp/" + cur_dt + "test.xls")`.
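Putting those fixes together, here is a minimal sketch of the corrected handler (the URL and bucket name are the asker's placeholders). It downloads straight into /tmp, the only writable directory in Lambda, and uses `upload_file`, which streams the file from disk, so no `open()` call is needed at all; credentials come from the function's execution role rather than hard-coded keys:

```python
from __future__ import print_function
import datetime
import urllib

def tmp_xls_path(now=None):
    """Build the /tmp file name from a date, formatted YYYYMMDD."""
    now = now or datetime.datetime.today()
    return '/tmp/' + now.strftime('%Y%m%d') + 'test.xls'

def lambda_handler(event, context):
    import boto3  # imported here so the path helper above is testable without boto3

    cur_dt = datetime.datetime.today().strftime('%Y%m%d')
    dls = 'http://11.11.111.111/XL/' + cur_dt + '.xlsx'
    file_path = tmp_xls_path()

    # Download into /tmp -- writing anywhere else raises
    # "IOError: [Errno 30] Read-only file system".
    urllib.urlretrieve(dls, file_path)

    # Upload the downloaded file under the same dated key.
    s3 = boto3.resource('s3')
    s3.Bucket('my-bicket').upload_file(file_path, cur_dt + 'test.xls')
```

(For Python 3 runtimes, `urllib.urlretrieve` becomes `urllib.request.urlretrieve`.)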