Streaming a file from S3 with Express, including header info

Posted 2020-02-17 03:33

Using the aws-sdk module and Express 4.13, it's possible to proxy a file from S3 in a number of ways.

This callback version will return the file body as a buffer, plus other relevant headers like Content-Length:

function (req, res) {
  var s3 = new AWS.S3();

  s3.getObject({Bucket: myBucket, Key: myFile}, function (err, data) {
    if (err) {
      return res.status(500).send("Error!");
    }

    // Headers
    res.set("Content-Length", data.ContentLength)
       .set("Content-Type", data.ContentType);

    res.send(data.Body); // data.Body is a buffer
  });
}

The problem with this version is that you have to buffer the entire file in memory before sending it, which is not great, especially if it's something large like a video.

This version will directly stream the file:

function (req, res) {
  var s3 = new AWS.S3();

  s3.getObject({Bucket: myBucket, Key: myFile})
    .createReadStream()
    .pipe(res);
}

But unlike the first one, it won't do anything about the headers, which a browser might need to properly handle the file.

Is there a way to get the best of both worlds, passing through the correct headers from S3 but sending the file as a stream? It could be done by first making a HEAD request to S3 to get the metadata, but can it be done with one API call?

3 Answers

趁早两清 · 2020-02-17 03:51

For my project, I simply do a headObject to retrieve only the object's metadata (it's really fast and avoids downloading the object). Then I add to the response all the headers I need to propagate before piping:

    var mime = require('mime-types'); // exposes mime.lookup(); the 'mime' package (v1) also works
    var s3 = new AWS.S3();

    var params = {
        Bucket: bucket,
        Key: key
    };

    s3.headObject(params, function (err, data) {
        if (err) {
            // an error occurred
            console.error(err);
            return next();
        }

        var stream = s3.getObject(params).createReadStream();

        // Forward errors: continue to the next middleware
        stream.on('error', function error(err) {
            return next();
        });

        // Add the content type to the response (it's not propagated from the S3 SDK stream)
        res.set('Content-Type', mime.lookup(key));
        res.set('Content-Length', data.ContentLength);
        // LastModified is a Date; convert it to a valid HTTP date string
        res.set('Last-Modified', data.LastModified.toUTCString());
        res.set('ETag', data.ETag);

        stream.on('end', () => {
            console.log('Served by Amazon S3: ' + key);
        });

        // Pipe the S3 object to the response
        stream.pipe(res);
    });
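
For reference, here is a minimal sketch of how a handler like this might be mounted as an Express route. The /files/:key path and the 'my-bucket' name are placeholders, not part of the answer above; note that headObject itself returns ContentType when the object was stored with one, which can substitute for the mime lookup:

    var express = require('express');
    var AWS = require('aws-sdk');

    var app = express();
    var s3 = new AWS.S3();

    // Hypothetical route: the path and bucket name are placeholders
    app.get('/files/:key', function (req, res, next) {
        var params = {Bucket: 'my-bucket', Key: req.params.key};

        s3.headObject(params, function (err, data) {
            if (err) {
                return next(err); // hand off to Express error handling
            }

            // headObject returns ContentType if the object was stored with one
            res.set('Content-Type', data.ContentType);
            res.set('Content-Length', data.ContentLength);

            s3.getObject(params).createReadStream()
                .on('error', next)
                .pipe(res);
        });
    });
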
Summer. ? 凉城 · 2020-02-17 03:55

One approach is to listen for the httpHeaders event and create a stream within it.

s3.getObject(params)
    .on('httpHeaders', function (statusCode, headers) {
        // Copy the relevant headers from the S3 response onto the Express response
        res.set('Content-Length', headers['content-length']);
        res.set('Content-Type', headers['content-type']);
        // Stream the body through without buffering it in memory
        this.response.httpResponse.createUnbufferedStream()
            .pipe(res);
    })
    .send();
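
One caveat: if the request fails (for example, the key doesn't exist), S3 still emits httpHeaders, and its XML error body would be piped to the client as if it were the file. A minimal sketch that guards on the status code, assuming the same params object as above:

s3.getObject(params)
    .on('httpHeaders', function (statusCode, headers) {
        if (statusCode >= 300) {
            // Don't pipe S3's XML error document to the client
            res.status(statusCode).end();
            return;
        }
        res.set('Content-Length', headers['content-length']);
        res.set('Content-Type', headers['content-type']);
        this.response.httpResponse.createUnbufferedStream()
            .pipe(res);
    })
    .send();
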
叼着烟拽天下 · 2020-02-17 03:58

Building on André Werlang's answer, we have done the following to augment AWS Request objects with a forwardToExpress method:

const _ = require('lodash');
const AWS = require('aws-sdk');

AWS.Request.prototype.forwardToExpress = function forwardToExpress(res, next) {
    this
    .on('httpHeaders', function (code, headers) {
        // Only propagate headers on a successful response
        if (code < 300) {
            res.set(_.pick(headers, 'content-type', 'content-length', 'last-modified'));
        }
    })
    .createReadStream()
    .on('error', next)
    .pipe(res);
};

Then, in our route handlers, we can do something like this:

s3.getObject({Bucket: myBucket, Key: myFile}).forwardToExpress(res, next);
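
Put together, a minimal sketch of a route using the augmented request; the route path is a placeholder, and myBucket is assumed to be defined elsewhere, as in the question:

const express = require('express');

const app = express();
const s3 = new AWS.S3();

// Hypothetical route: the path is a placeholder; myBucket as in the question
app.get('/files/:key', (req, res, next) => {
    s3.getObject({Bucket: myBucket, Key: req.params.key})
        .forwardToExpress(res, next);
});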