Firebase cloud function for file upload

Published 2019-01-14 23:20

Question:

I have this Cloud Function that I wrote to upload a file to Google Cloud Storage:

const gcs = require('@google-cloud/storage')({keyFilename:'2fe4e3d2bfdc.json'});

var filePath = file.path + "/" + file.name;

    return bucket.upload(filePath, {
        destination: file.name
    }).catch(reason => {
            console.error(reason);
    });

I used formidable to parse the uploaded file, and when I log the properties of the uploaded file everything looks fine; it is uploaded to a temp dir like '/tmp/upload_2866bbe4fdcc5beb30c06ae6c3f6b1aa/'. But when I try to upload the file to GCS I get this error:

{ Error: EACCES: permission denied, stat '/tmp/upload_2866bbe4fdcc5beb30c06ae6c3f6b1aa/thumb_ttttttt.jpg'
    at Error (native)
  errno: -13,
  code: 'EACCES',
  syscall: 'stat',
  path: '/tmp/upload_2866bbe4fdcc5beb30c06ae6c3f6b1aa/thumb_ttttttt.jpg' }

I am using this html form to upload the file:

<!DOCTYPE html>
<html>
<body>

<form action="https://us-central1-appname.cloudfunctions.net/uploadFile" method="post" enctype="multipart/form-data">
    Select image to upload:
    <input type="file" name="fileToUpload" id="fileToUpload">
    <input type="submit" value="Upload Image" name="submit">
</form>

</body>
</html>

Answer 1:

I got a solution from the Firebase Support Team.

First, the code contains a small misunderstanding of the 'formidable' library. The line:

var filePath = file.path + "/" + file.name;

indicates that we're expecting 'formidable' to upload our file under its original name, storing it inside a temporary directory with a random name. Actually, what 'formidable' does is upload the file to a random path stored in 'file.path', without using the original filename at all. If we want, we can find out what the original filename was by reading 'file.name'.
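To make the distinction concrete, here is a small sketch; the values are illustrative (taken from the error message above), since the actual temp path formidable picks is random:

```javascript
// Sketch of what formidable hands you after parsing (illustrative values):
// 'path' is the complete temporary file path formidable created on disk;
// 'name' is only the original client-side filename, kept as metadata.
const file = {
  path: '/tmp/upload_2866bbe4fdcc5beb30c06ae6c3f6b1aa',
  name: 'thumb_ttttttt.jpg'
};

// Wrong: treats file.path as a directory and appends the original name,
// producing a path that does not exist on disk (hence the failing stat).
const wrongPath = file.path + '/' + file.name;

// Right: the temp path alone is the file on disk.
const rightPath = file.path;

console.log(wrongPath); // '/tmp/upload_2866bbe4fdcc5beb30c06ae6c3f6b1aa/thumb_ttttttt.jpg'
console.log(rightPath); // '/tmp/upload_2866bbe4fdcc5beb30c06ae6c3f6b1aa'
```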

The quick fix for the code is therefore to change:

var filePath = file.path + "/" + file.name;

To be just:

var filePath = file.path;

Second, there's one more issue that can cause this code trouble: the function terminates before the asynchronous work done in 'form.parse(...)' is completed. That means the actual file upload will probably happen after the function has already reported it's done, so it no longer has any CPU or memory reserved; the upload will run very slowly, or may even fail.

The fix for that is to wrap the form.parse(...) in a promise:

exports.uploadFile = functions.https.onRequest((req, res) => {
   var form = new formidable.IncomingForm();
   return new Promise((resolve, reject) => {
     form.parse(req, function(err, fields, files) {
       var file = files.fileToUpload;
       if(!file){
         reject("no file to upload, please choose a file.");
         return;
       }
       console.info("about to upload file as a json: " + file.type);
       var filePath = file.path;
       console.log('File path: ' + filePath);

       var bucket = gcs.bucket('bucket-name');
       return bucket.upload(filePath, {
           destination: file.name
       }).then(() => {
         resolve();  // Whole thing completed successfully.
       }).catch((err) => {
         reject('Failed to upload: ' + JSON.stringify(err));
       });
     });
   }).then(() => {
     res.status(200).send('Yay!');
     return null;
   }).catch(err => {
     console.error('Error while parsing form: ' + err);
     res.status(500).send('Error while parsing form: ' + err);
   });
 });

Lastly, you may want to consider using Cloud Storage for Firebase to upload your file instead of a Cloud Function. Cloud Storage for Firebase lets you upload files directly to it, and would work much better:

- It has access control
- It has resumable uploads/downloads (great for poor connectivity)
- It can accept files of any size without timeout issues
- If a Cloud Function should run in response to a file upload, you can do that and a lot more
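For example, here is a hedged sketch of a direct browser upload using the Firebase JS SDK's v8-style API. It assumes 'firebase' has already been initialized elsewhere with your project config, and reuses the file input from the question's form; the 'uploads/' destination prefix is just an illustrative choice:

```javascript
// Hedged sketch: upload the user's selection straight to Cloud Storage
// for Firebase from the browser, instead of POSTing to a Cloud Function.
// Assumes the Firebase JS SDK is loaded and initialized elsewhere.
function uploadDirect(fileInput) {
  const file = fileInput.files[0];
  if (!file) return null;

  // put() starts a resumable upload; progress and errors are observable.
  const task = firebase.storage().ref('uploads/' + file.name).put(file);
  task.on('state_changed',
    snapshot => console.log('transferred:', snapshot.bytesTransferred),
    err => console.error('upload failed:', err),
    () => console.log('upload complete'));
  return task;
}

// Wire it to the form's input from the question:
// document.getElementById('fileToUpload')
//   .addEventListener('change', e => uploadDirect(e.target));
```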



Answer 2:

I managed this by downloading the file to /tmp instead.

You will need:

const mkdirp = require('mkdirp-promise');

Then, inside onChange, I created tempLocalDir like so:

const LOCAL_TMP_FOLDER = '/tmp/';
const fileDir = (the name of the file); //whatever method you choose to do this
const tempLocalDir = `${LOCAL_TMP_FOLDER}${fileDir}`;

Then I use mkdirp to make the temp directory:

return mkdirp(tempLocalDir).then(() => {
    // Then download the file from the bucket.
    const bucket = gcs.bucket(object.bucket);
    const tempLocalFile = `${tempLocalDir}/${fileName}`; // fileName: the base name of filePath
    return bucket.file(filePath).download({
        destination: tempLocalFile
    }).then(() => {
        console.log('The file has been downloaded to', tempLocalFile);
        // Here I have some code that converts images, then returns the converted image.
        // Then I use:
        return bucket.upload((the converted image), {
            destination: (a file path in your database)
        }).then(() => {
            console.log('JPEG image uploaded to Storage at', filePath);
        }); // You can perform more actions or end the promise here
    });
});
I think my code achieves what you were trying to accomplish. I hope this helps; I can offer more code if necessary.