I want to upload only a certain part of a file to Google Drive. I can do this by splitting the original read stream into multiple write streams and uploading those, but that approach isn't acceptable, since the intermediate files would live on the server (which I want to avoid).
I've tried to overcome this by sending the original read stream in the body of the upload request, but I can't find a way to stop the upload once a certain condition is met. My upload function is as follows:
var upload = (auth, google, fileName, readStream, res, lastChunk) => {
  console.log(`uploading ${fileName}`);
  const drive = google.drive({ version: 'v3', auth });
  var fileMetadata = {
    'name': fileName
  };
  var media = {
    mimeType: 'application/octet-stream',
    body: readStream
  };
  drive.files.create({
    resource: fileMetadata,
    media: media,
    fields: 'id',
    headers: {
      'uploadType': 'multipart'
    }
  }, {
    onUploadProgress: function (e) {
      if (e.bytesRead > 786432) {
        // want to stop uploading
        // readStream.close() <- this doesn't stop the upload
      }
      console.log(`Uploaded ${fileName}:`, e.bytesRead.toString());
    }
  }, function (err, file) {
    if (err) {
      console.error(err);
    } else {
      if (lastChunk)
        res.render('upload-success.hbs');
      console.log('File Id: ', file.id);
    }
  });
};
Is there a way I can stop the upload and keep the already-uploaded part intact on Google Drive? I've tried multiple actions on the stream, including closing and pausing it, but none of them stop the upload. Something I'd like to add: if I've already read X bytes of the stream and uploaded them, a subsequent upload correctly sends the remaining bytes, i.e. the bytes from X onwards.
Based on this blog, you can add a handler that runs each time a chunk of the request body is received. All I/O calls are asynchronous, so the write method does not execute immediately; the blog uses the addCallback method of events.Promise to process the write calls and to be notified when each one completes. It says that to achieve robust functioning, request.pause() and request.resume() are needed to avoid file corruption.