GCS - Heap Error When File of Higher Size Was Uploaded

Published 2019-09-10 22:30

Question:

I want to upload a file of around 400 MB to Google Cloud Storage. The following snippets show the code I am using to get the upload URL (Java servlet) and to post the file (AngularJS).

Java Servlet:

// import com.google.appengine.api.blobstore.*;
BlobstoreService blobstore = BlobstoreServiceFactory.getBlobstoreService();
UploadOptions opts = UploadOptions.Builder
        .withGoogleStorageBucketName(GCS_BUCKET_NAME);
String url = blobstore.createUploadUrl("/upload", opts);

AngularJS:

// Append the File object itself, not just its name
$scope.formData.append('file', $scope.filename);

$http.post(uploadUrl, $scope.formData, {
    withCredentials: true,
    headers: {
        // Leave Content-Type undefined so the browser sets the
        // multipart/form-data boundary itself
        'Content-Type': undefined
    },
    // Prevent Angular from serializing the FormData to JSON
    transformRequest: angular.identity
})
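For reference, the FormData side of this request can be sketched without Angular. This is a minimal, framework-free sketch; the `MAX_UPLOAD_BYTES` limit and the `buildUploadForm` helper are illustrative names of mine, not anything Blobstore requires:

```javascript
// Illustrative client-side guard: reject files we know are too large
// before posting. 400 MB matches the upload size from the question.
const MAX_UPLOAD_BYTES = 400 * 1024 * 1024;

function buildUploadForm(file) {
  if (file.size > MAX_UPLOAD_BYTES) {
    throw new Error(`File too large: ${file.size} bytes`);
  }
  const form = new FormData();
  // The field name ('file') must match whatever the servlet reads
  // back on the Blobstore upload callback.
  form.append('file', file);
  return form;
}
```

Appending the File/Blob object itself (rather than its name) is what makes the browser send a proper multipart/form-data body with its own boundary.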

The code works fine while the file size is around 100-150 MB; for larger files I get the following error in the browser console:

Failed to load resource: the server responded with a status of 500 (Java heap space)

What do I need to do to avoid this error?

Answer 1:

As you said in your response to my question above, this only happens in development, not in production. My guess is that Google's Blobstore code reads the whole request into memory in the development server, but it's hard to say because we don't have access to that code. If you need to test larger uploads in dev, you could try configuring your dev server to launch with more memory. I use -Xmx512M (in the "VM arguments" section of the launch configuration in Eclipse), and you could try even more. Or, if it's not important in dev, you can just test smaller uploads and trust that large ones will work in production. :)

FYI, if you decide to launch the dev server with more memory: in my case, using Eclipse and Maven together, I had to create a copy of the launch configuration that Maven generated automatically and modify that copy, otherwise Maven would overwrite my changes. That also means I have to update my copy by hand from time to time, such as when updating the App Engine SDK (to reflect the new version number).
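If you go the Maven route, the heap flag can also live in the plugin configuration instead of a hand-edited launch file, which avoids the overwrite problem. A rough sketch for the legacy com.google.appengine:appengine-maven-plugin (the exact element names are an assumption on my part; check the docs for your plugin version):

```xml
<plugin>
  <groupId>com.google.appengine</groupId>
  <artifactId>appengine-maven-plugin</artifactId>
  <version>${appengine.version}</version>
  <configuration>
    <!-- Flags passed to the dev server JVM on appengine:devserver -->
    <jvmFlags>
      <jvmFlag>-Xmx1024m</jvmFlag>
    </jvmFlags>
  </configuration>
</plugin>
```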