Increase Azure blob block upload limit from 32 MB

Posted 2019-05-29 09:39

I am trying to upload content to an Azure blob, and the size is over 32 MB. The C# code snippet is below:

CloudBlockBlob blob = _blobContainer.GetBlockBlobReference(blobName);
blob.UploadFromByteArray(contents, 0, contents.Length, AccessCondition.GenerateIfNotExistsCondition(), options:writeOptions);

Every time the blob is over 32 MB, the code above raises an exception:

Exception thrown: 'Microsoft.WindowsAzure.Storage.StorageException' in Microsoft.WindowsAzure.Storage.dll

Additional information: The remote server returned an error: (404) Not Found.

As per this:

When a block blob upload is larger than the value in this property, storage clients break the file into blocks.

Should there be a separate line of code to enable this?

1 Answer

Answered by 太酷不给撩 · 2019-05-29 10:33

Storage clients default to a 32 MB maximum single blob upload. When a block blob upload is larger than the value of the SingleBlobUploadThresholdInBytes property, storage clients break the file into blocks.

As Tamra said, the storage client handles the work of breaking the file into blocks. Here are my tests to give you a better understanding of it.

Code Sample

CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
var writeOptions = new BlobRequestOptions()
{
    SingleBlobUploadThresholdInBytes = 50 * 1024 * 1024, // 50 MB; the maximum is 64 MB, the default is 32 MB
};
blob.UploadFromByteArray(contents, 0, contents.Length, AccessCondition.GenerateIfNotExistsCondition(), options: writeOptions);
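
For reference, here is a minimal self-contained version of the same test. This is only a sketch: it assumes the Microsoft.WindowsAzure.Storage SDK used above, and the connection string, container name, and blob name are placeholders.

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class UploadTest
{
    static void Main()
    {
        // Placeholder connection string and names - replace with your own.
        CloudStorageAccount account = CloudStorageAccount.Parse("<your-connection-string>");
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("mycontainer");
        container.CreateIfNotExists();

        byte[] contents = new byte[90 * 1024 * 1024]; // ~90 MB of test data

        var writeOptions = new BlobRequestOptions()
        {
            // Uploads larger than this threshold are broken into blocks by the client.
            SingleBlobUploadThresholdInBytes = 50 * 1024 * 1024
        };

        CloudBlockBlob blob = container.GetBlockBlobReference("myblob");
        blob.UploadFromByteArray(contents, 0, contents.Length,
            AccessCondition.GenerateIfNotExistsCondition(), options: writeOptions);

        Console.WriteLine("Upload completed.");
    }
}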

Scenario

  1. If you are writing a block blob that is no larger than the value of the SingleBlobUploadThresholdInBytes property, you can upload it in its entirety with a single write operation.

    You can verify this by capturing the network traffic with Fiddler when you invoke the UploadFromByteArray method.

  2. When a block blob upload is larger than the value of the SingleBlobUploadThresholdInBytes property, the storage client breaks the file into blocks automatically.

    I uploaded a blob of nearly 90 MB, and you can see the difference as follows:

    From the capture, you can see that the storage client breaks the file into 4 MB blocks and uploads them simultaneously.
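
If you want to influence how the blocks are uploaded, the block size and the degree of parallelism can also be configured. The following is a sketch under the same assumptions as the sample above (contents, blobName, and container are reused from it), and the values are only illustrative:

CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

// Size of each block when the upload is split into blocks (4 MB is the SDK default).
blob.StreamWriteSizeInBytes = 4 * 1024 * 1024;

var writeOptions = new BlobRequestOptions()
{
    // Uploads larger than this threshold are broken into blocks.
    SingleBlobUploadThresholdInBytes = 32 * 1024 * 1024,
    // Number of blocks uploaded in parallel.
    ParallelOperationThreadCount = 4
};

blob.UploadFromByteArray(contents, 0, contents.Length,
    AccessCondition.GenerateIfNotExistsCondition(), options: writeOptions);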

"Every time the blob is over 32 MB, the above raises an exception"

You could try setting the SingleBlobUploadThresholdInBytes property, or capture the network traffic when you invoke the UploadFromByteArray method to find the detailed error.
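
If Fiddler is not an option, the StorageException itself carries the service response, which should show the underlying status and error code behind the 404. A sketch, reusing the variables from the sample above:

try
{
    blob.UploadFromByteArray(contents, 0, contents.Length,
        AccessCondition.GenerateIfNotExistsCondition(), options: writeOptions);
}
catch (StorageException ex)
{
    // RequestInformation holds the HTTP status and the storage error returned by the service.
    Console.WriteLine(ex.RequestInformation.HttpStatusCode);
    Console.WriteLine(ex.RequestInformation.HttpStatusMessage);
    Console.WriteLine(ex.RequestInformation.ExtendedErrorInformation?.ErrorCode);
    Console.WriteLine(ex.RequestInformation.ExtendedErrorInformation?.ErrorMessage);
}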
