How do I upload a file into Azure blob storage

Posted on 2019-03-09 07:19

Question:

I created an Azure Storage account. I have a 400 MB .zip file that I want to put into blob storage for later use.

How can I do that without writing code? Is there some interface for that?

Answer 1:

Free tools:

  1. Visual Studio 2010 -- install Azure tools and you can find the blobs in the Server Explorer
  2. Cloud Berry Lab's CloudBerry Explorer for Azure Blob Storage
  3. ClumsyLeaf CloudXplorer
  4. Azure Storage Explorer from CodePlex (try version 4 beta)

There was an old program called Azure Blob Explorer or something that no longer works with the new Azure SDK.

Out of these, I personally like CloudBerry Explorer the best.



Answer 2:

The easiest way is to use Azure Storage PowerShell. It provides many commands to manage your storage containers/blobs/tables/queues.

For your case, you can use Set-AzureStorageBlobContent, which uploads a local file into Azure Storage as a block blob or page blob. The cmdlet needs a storage context, which you can create from your account name and key:

$context = New-AzureStorageContext -StorageAccountName accountName -StorageAccountKey accountKey
Set-AzureStorageBlobContent -Container containerName -File .\filename -Blob blobname -Context $context

For details, please refer to http://msdn.microsoft.com/en-us/library/dn408487.aspx.



Answer 3:

If you're looking for a tool to do so, may I suggest that you take a look at our tool Cloud Storage Studio (http://www.cerebrata.com/Products/CloudStorageStudio). It's a commercial tool for managing Windows Azure Storage and Hosted Services. You can also find a comprehensive list of Windows Azure Storage management tools here: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx

Hope this helps.



Answer 4:

The StorageClient library has this built in; you hardly need to write anything:

var account = new CloudStorageAccount(creds, false);
var client = account.CreateCloudBlobClient();
var blob = client.GetBlobReference("/somecontainer/hugefile.zip");

// 1 MB seems to be a pretty good all-purpose block size
client.WriteBlockSizeInBytes = 1024 * 1024;

// number of parallel block uploads; normally set to one per CPU core
client.ParallelOperationThreadCount = 4;

// blobs larger than this are automatically split into blocks
client.SingleBlobUploadThresholdInBytes = 4096;

blob.UploadFile("somehugefile.zip");


Answer 5:

There is a new open-source tool provided by Microsoft:

  • Project Deco - a cross-platform Microsoft Azure Storage account explorer.

Check these links:

  • Download binaries: http://storageexplorer.com/
  • Source Code: https://github.com/Azure/deco


Answer 6:

I use Cyberduck to manage my blob storage.

It is free and very easy to use. It works with other cloud storage solutions as well.

I recently found this one as well: CloudXplorer

Hope it helps.



Answer 7:

You can use Cloud Combine for reliable and quick file upload to Azure blob storage.



Answer 8:

A simple batch file using Microsoft's AzCopy utility will do the trick. You can drag and drop your files onto the following batch file to upload them into your blob storage container:

upload.bat

@ECHO OFF

SET BLOB_URL=https://<<<account name>>>.blob.core.windows.net/<<<container name>>>
SET BLOB_KEY=<<<your access key>>>

:AGAIN
IF "%~1" == "" GOTO DONE

AzCopy /Source:"%~d1%~p1" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /Pattern:"%~n1%~x1" /destType:blob

SHIFT
GOTO AGAIN

:DONE
PAUSE

Note that the above technique only uploads one or more files individually (since the Pattern flag is specified) instead of uploading an entire directory.



Answer 9:

You can upload large files directly to Azure Blob Storage using the HTTP PUT verb; the biggest file I have tried with the code below is 4.6 GB. You can do this in C# like this:

// Write up to CHUNK_SIZE of data to the web request.
// Note: File, DataSent, CHUNK_SIZE, AbortRequested and OnProgressChanged
// are members of the surrounding upload class, not shown here.
void WriteToStreamCallback(IAsyncResult asynchronousResult)
{
    var webRequest = (HttpWebRequest)asynchronousResult.AsyncState;
    var requestStream = webRequest.EndGetRequestStream(asynchronousResult);
    var buffer = new Byte[4096];
    int bytesRead;
    var tempTotal = 0;

    File.FileStream.Position = DataSent;

    while ((bytesRead = File.FileStream.Read(buffer, 0, buffer.Length)) != 0
        && tempTotal + bytesRead < CHUNK_SIZE 
        && !File.IsDeleted 
        && File.State != Constants.FileStates.Error)
    {
        requestStream.Write(buffer, 0, bytesRead);
        requestStream.Flush();

        DataSent += bytesRead;
        tempTotal += bytesRead;

        File.UiDispatcher.BeginInvoke(OnProgressChanged);
    }

    requestStream.Close();

    if (!AbortRequested) webRequest.BeginGetResponse(ReadHttpResponseCallback, webRequest);
}

void StartUpload()
{
    var uriBuilder = new UriBuilder(UploadUrl);

    if (UseBlocks)
    {
        // encode the block name and add it to the query string
        CurrentBlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid.NewGuid().ToString()));
        uriBuilder.Query = uriBuilder.Query.TrimStart('?') + string.Format("&comp=block&blockid={0}", CurrentBlockId);
    }

    // with or without using blocks, we'll make a PUT request with the data
    var webRequest = (HttpWebRequest)WebRequestCreator.ClientHttp.Create(uriBuilder.Uri);
    webRequest.Method = "PUT";
    webRequest.BeginGetRequestStream(WriteToStreamCallback, webRequest);
}

The UploadUrl is generated by Azure itself and contains a Shared Access Signature. This SAS URL says where the blob is to be uploaded and for how long access (write access, in your case) is granted. You can generate a SAS URL like this:

readonly CloudBlobClient BlobClient;
readonly CloudBlobContainer BlobContainer;

public UploadService()
{
    // Setup the connection to Windows Azure Storage
    var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    BlobClient = storageAccount.CreateCloudBlobClient();

    // Get the container and create it if it does not already exist
    BlobContainer = BlobClient.GetContainerReference("publicfiles");
    BlobContainer.CreateIfNotExist();
}

string JsonSerializeData(string url)
{
    var serializer = new DataContractJsonSerializer(url.GetType());
    var memoryStream = new MemoryStream();

    serializer.WriteObject(memoryStream, url);

    return Encoding.Default.GetString(memoryStream.ToArray());
}

public string GetUploadUrl()
{
    var sasWithIdentifier = BlobContainer.GetSharedAccessSignature(new SharedAccessPolicy
    {
        Permissions = SharedAccessPermissions.Write,
        SharedAccessExpiryTime =
            DateTime.UtcNow.AddMinutes(60)
    });
    return JsonSerializeData(BlobContainer.Uri.AbsoluteUri + "/" + Guid.NewGuid() + sasWithIdentifier);
}
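
Note that uploading blocks alone is not enough: the blob only comes into existence once you commit the block list with a final PUT to the blob URL with comp=blocklist. Here is a minimal sketch of that last step in the same callback style as above (BlockIds is an assumed List<string> collecting each CurrentBlockId in upload order; CommitBlockList and WriteBlockListCallback are illustrative names; ReadHttpResponseCallback is the same handler used above):

void CommitBlockList()
{
    var uriBuilder = new UriBuilder(UploadUrl);
    uriBuilder.Query = uriBuilder.Query.TrimStart('?') + "&comp=blocklist";

    var webRequest = (HttpWebRequest)WebRequestCreator.ClientHttp.Create(uriBuilder.Uri);
    webRequest.Method = "PUT";
    webRequest.BeginGetRequestStream(WriteBlockListCallback, webRequest);
}

void WriteBlockListCallback(IAsyncResult asynchronousResult)
{
    var webRequest = (HttpWebRequest)asynchronousResult.AsyncState;

    // The request body is a small XML document naming the blocks in order.
    var sb = new StringBuilder("<?xml version=\"1.0\" encoding=\"utf-8\"?><BlockList>");
    foreach (var id in BlockIds)
        sb.AppendFormat("<Latest>{0}</Latest>", id);
    sb.Append("</BlockList>");

    var body = Encoding.UTF8.GetBytes(sb.ToString());
    using (var requestStream = webRequest.EndGetRequestStream(asynchronousResult))
        requestStream.Write(body, 0, body.Length);

    webRequest.BeginGetResponse(ReadHttpResponseCallback, webRequest);
}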

I also have a thread on the subject where you can find more information: How to upload huge files to the Azure blob from a web page.



Answer 10:

You can upload files to an Azure Storage account blob container using the command prompt.

Install the Microsoft Azure Storage tools.

Then upload the file to your account's blob container with this CLI command:

AzCopy /Source:"filepath" /Dest:bloburl /DestKey:accesskey /destType:blob

Hope it helps. :)



Answer 11:

I've used all the tools mentioned in this post, and all work moderately well with block blobs. My favorite, however, is BlobTransferUtility.

By default BlobTransferUtility only does block blobs. However, changing just two lines of code lets you upload page blobs as well. If, like me, you need to upload a virtual machine image, it needs to be a page blob.

(for the difference please see this MSDN article.)

To upload page blobs just change lines 53 and 62 of BlobTransferHelper.cs from

new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob

to

new Microsoft.WindowsAzure.Storage.Blob.CloudPageBlob

The only other thing to know about this app is to uncheck HELP when you first run the program to see the actual UI.
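
If you would rather upload a page blob from your own code than patch the tool, here is a minimal sketch using the same Microsoft.WindowsAzure.Storage library (the connection string, container name, and file path are illustrative; a fixed-size VHD already has the 512-byte-multiple length that page blobs require):

// Upload a local VHD as a page blob.
var account = CloudStorageAccount.Parse(connectionString);
var container = account.CreateCloudBlobClient().GetContainerReference("vhds");
container.CreateIfNotExists();

var pageBlob = container.GetPageBlobReference("image.vhd");
using (var fs = File.OpenRead(@"C:\temp\image.vhd"))
{
    // Allocate the page blob; its length must be a multiple of 512 bytes.
    pageBlob.Create(fs.Length);

    var buffer = new byte[4 * 1024 * 1024]; // WritePages accepts at most 4 MB per call
    long offset = 0;
    int read;
    while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        using (var ms = new MemoryStream(buffer, 0, read))
            pageBlob.WritePages(ms, offset);
        offset += read;
    }
}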



Answer 12:

Check out this post Uploading to Azure Storage where it is explained how to easily upload any file via PowerShell to Azure Blob Storage.



Answer 13:

You can use the AzCopy tool to upload files to Azure storage; the default blob type on upload is block blob. You can change the /Pattern option according to your requirements.

Syntax

AzCopy /Source:<source folder> /Dest:<destination container URL> /DestKey:<access key> /S


Answer 14:

The new Azure Portal has an 'Editor' menu option (in preview) in the container view, which allows you to upload a file directly to the container from the portal UI.



Answer 15:

Try the Blob Service API

http://msdn.microsoft.com/en-us/library/dd135733.aspx

However, 400 MB is a large file, and I am not sure a single API call can deal with something of this size (a single Put Blob call was historically limited to 64 MB); you may need to split it into blocks and reconstruct it using custom code.
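
For reference, here is a minimal sketch of that split-and-reconstruct approach, using the Put Block / Put Block List operations through the Microsoft.WindowsAzure.Storage client library rather than raw REST calls (the connection string, container, and file names are illustrative):

// Upload a large file as individually staged blocks, then commit them.
var account = CloudStorageAccount.Parse(connectionString);
var container = account.CreateCloudBlobClient().GetContainerReference("somecontainer");
var blob = container.GetBlockBlobReference("hugefile.zip");

var blockIds = new List<string>();
var buffer = new byte[4 * 1024 * 1024]; // 4 MB per block

using (var fs = File.OpenRead(@"hugefile.zip"))
{
    int read, index = 0;
    while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Block ids must be Base64 strings of equal length within one blob.
        var blockId = Convert.ToBase64String(BitConverter.GetBytes(index++));
        using (var ms = new MemoryStream(buffer, 0, read))
            blob.PutBlock(blockId, ms, null);
        blockIds.Add(blockId);
    }
}

blob.PutBlockList(blockIds); // the blob only becomes visible after this commit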