I am trying to serialize a DataTable to XML and then upload it to Azure blob storage.
The code below works, but it seems clunky and memory-hungry. Is there a better way to do this? I'm especially referring to the fact that I am dumping a memory stream to a byte array and then creating a new memory stream from it just to upload.
var container = blobClient.GetContainerReference("container");
var blockBlob = container.GetBlockBlobReference("blob");
byte[] blobBytes;
using (var writeStream = new MemoryStream())
{
using (var writer = new StreamWriter(writeStream))
{
table.WriteXml(writer, XmlWriteMode.WriteSchema);
}
blobBytes = writeStream.ToArray();
}
using (var readStream = new MemoryStream(blobBytes))
{
blockBlob.UploadFromStream(readStream);
}
New answer:
I've learned of a better approach, which is to open a write stream directly to the blob. For example:
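A minimal sketch, assuming the classic Microsoft.WindowsAzure.Storage SDK (the same one the question uses), where CloudBlockBlob.OpenWrite() returns a writable stream on the blob, and table and blockBlob are the variables from the question:

```csharp
// Open a write stream directly on the blob; data is uploaded as it is written,
// so the XML never has to sit in a single in-memory buffer.
using (var writeStream = blockBlob.OpenWrite())
using (var writer = new StreamWriter(writeStream))
{
    table.WriteXml(writer, XmlWriteMode.WriteSchema);
}
```

Disposing the StreamWriter flushes and closes the underlying blob stream, which commits the upload.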
This does not require the entire table to be buffered in memory, and should involve less copying of data.
Original answer:
You can use the CloudBlockBlob.UploadFromByteArray method to upload the byte array directly, instead of creating a second stream.
See https://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storage.blob.cloudblockblob.uploadfrombytearray.aspx for the method syntax.
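With that method, the second MemoryStream in the question disappears entirely. A sketch, assuming the blockBlob and blobBytes variables from the question:

```csharp
// Upload the serialized bytes directly; no second MemoryStream is needed.
blockBlob.UploadFromByteArray(blobBytes, 0, blobBytes.Length);
```

Note that this still buffers the whole table in memory as a byte array, so the OpenWrite() approach above it is preferable for large tables.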