I have a web application that runs unit tests for our data, and I want to deploy it as an Azure Web Site.
The problem is that this app downloads quite large zip files (~50 MB, about 500 files inside), extracts them, and runs some tests over these files.
Where should I save these large files on Azure Web Sites, and where should I extract them? On localhost I've been using "Path.GetTempPath()", but the Azure Web Site reports that there is no space in this folder, even though my Azure site has 1000 MB total and about 990 MB free.
Is there any way to use these 1000 MB for my file operations?
If that is not possible, should I use Azure Blob Storage for the extracted files?
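For context, a minimal sketch of the current localhost approach (file names are hypothetical; on .NET Framework, ZipFile requires a reference to System.IO.Compression.FileSystem):

```csharp
using System;
using System.IO;
using System.IO.Compression;

static class LocalTempExtract
{
    // Extract a downloaded zip into a per-zip folder under the system temp path.
    // This is the part that fails on Azure Web Sites: Path.GetTempPath() points
    // at a restricted location, not at the site's 1000 MB quota.
    public static string ExtractToTemp(string zipPath)
    {
        string dest = Path.Combine(Path.GetTempPath(),
                                   Path.GetFileNameWithoutExtension(zipPath));
        if (Directory.Exists(dest))
            Directory.Delete(dest, true); // clean up leftovers from a previous run
        ZipFile.ExtractToDirectory(zipPath, dest);
        return dest;
    }
}
```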
In the case of Web Sites, when your storage requirements fit within the constraints of the provided local storage, you can certainly use local storage.
However, "Path.GetTempPath()" is not your best choice for an Azure Web Site. I would say you should put all the files in a folder that is part of your web app's root folder, i.e. Server.MapPath("~/tmp/"). Make sure to first check for the folder's existence, etc. There you can utilize all the storage you have.

As for Blob Storage: you would have to unzip each file separately and upload each one separately to a blob, and when you have to work with the files, you would have to download them again. I don't believe this is a real solution as long as you have enough local storage you can utilize.
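A minimal sketch of this approach. Server.MapPath needs an ASP.NET request context, so this sketch falls back to AppDomain.CurrentDomain.BaseDirectory so it runs anywhere; the folder name "tmp" is just an example:

```csharp
using System;
using System.IO;
using System.IO.Compression;

static class AppStorage
{
    // Resolve a working folder under the application's root.
    // In an ASP.NET page/controller you would use Server.MapPath("~/tmp/");
    // outside a request context, fall back to the app's base directory.
    public static string GetWorkFolder(string name = "tmp")
    {
        string root = AppDomain.CurrentDomain.BaseDirectory;
        string folder = Path.Combine(root, name);
        Directory.CreateDirectory(folder); // no-op if the folder already exists
        return folder;
    }

    // Extract a downloaded zip into a subfolder of the working folder,
    // so the files count against the site's quota rather than the temp folder.
    public static string ExtractZip(string zipPath)
    {
        string dest = Path.Combine(GetWorkFolder(),
                                   Path.GetFileNameWithoutExtension(zipPath));
        if (Directory.Exists(dest))
            Directory.Delete(dest, true); // clean up leftovers from a previous run
        ZipFile.ExtractToDirectory(zipPath, dest);
        return dest;
    }
}
```

Because the folder lives under the site root, all reads and writes draw on the site's 1000 MB allocation instead of the restricted temp location.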