I have the following script, which works fine locally (Windows 10 IIS and Windows Server 2003), but not on our hosting server (Windows Server 2003). Anything over 4 MB downloads really slowly and then times out before it gets to the end of the file. Locally, however, it downloads fast and in full.
A direct download (a link to the file itself) pulls a 26.5 MB file in 5 seconds from our hosting provider's server, so there is no issue with a download limit. The issue seems to be between the hosting server and this script. Any ideas?
Response.AddHeader "Content-Disposition", "attachment; filename=" & strfileName
Response.ContentType = "application/x-zip-compressed" 'your content type here
Dim strFilePath, lSize, lBlocks, objStream
'Const CHUNK = 2048
' Thanks to Lankymart. I have set this and it downloads at 1.5 MB a second, so that is running pretty well for what I need.
Const CHUNK = 2048000
Set objStream = CreateObject("ADODB.Stream")
objStream.Open
objStream.Type = 1 'adTypeBinary
objStream.LoadFromFile Server.MapPath("up/" & strfileName)
lSize = objStream.Size
Response.AddHeader "Content-Length", lSize
lBlocks = 1
Response.Buffer = False
Do Until objStream.EOS Or Not Response.IsClientConnected
    Response.BinaryWrite objStream.Read(CHUNK)
Loop
objStream.Close
Set objStream = Nothing
Just looking at the code snippet, it appears to be fine and is the very approach I would use for downloading large files (I especially like the use of Response.IsClientConnected).
However, having said that, the problem is likely the size of the chunks being read in relation to the size of the file.
Very roughly the formula is something like this...
time to read = ((file size / chunk size) * read time)
So if we use your example of a 4 MB file (4,194,304 bytes) and say it takes 100 milliseconds to read each chunk, then the following applies:
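Plugging the numbers in, with the original 2 KB chunk size:

```
chunks       = 4194304 bytes / 2048 bytes = 2048 reads
time to read = 2048 reads * 100 ms        = 204800 ms, roughly 3.4 minutes
```

That is well past a 90-second limit before the file is even fully read.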
Classic ASP pages on IIS 7 and above have a default scriptTimeout of 00:01:30, so in the example above a 4 MB file read continuously at 100 milliseconds per 2 KB chunk would hit the timeout before the script could finish.
Now, these are just rough numbers; your read time won't stay constant and is likely faster than 100 milliseconds (depending on disk read speed), but I think you get the point.
So just try increasing the CHUNK.
Const CHUNK = 20480 'Read in chunks of 20 KB
The code I have is a bit different, using a For...Next loop instead of a Do...Until loop. I'm not 100% sure this will really work in your case, but it's worth a try. Here is my version of the code:
' iSz is the stream size; write whole chunks first, then the remainder
For i = 1 To iSz \ chunkSize
    If Not Response.IsClientConnected Then Exit For
    Response.BinaryWrite objStream.Read(chunkSize)
Next

If iSz Mod chunkSize > 0 Then
    If Response.IsClientConnected Then
        Response.BinaryWrite objStream.Read(iSz Mod chunkSize)
    End If
End If
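For completeness, here is roughly how that loop fits into a full handler. This is a sketch, not tested against your setup: the `up/` path and `strfileName` come from your question, and the 256 KB chunk size is just an example value.

```
<%
Dim objStream, iSz, i
Const chunkSize = 262144 ' 256 KB per read; tune to taste

Response.ContentType = "application/x-zip-compressed"
Response.AddHeader "Content-Disposition", "attachment; filename=" & strfileName
Response.Buffer = False

Set objStream = Server.CreateObject("ADODB.Stream")
objStream.Open
objStream.Type = 1 'adTypeBinary
objStream.LoadFromFile Server.MapPath("up/" & strfileName)

iSz = objStream.Size
Response.AddHeader "Content-Length", iSz

' Write whole chunks, bailing out if the client disconnects
For i = 1 To iSz \ chunkSize
    If Not Response.IsClientConnected Then Exit For
    Response.BinaryWrite objStream.Read(chunkSize)
Next

' Write whatever is left over after the last whole chunk
If iSz Mod chunkSize > 0 And Response.IsClientConnected Then
    Response.BinaryWrite objStream.Read(iSz Mod chunkSize)
End If

objStream.Close
Set objStream = Nothing
%>
```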
Basically it's due to the script timeout. I had the same problem with 1 GB files on IIS 10 after upgrading to Windows Server 2016 (the default timeout is shorter there by default).
I use chunks of 256000 and Server.ScriptTimeout = 600 (10 minutes).
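In other words, at the top of the download page, something like this (the values are the ones from this answer; adjust to your file sizes):

```
<%
Server.ScriptTimeout = 600 ' seconds (10 minutes); overrides the IIS default for this page only
Const CHUNK = 256000       ' ~256 KB per read
%>
```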