I have a web service that takes a byte[] and saves it.
This works fine for "small" files, but once I hit a certain size the web service fails and returns "The request failed with HTTP status 404: Not Found."
From what I've seen this appears to be an IIS setting that limits the size of a file that can be posted (to prevent denial-of-service attacks). I've tried to increase that setting, but I'm having trouble determining which setting it is and where/how to set it. I am using IIS7 and the web service is written in .NET (asmx).
In the web.config of the web service I have added the following (which seemed to increase the size of file that can be accepted, but not all the way up to what this setting should allow):
<system.web>
<httpRuntime executionTimeout="999999" maxRequestLength="2097151" />
...
</system.web>
Any suggestions on where (and how) to increase the size of file that the web service will accept would be greatly appreciated.
Just to add information for people googling this: the web.config in question is located at
C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\ISAPI
This solved our problem after troubleshooting the issue for quite some time.
maxRequestLength is in KB, not bytes. The settings sketched below should give you a 30 MB limit within a 4-minute timeout window.
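(A sketch only; the values are derived from the 30 MB and 4-minute figures above: 30 MB = 30720 KB and 4 minutes = 240 seconds.)

<system.web>
  <httpRuntime executionTimeout="240" maxRequestLength="30720" />
</system.web>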
Having numbers that are too high may actually prevent your values from being applied. I think I ran into this a few years ago when I thought it was a byte limit (vague memory).
If you're set on using web services to move files around, I would at least consider using WS-Attachments / DIME attachments. The primary problem with sending a byte[] over a web service is that it gets put in the SOAP body, which is encoded as a base64 string. Encoding files like this grows them by roughly a third on the wire (i.e. a 6 MB file becomes roughly an 8 MB payload).
It's likely that your 25 MB upload is turning into a huge SOAP envelope.
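A quick way to see that overhead for yourself (plain .NET, purely for illustration; the 6 MB buffer size is arbitrary):

using System;

class Base64Overhead
{
    static void Main()
    {
        byte[] data = new byte[6 * 1024 * 1024];        // a 6 MB payload
        string encoded = Convert.ToBase64String(data);  // how a byte[] travels in the SOAP body
        Console.WriteLine("raw:    {0:N0} bytes", data.Length);
        Console.WriteLine("base64: {0:N0} characters", encoded.Length); // about 8 MB of text
    }
}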
I'd strongly suggest reading this, which might get you into DIME.
Hope that helps!
this worked for me:
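(The snippet itself didn't survive above, so this is a guess at the setting involved: on IIS7, the usual companion to maxRequestLength is the request-filtering limit, which produces a 404.13 when the request body is too large. The sketch below assumes a 50 MB limit, not the poster's actual values; maxAllowedContentLength is in bytes.)

<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="52428800" />
    </requestFiltering>
  </security>
</system.webServer>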
If I were stuck having to use web services and needed to support very large files, I would look at implementing a system that lets you upload files in pieces, e.g. along the lines of the sketch below.
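(A minimal sketch, assuming an asmx-style service; the AppendChunk name, the upload directory, and the client-chosen offsets are illustrative, not from the original answer.)

using System.IO;
using System.Web.Services;

public class UploadService : WebService
{
    // Writes one chunk of the file at the given offset; the client calls this
    // repeatedly (e.g. in 1 MB blocks) until the whole file has been sent,
    // so no single request ever exceeds the configured size limits.
    [WebMethod]
    public void AppendChunk(string fileName, byte[] buffer, long offset)
    {
        // Path.GetFileName strips any directory components the caller sends.
        string path = Path.Combine(@"C:\Uploads", Path.GetFileName(fileName));
        using (FileStream fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
        {
            fs.Seek(offset, SeekOrigin.Begin);
            fs.Write(buffer, 0, buffer.Length);
        }
    }
}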
This would allow you to chunk up the big uploads, and not hold too much data in memory. The disadvantage is that you are still using a fairly inefficient transport mechanism.
This doesn't specifically answer your question, but what I've done in the past is use WCF to transfer file names/paths/listings, and then use an FTP library to transfer the files themselves.
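(Just as an example of the FTP side, using the built-in FtpWebRequest; the server address, credentials, and file names here are placeholder assumptions.)

using System;
using System.IO;
using System.Net;

class FtpUpload
{
    static void Main()
    {
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://example.com/uploads/bigfile.bin");
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("user", "password");

        // Stream the local file to the server in small buffers instead of
        // loading the whole thing into memory.
        byte[] buffer = new byte[64 * 1024];
        using (FileStream source = File.OpenRead(@"C:\local\bigfile.bin"))
        using (Stream target = request.GetRequestStream())
        {
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                target.Write(buffer, 0, read);
            }
        }

        using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
        {
            Console.WriteLine(response.StatusDescription);
        }
    }
}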