I have a PHP script on a server to send files to recipients: they get a unique link and then they can download large files. Sometimes there is a problem with the transfer and the file is corrupted or never finishes. I am wondering if there is a better way to send large files.
Code:
$f = fopen(DOWNLOAD_DIR . $database[$_REQUEST['fid']]['filePath'], 'r');
while (!feof($f)) {
    print fgets($f, 1024);
}
fclose($f);
I have seen functions such as http_send_file and http_send_data, but I am not sure if they will work.
What is the best way to solve this problem?
Regards
erwing
If you are using lighttpd as a webserver, an alternative for secure downloads would be to use ModSecDownload. It needs server configuration, but the webserver will then handle the download itself instead of the PHP script.
Generating the download URL would look like this (adapted from the documentation), and of course it could be generated only for authorized users:
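A sketch of that generation, following the scheme in the lighttpd mod_secdownload documentation; $secret and $uri_prefix are placeholders and must match the server's secdownload.secret and secdownload.uri-prefix settings:
<?php
// $secret and $uri_prefix must match the lighttpd configuration
// (secdownload.secret and secdownload.uri-prefix); placeholder values here.
$secret     = 'verysecret';
$uri_prefix = '/dl/';

// Path of the file relative to secdownload.document-root, starting with "/".
$file = '/secret-file.txt';

// Hex-encoded current timestamp, as the module expects.
$timestamp = sprintf('%08x', time());

// Token is md5(secret . file . hex timestamp).
$token = md5($secret . $file . $timestamp);

// Resulting URL: <uri-prefix><token>/<hex timestamp><file>
printf('<a href="%s%s/%s%s">download</a>', $uri_prefix, $token, $timestamp, $file);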
Of course, depending on the size of the files, using readfile() such as proposed by Unkwntech is excellent. And using xsendfile as proposed by garrow is another good idea, also supported by Apache.
Chunking files is the fastest / simplest method in PHP, if you can't or don't want to make use of something a bit more professional like cURL, mod-xsendfile on Apache or some dedicated script.
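The code itself is not reproduced in this excerpt; the following is a sketch of the chunked approach being described (the readfile_chunked name and the 1 MB CHUNK_SIZE are illustrative):
<?php
// Read and emit the file in fixed-size chunks instead of loading it
// all at once, keeping memory usage flat regardless of file size.
define('CHUNK_SIZE', 1024 * 1024); // 1 MB per chunk

function readfile_chunked($filename, $retbytes = true) {
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        // Push the chunk out to the client before reading the next one.
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // bytes delivered, mirroring readfile()'s return value
    }
    return $status;
}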
Ported from richnetapps.com / NeedBee. Tested on 200 MB files, on which readfile() died, even with the maximum allowed memory limit set to 1G, that is five times more than the downloaded file size.
BTW: I tested this also on files >2GB, but PHP only managed to write the first 2GB of the file and then broke the connection. File-related functions (fopen, fread, fseek) use INT, so you ultimately hit the 2GB limit. The above-mentioned solutions (i.e. mod-xsendfile) seem to be the only option in this case.
EDIT: Make yourself 100% sure that your file is saved in utf-8. If you omit that, downloaded files will be corrupted. This is because this solution uses print to push chunks of the file to the browser.
I'm not sure this is a good idea for large files. If the thread for your download script runs until the user has finished the download, and you're running something like Apache, just 50 or more concurrent downloads could crash your server, because Apache isn't designed to run large numbers of long-running threads at the same time. Of course I might be wrong, if the Apache thread somehow terminates and the download sits in a buffer somewhere whilst the download progresses.
I had the same problem; it was solved by adding this before starting the session: session_cache_limiter('none');
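For illustration, a minimal sketch of where that call goes (it has to come before session_start(); the rest of the download logic is elided):
<?php
// Disable the session cache limiter so PHP does not send caching
// headers that can interfere with the file download.
session_cache_limiter('none');
session_start();

// ... authenticate the user, resolve the file path, then stream it ...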
We've been using this in a couple of projects and it works quite fine so far:
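The referenced snippet is not included in this excerpt; below is a minimal sketch of the general pattern discussed in this thread (send the download headers, clear output buffers, stream in small chunks). The path resolution mirrors the question's code and is an assumption:
<?php
// Illustrative sketch only; not the original project code.
// Path resolution follows the question's code and is an assumption.
$path = DOWNLOAD_DIR . $database[$_REQUEST['fid']]['filePath'];

if (!is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

// Standard download headers.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));
header('Content-Transfer-Encoding: binary');

// Clear any output buffers so chunks go straight to the client.
while (ob_get_level() > 0) {
    ob_end_clean();
}

// Stream the file in small chunks to keep memory usage flat.
$handle = fopen($path, 'rb');
while (!feof($handle)) {
    echo fread($handle, 8192);
    flush();
}
fclose($handle);
exit;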