This question already has an answer here:
Downloading large files reliably in PHP (13 answers)
Using PHP, I am trying to serve large files (possibly up to 200MB) which aren't in a web-accessible directory due to authorization issues. Currently, I use a readfile() call along with some headers to serve the file, but it seems that PHP is loading it into memory before sending it. I intend to deploy on a shared hosting server, which won't allow me to use much memory or add my own Apache modules such as X-Sendfile.
I can't let my files be in a web-accessible directory for security reasons. Does anybody know a method that is less memory intensive which I could deploy on a shared hosting server?
EDIT:
if (/* My authorization here */) {
    $path = "/uploads/";
    $name = $row[0]; //This is a MySQL reference with the filename
    $fullname = $path . $name; //Create the full path to the file
    $fd = fopen($fullname, "rb");
    if ($fd) {
        $fsize = filesize($fullname);
        $path_parts = pathinfo($fullname);
        $ext = strtolower($path_parts["extension"]);
        switch ($ext) {
            case "pdf":
                header("Content-Type: application/pdf");
                break;
            case "zip":
                header("Content-Type: application/zip");
                break;
            default:
                header("Content-Type: application/octet-stream");
                break;
        }
        header("Content-Disposition: attachment; filename=\"" . $path_parts["basename"] . "\"");
        header("Content-Length: $fsize");
        header("Cache-Control: private"); //use this to open files directly
        while (!feof($fd)) {
            $buffer = fread($fd, 1024 * 1024); //Read and send in 1MB chunks
            echo $buffer;
            ob_flush();
            flush(); //These two flush commands seem to have helped with performance
        }
        fclose($fd); //Only close the handle if it was opened successfully
    } else {
        echo "Error opening file";
    }
}
You could also handle this in the style of the Gordian Knot - that is to say, sidestep the problem entirely. Keep the files in a non-web-accessible directory, and when a download is initiated you can simply symlink() the file into a web-accessible directory under a random, unguessable name, hand the user that URL, and set up a cron job to unlink() any download links older than 10 minutes. Virtually no processing of your data is required, no massaging of HTTP headers, etc. There are even a couple of libraries out there for just this purpose.
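A minimal sketch of that idea, assuming a web-accessible downloads/ folder (with symlink following enabled) and a random token as the link name - both illustrative assumptions, not part of the original answer:

<?php
// Sketch: publish the protected file via a short-lived, unguessable symlink.
// The downloads/ folder and the naming scheme are assumptions for illustration;
// the web server must be allowed to follow symlinks (e.g. Options +FollowSymLinks).
$source = "/uploads/" . $name; //The protected file
$token = bin2hex(random_bytes(16)); //Unguessable link name (PHP 7+)
$ext = pathinfo($source, PATHINFO_EXTENSION);
$link = __DIR__ . "/downloads/" . $token . "." . $ext;
symlink($source, $link);
header("Location: /downloads/" . basename($link)); //The web server streams the bytes
exit;

The cleanup side can then be a cron entry like find /path/to/downloads -type l -mmin +10 -delete, which removes links older than 10 minutes.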
To download large files from the server, I changed the settings below in the php.ini file:
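The exact values aren't shown above; directives along these lines are the ones commonly raised for large transfers (illustrative values, not necessarily the author's):

; Illustrative php.ini settings for large uploads/downloads - assumed values
upload_max_filesize = 200M
post_max_size = 200M
memory_limit = 256M
max_execution_time = 300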
Now I am able to upload and download a 175MB video on the server. Since I have a dedicated server, making these changes was easy.
Below is the PHP script to download the file. I have not made any changes in this code snippet for large file sizes.
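For reference, a minimal version of such a script might look like this ($file and the headers below are a sketch, not the author's exact snippet):

<?php
// Minimal download script sketch - $file is a hypothetical path
$file = "/uploads/video.mp4";
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"" . basename($file) . "\"");
header("Content-Length: " . filesize($file));
readfile($file); //Relies on the raised memory/time limits above
exit;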
If you use fopen() and fread() instead of readfile(), that should solve your problem. There's a solution in the PHP documentation for readfile() showing how to use fread() to do what you want.
If you care about performance, there is X-Sendfile, available as a module for Apache, nginx, and lighttpd. Check the user comments on the readfile() docs. There are also modules for these web servers which accept a URL with an additional hash value, which allows downloading the file for a short time period. This can also be used to solve authorization issues.
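For illustration, with Apache's mod_xsendfile enabled (XSendFile On plus an allowed XSendFilePath), the PHP side shrinks to a few headers; the path here is an assumption:

<?php
// Sketch: delegate the actual streaming to mod_xsendfile.
// Apache swaps the X-Sendfile header for the file's contents;
// nginx offers the same idea via X-Accel-Redirect and an "internal" location.
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"" . $name . "\"");
header("X-Sendfile: /uploads/" . $name); //Absolute path outside the docroot
exit;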