Using PHP, I am trying to serve large files (possibly up to 200MB) that aren't in a web-accessible directory because of authorization issues. Currently I use a readfile() call along with some headers to serve the file, but it seems that PHP loads the whole file into memory before sending it. I intend to deploy on a shared hosting server, which won't let me use much memory or add my own Apache modules such as X-Sendfile.
I can't put my files in a web-accessible directory for security reasons. Does anybody know a less memory-intensive method that I could deploy on a shared hosting server?
EDIT:
if (/* My authorization here */) {
    $path = "/uploads/";
    $name = $row[0];           // Filename fetched from MySQL
    $fullname = $path . $name; // Build the full path
    $fd = fopen($fullname, "rb");
    if ($fd) {
        $fsize = filesize($fullname);
        $path_parts = pathinfo($fullname);
        $ext = strtolower($path_parts["extension"]);
        switch ($ext) {
            case "pdf":
                header("Content-Type: application/pdf");
                break;
            case "zip":
                header("Content-Type: application/zip");
                break;
            default:
                header("Content-Type: application/octet-stream");
                break;
        }
        header("Content-Disposition: attachment; filename=\"" . $path_parts["basename"] . "\"");
        header("Content-Length: $fsize");
        header("Cache-Control: private"); // use this to open files directly
        while (!feof($fd)) {
            $buffer = fread($fd, 1024 * 1024); // read 1MB at a time
            echo $buffer;
            ob_flush();
            flush(); // These two flush calls seem to have helped with performance
        }
        fclose($fd); // close only if fopen() succeeded
    } else {
        echo "Error opening file";
    }
}
If you use fopen() and fread() instead of readfile(), that should solve your problem. There's an example in PHP's readfile() documentation showing how to use fread() to do exactly that.
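For reference, here's a minimal sketch of that approach as a reusable helper (the stream_file() name and the 8KB chunk size are my own illustrative choices, not taken from the documentation):

<?php
// Sketch only: streams a file in fixed-size chunks so PHP never holds
// the whole file in memory at once.
function stream_file($fullname) {
    $fd = fopen($fullname, 'rb');
    if ($fd === false) {
        return false;
    }
    while (!feof($fd)) {
        echo fread($fd, 8192); // send one chunk
        if (ob_get_level()) {
            ob_flush();        // flush PHP's output buffer, if any
        }
        flush();               // then flush the web server's buffer
    }
    fclose($fd);
    return true;
}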
To download large files from the server, I changed the settings below in the php.ini file:

upload_max_filesize = 1500M
max_input_time = 1000
memory_limit = 640M
max_execution_time = 1800
post_max_size = 2000M

Now I am able to upload and download a 175MB video on the server. Since I have a dedicated server, making these changes was easy.

Below is the PHP script to download the file. I have not made any changes to this code snippet for large file sizes.
// Begin writing headers
ob_clean(); // Clear any previously written output from the buffer
if ($filetype == 'application/zip') {
    // zlib compression would make Content-Length wrong, so switch it off
    if (ini_get('zlib.output_compression')) {
        ini_set('zlib.output_compression', 'Off');
    }
    $fp = @fopen($filepath, 'rb');
    if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE")) {
        // IE wants slightly different caching headers for downloads
        header("Content-Type: $content_type");
        header('Content-Disposition: attachment; filename="' . $filename . '"');
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Content-Transfer-Encoding: binary');
        header('Pragma: public');
        header('Content-Length: ' . filesize(trim($filepath)));
    } else {
        header("Content-Type: $content_type");
        header('Content-Disposition: attachment; filename="' . $filename . '"');
        header('Content-Transfer-Encoding: binary');
        header('Expires: 0');
        header('Pragma: no-cache');
        header('Content-Length: ' . filesize(trim($filepath)));
    }
    if ($fp) {
        fpassthru($fp); // stream the file directly to output
        fclose($fp);
    }
}
elseif ($filetype == 'audio' || $filetype == 'video') {
    global $mosConfig_absolute_path, $my;
    ob_clean();
    header('Pragma: public');
    header('Expires: 0');
    header('Cache-Control: no-store, no-cache, must-revalidate');
    header('Cache-Control: pre-check=0, post-check=0, max-age=0');
    header('Cache-Control: public');
    header('Content-Description: File Transfer');
    header('Content-Type: application/force-download'); // replaced by the real type on the next line
    header("Content-Type: $content_type");
    header('Content-Length: ' . filesize(trim($filepath)));
    header("Content-Disposition: attachment; filename=\"$filename\"");
    // Force the download
    header('Content-Transfer-Encoding: binary');
    @readfile($filepath);
}
else { // for all other types of files except zip and audio/video
    ob_clean();
    header('Pragma: public');
    header('Expires: 0');
    header('Cache-Control: no-store, no-cache, must-revalidate');
    header('Cache-Control: pre-check=0, post-check=0, max-age=0');
    header('Cache-Control: public');
    header('Content-Description: File Transfer');
    header("Content-Type: $content_type");
    header('Content-Length: ' . filesize(trim($filepath)));
    header("Content-Disposition: attachment; filename=\"$filename\"");
    // Force the download
    header('Content-Transfer-Encoding: binary');
    @readfile($filepath);
}
exit;
If you care about performance, there is X-Sendfile, available as a module for Apache, nginx, and lighttpd. Check the user comments on the readfile() documentation page.
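From PHP this can look roughly like the sketch below (assuming Apache with mod_xsendfile installed and XSendFilePath permitting the directory; all paths and filenames here are illustrative):

<?php
// Sketch only: PHP emits headers and exits; the X-Sendfile module then
// streams the file itself, so PHP's memory use stays minimal.
// Under nginx the equivalent header is X-Accel-Redirect, pointing at
// a location marked "internal" in the server config.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="bigfile.zip"');
header('X-Sendfile: /not/web/accessible/uploads/bigfile.zip');
exit;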
There are also modules for these web servers which accept a URL with an additional hash value that allows downloading the file for a short time period. This can also be used to solve authorization issues.
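In general form, that signed-URL idea looks like the sketch below; the exact hash recipe and query parameter names each module expects differ (nginx's secure_link is one example), so everything here is illustrative:

<?php
// Sketch only: embed an expiry time plus an HMAC over path + expiry in
// the URL, so the server can verify the link without a database lookup.
// $secret, the parameter names, and the paths are all assumptions.
$secret  = 'change-me';
$expires = time() + 600; // link stays valid for 10 minutes
$file    = '/protected/video.mp4';
$token   = hash_hmac('sha256', $file . $expires, $secret);
echo "https://example.com{$file}?e={$expires}&t={$token}";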
You could also handle this in the style of the Gordian Knot - that is to say, sidestep the problem entirely. Keep the files in a non-accessible directory, and when a download is initiated you can simply
$tempstring = rand();
symlink('/filestore/filename.extension', '/www/downloads/' . $tempstring . '-filename.extension');
echo 'Your download is available here: <a href="/downloads/' . $tempstring . '-filename.extension">download</a>';
and set up a cron job to unlink() any download links older than 10 minutes. Virtually no processing of your data is required, no massaging of HTTP headers, etc.
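That cron job could be as simple as the sketch below (the /www/downloads path and the 10-minute window come from this answer; the script itself is an assumption):

<?php
// Sketch only: run from cron every few minutes, e.g.
//   */5 * * * * php /path/to/cleanup.php
foreach (glob('/www/downloads/*') as $link) {
    // lstat() inspects the symlink itself rather than its target
    $st = lstat($link);
    if (is_link($link) && time() - $st['mtime'] > 600) {
        unlink($link); // remove download links older than 10 minutes
    }
}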
There are even a couple libraries out there for just this purpose.