Save large files from PHP stdin

Posted 2019-05-10 14:21

Question:

Please advise me on the most efficient way to save large files from PHP stdin. An iOS developer sends large video content to my server, and I have to store it in files.

I read the stdin stream with the video data and write it to a file, for example like this:

$handle = fopen("php://input", "rb");
$http_raw_post_data = '';
while (!feof($handle)) {
    $http_raw_post_data .= fread($handle, 8192);
}
fclose($handle);

Which function is better to use: file_get_contents, fread, or something else?

Answer 1:

Don't use file_get_contents, because it will attempt to load the entire content of the file into a string.

From the PHP documentation:

file_get_contents() is the preferred way to read the contents of a file into a string. It will use memory mapping techniques if supported by your OS to enhance performance.

I am sure you just want to create the movie file on your server; this is a more efficient way:

$in  = fopen("php://input", "rb");
$out = fopen('largefile.dat', 'wb');

// Copy the request body to disk in 8 KB chunks
while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}

fclose($in);
fclose($out);
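
If you prefer not to write the loop yourself, PHP's built-in stream_copy_to_stream() performs the same chunked copy inside the stream layer. A minimal sketch, assuming the same php://input source and a writable target file:

$in  = fopen("php://input", "rb");
$out = fopen('largefile.dat', 'wb');

// Let PHP copy the stream in chunks internally instead of looping manually
$bytes = stream_copy_to_stream($in, $out);

fclose($in);
fclose($out);

if ($bytes === false) {
    // handle copy failure
}

The behaviour is the same as the loop above; the chunking is just pushed down into PHP's stream layer.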


Answer 2:

I agree with @hek2mgl that treating this as a multipart form upload would make the most sense, but if you can't alter your input interface, you can pass the stream to file_put_contents() instead of looping through the file yourself.

$handle = fopen("php://input", "rb");
if (false === file_put_contents("outputfile.dat", $handle)) {
    // handle error
}
fclose($handle);

It's cleaner than iterating through the file, and it might be faster (I haven't tested).
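
If you can change the client, here is a minimal sketch of the multipart form upload approach mentioned above, assuming the iOS app is changed to send a multipart/form-data POST with a hypothetical field name "video":

// upload.php - expects a multipart/form-data POST with a field named "video" (hypothetical)
if (isset($_FILES['video']) && $_FILES['video']['error'] === UPLOAD_ERR_OK) {
    // PHP has already streamed the upload to a temporary file; just move it into place
    if (!move_uploaded_file($_FILES['video']['tmp_name'], '/path/to/storage/video.dat')) {
        // handle move failure
    }
} else {
    // handle upload error
}

With this approach PHP buffers the upload itself, and upload_max_filesize and post_max_size in php.ini need to be large enough for the videos.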



Answer 3:

If you use nginx as your web server, I recommend the nginx upload module, which supports resumable uploads.



Tags: php post stream