Save large files from PHP stdin

Posted 2019-05-10 13:40

Please advise me on the best way to save large files from PHP stdin. An iOS developer sends large video content to my server, and I have to store it in files.

I read the stdin stream with the video data and write it to a file. For example, like this:

$http_raw_post_data = '';
$handle = fopen("php://input", "rb");
while (!feof($handle)) {
    $http_raw_post_data .= fread($handle, 8192);
}
fclose($handle);

Which function is better to use: file_get_contents, fread, or something else?

Tags: php post stream
3 Answers
Animai°情兽
#2 · 2019-05-10 14:31

If you use nginx as your web server, I recommend the nginx upload module, which supports resumable uploads.
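A minimal sketch of what such a configuration might look like, assuming the third-party nginx-upload-module is compiled in; the location path, backend script, and directories are illustrative placeholders:

```nginx
location /upload {
    # Hand metadata about the stored file off to a backend script
    upload_pass /handle_upload.php;

    # Directory where nginx writes the incoming file
    upload_store /var/tmp/nginx_uploads;

    # Enable resumable uploads so interrupted transfers can continue
    upload_resumable on;
    upload_state_store /var/tmp/nginx_upload_state;
}
```

With this setup, nginx writes the upload to disk itself and PHP only receives the path of the stored file, so large videos never pass through PHP's memory at all.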

萌系小妹纸
#3 · 2019-05-10 14:33

Don't use file_get_contents, because it attempts to load the entire content of the file into a string.

From the PHP documentation:

file_get_contents() is the preferred way to read the contents of a file into a string. It will use memory mapping techniques if supported by your OS to enhance performance.

I'm sure you just want to create the movie file on your server; this is a more efficient way:

$in = fopen("php://input", "rb");
$out = fopen('largefile.dat', 'wb');

while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}

fclose($in);
fclose($out);
放我归山
#4 · 2019-05-10 14:35

I agree with @hek2mgl that treating this as a multipart form upload would make the most sense, but if you can't alter your input interface, you can use file_put_contents() on a stream instead of looping through the file yourself.

$handle = fopen("php://input", "rb");
if (false === file_put_contents("outputfile.dat", $handle)) {
    // handle error
}
fclose($handle);

It's cleaner than iterating through the file, and it might be faster (I haven't tested).
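Another stream-based option, as a sketch (the filename outputfile.dat is just a placeholder): stream_copy_to_stream() copies from one open stream to another in chunks internally, so memory use stays constant regardless of file size, and it returns the number of bytes copied.

```php
<?php
$in  = fopen("php://input", "rb");
$out = fopen("outputfile.dat", "wb");

if ($in === false || $out === false) {
    // handle error (could not open input or output stream)
    exit(1);
}

// Copies the whole input stream to the output file in internal chunks;
// returns the number of bytes copied, or false on failure.
$bytes = stream_copy_to_stream($in, $out);

fclose($in);
fclose($out);
```

Like file_put_contents() with a stream argument, this avoids hand-rolling the read/write loop.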
