Handling delays when retrieving files from remote

Published 2019-01-28 06:40

I am working with PHP to access files and photos from remote servers. I am mostly using the file_get_contents() and copy() functions.

Sometimes accessing a small text file or photo is almost instant, but other times it seems to get "stuck" for a minute on the same exact file. And sometimes it actually causes my script to hang, and even when I stop the script Apache remains locked up for several minutes.

I'm quite willing to accept the fact that internet connections can be flaky. My concern is that I recover gracefully and do not crash Apache; the PHP set_time_limit() function doesn't help, because hitting the limit just raises a fatal error. In addition, the PHP manual notes that time spent on stream operations does not count toward the script's running time, so that limit may never fire on a hung download.

How can I recover from such connection problems and allow my script to continue? And why would this be causing Apache to hang?

Thanks, Brian

1 Answer
够拽才男人
#2 · 2019-01-28 06:51
$options = array(
    'http' => array(
        'user_agent'    => 'Firefox wannabe', // some hosts refuse requests without a browser-like UA
        'max_redirects' => 1,                 // 1 or less means no redirects are followed
        'timeout'       => 10,                // give up on the request after 10 seconds
    ),
);
$context = stream_context_create( $options );
$content = file_get_contents( $url, false, $context );

Take a look at stream_context_create and the HTTP context options. The above code sets a 10-second timeout on the request. Note that, per the manual, a max_redirects value of 1 or less means no redirects are followed; raise it if you need redirects honored.
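The same context can be passed to copy(), which the question also uses. A minimal sketch, where $url and $localPath are placeholder variables:

$ok = copy( $url, $localPath, $context ); // third argument applies the same timeout
if ( $ok === false ) {
    // the transfer failed or timed out; log it and move on rather than hang
    error_log( "failed to copy $url" );
}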

This should keep a single slow request from hanging your script (and, with it, the Apache worker serving it).
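To actually recover and let the script continue, check the return value: file_get_contents() returns false on failure rather than throwing. A rough sketch with a bounded retry loop (the three-attempt limit is an arbitrary choice):

$content = false;
for ( $attempt = 1; $attempt <= 3 && $content === false; $attempt++ ) {
    // @ suppresses the warning emitted on failure; we test the result instead
    $content = @file_get_contents( $url, false, $context );
}
if ( $content === false ) {
    // give up on this file and let the script continue with the next one
    error_log( "giving up on $url after 3 attempts" );
}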

The long delays may be caused by the network, by a firewall on the remote server throttling you for grabbing too many files at once, or by a flaky DNS server or router on the path to the remote host. As a suggestion, cache the downloaded files locally, so on the next refresh they are served from disk instead of fetched across the big wide net.
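A minimal sketch of that caching idea, assuming a writable $cacheDir directory (a placeholder path):

$cacheFile = $cacheDir . '/' . md5( $url );          // one cache entry per URL
if ( file_exists( $cacheFile ) ) {
    $content = file_get_contents( $cacheFile );      // local read, no network delay
} else {
    $content = file_get_contents( $url, false, $context );
    if ( $content !== false ) {
        file_put_contents( $cacheFile, $content );   // save for the next refresh
    }
}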
