Asynchronous PHP?

Posted 2019-08-21 06:44

I have a PHP script that does a lot of checks and then downloads a file to the server and outputs the original.
To download the file I use

system("/usr/local/bin/wget -O ...");

So PHP waits for the download to finish and then outputs the file.
My question is whether it's possible to send a "ping" somewhere (another PHP file?) to start the download and then, without waiting for the result, just continue with the execution.

Tags: php linux system
3 Answers
够拽才男人
#2 · 2019-08-21 07:11

Use the curl_multi_* functions to handle multiple downloads in parallel. PHP lacks any kind of threading support, and process forking is merely a hack.
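For example, a minimal sketch using the curl_multi_* API (the URLs and target paths are illustrative). Note that the script still blocks until all transfers complete; they just run in parallel rather than one after another.

$downloads = [
    'https://example.com/file1.zip' => '/tmp/file1.zip',
    'https://example.com/file2.zip' => '/tmp/file2.zip',
];

$mh = curl_multi_init();
$handles = [];

foreach ($downloads as $url => $path) {
    $fp = fopen($path, 'w');
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);            // stream the body straight into the file
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = [$ch, $fp];
}

// Drive all transfers until every download has finished.
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh);                     // wait for activity instead of busy-looping
    }
} while ($active && $status === CURLM_OK);

foreach ($handles as [$ch, $fp]) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
    fclose($fp);
}
curl_multi_close($mh);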

再贱就再见
#3 · 2019-08-21 07:13

Use pcntl_fork() to fork your current process into a child process. Do the file download in the child process while the parent process continues with its own work.

$pid = pcntl_fork();

if ($pid == -1) {
    // Fork failed
    exit(1);
} elseif ($pid) {
    // Parent process: continue with whatever the script has to do next
} else {
    // Child process: do the system() call for the download here,
    // then exit so the child does not fall through into the parent's code
    exit(0);
}

After doing some work, if you need the parent process to wait until the child process finishes, you can call pcntl_waitpid($pid, $status).
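For example, a minimal sketch building on the code above:

$status = 0;
pcntl_waitpid($pid, $status);                       // blocks until the child exits
if (pcntl_wifexited($status) && pcntl_wexitstatus($status) === 0) {
    // the child, and therefore the download, finished successfully
}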

See the pcntl section of the PHP manual for more documentation on these functions.

孤傲高冷的网名
#4 · 2019-08-21 07:17

A general design pattern to apply here would be the Command pattern: http://en.wikipedia.org/wiki/Command_pattern

You can implement this within your local web environment, or you can install a queue server such as RabbitMQ: http://www.rabbitmq.com/

Which solution fits best depends on your systems, your access to them, and your knowledge; that's a more complex question and too far off-topic here.

In general, the suggested pattern will let you solve your problem without many issues, since it is a well-known and widely used approach for handling tasks that take too long to run directly within the request, as sketched below.
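As an illustration only, here is a minimal Command-pattern sketch (the class names, queue calls, and wget invocation are assumptions, not a drop-in solution): the request handler builds and enqueues a command, and a separate worker process dequeues and executes it, so the request never waits for the download.

interface Command
{
    public function execute();
}

class DownloadFileCommand implements Command
{
    private $url;
    private $targetPath;

    public function __construct($url, $targetPath)
    {
        $this->url = $url;
        $this->targetPath = $targetPath;
    }

    public function execute()
    {
        // The long-running work lives here, e.g. the wget call from the question.
        system('/usr/local/bin/wget -q -O ' . escapeshellarg($this->targetPath)
            . ' ' . escapeshellarg($this->url));
    }
}

// Web request side: build the command and push it onto a queue
// (database table, Redis list, RabbitMQ queue, ...) instead of executing it.
$command = new DownloadFileCommand('https://example.com/big.zip', '/tmp/big.zip');
$payload = serialize($command);
// enqueue($payload);                    // hypothetical queue call

// Worker side (a long-running CLI script or cron job):
// $command = unserialize(dequeue());    // hypothetical dequeue call
// $command->execute();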
