I have a PHP script that runs a lot of checks, then downloads a file to the server and outputs the original file.
To download the file I use
system("/usr/local/bin/wget -O ...");
So PHP waits for the download to finish and only then outputs the file.
My question: is it possible to send a "ping" somewhere (another PHP file?) to start the download and then, without waiting for the result, just continue with the execution?
Use curl_multi_*
functions to handle multiple downloads in parallel. PHP lacks any kind of threading support, and process forking is merely a hack.
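A minimal sketch of what that could look like; the `downloadAll()` helper and its array keys are illustrative, not part of any library:

```php
<?php
// Sketch: drive several downloads in parallel with the curl_multi_* API.
function downloadAll(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Pump all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

All transfers run concurrently inside one process, so the total wall-clock time is roughly that of the slowest download rather than the sum of all of them.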
Use pcntl_fork()
to fork your current process into a child process. Do the file download in the child process while the parent process continues executing its tasks.
$pid = pcntl_fork();

if ($pid == -1) {
    // Fork failed
    exit(1);
} elseif ($pid) {
    // The parent process:
    // continue with whatever you need to do here
} else {
    // The child process:
    // do the system call here, then exit so the child
    // does not fall through into the parent's code
    exit(0);
}
If, after doing some other work, you need the parent process to wait until the child process finishes, call pcntl_waitpid($pid, $status).
See the PHP documentation on the pcntl functions for more details.
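Put together, a complete sketch could look like this (requires the pcntl extension, so CLI PHP only; the temp file stands in for the real download):

```php
<?php
// Sketch: fork, run the blocking call in the child, and have the
// parent collect the result with pcntl_waitpid().
$result = tempnam(sys_get_temp_dir(), 'dl');

$pid = pcntl_fork();
if ($pid === -1) {
    exit(1);                                 // fork failed
} elseif ($pid === 0) {
    // Child: stand-in for the wget system call (illustrative only).
    file_put_contents($result, "downloaded");
    exit(0);
} else {
    // Parent: free to do other work here ...
    // ... then block until the child exits.
    pcntl_waitpid($pid, $status);
    $ok = pcntl_wifexited($status) && pcntl_wexitstatus($status) === 0;
    echo $ok ? file_get_contents($result) : "download failed";
}
```

Checking the exit status via pcntl_wifexited()/pcntl_wexitstatus() lets the parent distinguish a successful download from a child that crashed or was killed.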
A general approach would be to apply the Command design pattern: http://en.wikipedia.org/wiki/Command_pattern
You can implement this in your local web environment, or you could install some kind of queue server such as http://www.rabbitmq.com/ etc.
Which solution fits best depends on your systems, your access to them, and your knowledge; that's a more complex question, too far off-topic here.
In general, the suggested pattern will let you solve your problem without many issues, since it is a well-known and widely used approach for handling tasks that take too long to run directly within the request.
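A minimal sketch of the idea, with a plain in-memory array standing in for a real queue server; all class and variable names are illustrative:

```php
<?php
// Command pattern applied to deferred downloads: a command object
// captures everything the job needs, so the web request only enqueues
// it and a worker (a local loop here, RabbitMQ in production) runs it.
interface Command
{
    public function execute(): void;
}

class DownloadCommand implements Command
{
    public function __construct(
        private string $source,
        private string $destination
    ) {}

    public function execute(): void
    {
        // A real worker might shell out to wget here; copy() stands in
        // so the sketch stays self-contained.
        copy($this->source, $this->destination);
    }
}

// Request side: enqueue the job and return immediately.
$source = tempnam(sys_get_temp_dir(), 'src');
$destination = tempnam(sys_get_temp_dir(), 'dst');
file_put_contents($source, "original file contents");

$queue = [];
$queue[] = new DownloadCommand($source, $destination);

// Worker side: drain the queue whenever convenient.
foreach ($queue as $command) {
    $command->execute();
}

echo file_get_contents($destination);
```

The key property is that enqueueing is cheap and non-blocking; the slow work happens later, in a separate worker process, which is exactly the decoupling the question asks for.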