Is there any sane way to make an HTTP request asynchronously in PHP without throwing out the response? I.e., something similar to AJAX - the PHP script initiates the request, does its own thing and later, when the response is received, a callback function/method or another script handles the response.
One approach has crossed my mind: spawning a new PHP process running another script for each request - the second script performs the request, waits for the response, then parses the data and does whatever it should, while the original script goes on spawning new processes. I have doubts, though, about performance in this case - there must be some penalty for creating a new process every time.
Yes, depending on the traffic of your site, spawning a separate PHP process for every request could be devastating. It would be more efficient to use shell_exec() to start a background process that saves its output to a filename you already know, but even this can be resource intensive.
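A minimal sketch of what that could look like; `worker.php` and the output path are placeholders for your own setup:

    <?php
    // Launch a worker script in the background and redirect its output to a
    // file we can poll for later.
    $url     = 'http://example.com/api/endpoint';
    $outFile = '/tmp/response_' . md5($url . microtime()) . '.txt';

    // The trailing "&" backgrounds the process; "> file 2>&1" captures all output.
    shell_exec(
        'php worker.php ' . escapeshellarg($url) .
        ' > ' . escapeshellarg($outFile) . ' 2>&1 &'
    );

    // ...continue with other work; check for $outFile later to read the response.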
You could also have a request queue stored in a database. A single, separate background process would pull the job, execute it, and save the output, possibly setting a flag in the DB that your web process could check.
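Roughly, assuming a `request_queue` table with columns `(id, url, status, response)` - the table and column names are only illustrative:

    <?php
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    // Web process: enqueue the request and return immediately.
    $stmt = $pdo->prepare(
        "INSERT INTO request_queue (url, status) VALUES (?, 'pending')"
    );
    $stmt->execute(['http://example.com/api/endpoint']);

    // Background worker (separate script, run from cron or a loop): pull a
    // pending job, perform the request, store the result and flag it as done.
    $job = $pdo->query(
        "SELECT id, url FROM request_queue WHERE status = 'pending' LIMIT 1"
    )->fetch(PDO::FETCH_ASSOC);

    if ($job) {
        $response = file_get_contents($job['url']);
        $update   = $pdo->prepare(
            "UPDATE request_queue SET status = 'done', response = ? WHERE id = ?"
        );
        $update->execute([$response, $job['id']]);
    }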
If you're going to use the DB queue approach, use the curl_multi_* family of functions to send all queued requests at once. This limits the execution time of each iteration of your background process to roughly the duration of the slowest request.
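A sketch of draining the queue with curl_multi_*; `$urls` would come from the queue table above:

    <?php
    $urls = ['http://example.com/a', 'http://example.com/b'];

    $mh      = curl_multi_init();
    $handles = [];

    foreach ($urls as $i => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Run all transfers concurrently until every handle has finished.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);          // block briefly instead of busy-waiting
    } while ($running > 0);

    $responses = [];
    foreach ($handles as $i => $ch) {
        $responses[$i] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);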
PHP V5 may not be threaded, but you can create applications that exploit in-process multitasking.
Check out the article "Develop multitasking applications with PHP V5" from IBM developerWorks: http://www.ibm.com/developerworks/web/library/os-php-multitask/
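The general idea there is non-blocking streams polled with stream_select(). A bare-bones sketch (host, path and timing values are illustrative, not from the article):

    <?php
    // Connect, then switch the stream to non-blocking mode so reads never stall.
    $fp = stream_socket_client('tcp://example.com:80', $errno, $errstr, 30);
    if ($fp === false) {
        die("Connect failed: $errstr ($errno)");
    }
    stream_set_blocking($fp, false);
    fwrite($fp, "GET / HTTP/1.0\r\nHost: example.com\r\nConnection: close\r\n\r\n");

    $response = '';
    while (!feof($fp)) {
        // ...do other work here between polls...

        $read = [$fp]; $write = null; $except = null;
        // Wait up to 0.2s for data; read whatever has arrived so far.
        if (stream_select($read, $write, $except, 0, 200000) > 0) {
            $response .= fread($fp, 8192);
        }
    }
    fclose($fp);
    // $response now holds the raw HTTP headers plus body, ready to parse.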