I am working on a PHP web application and I need to perform some network operations during the request, such as fetching data from a remote server based on the user's request.
Is it possible to simulate asynchronous behavior in PHP, given that I have to pass some data to a function and also need its output?
My code is like:
<?php
$data1 = processGETandPOST();
$data2 = processGETandPOST();
$data3 = processGETandPOST();
$response1 = makeNetworkCall($data1);
$response2 = makeNetworkCall($data2);
$response3 = makeNetworkCall($data3);
processNetworkResponse($response1);
processNetworkResponse($response2);
processNetworkResponse($response3);
/*HTML and OTHER UI STUFF HERE*/
exit;
?>
Each network operation takes around 5 seconds to complete, adding a total of 15 seconds to the response time of my application, given that I make 3 requests.
The makeNetworkCall() function just does an HTTP POST request.
The remote server is a 3rd-party API, so I don't have any control over it.
PS: Please do not answer with suggestions about AJAX or other things. I am currently looking at whether I can do this through PHP, maybe with a C++ extension or something like that.
There is also pecl_http v2, which is a wrapper around cURL and can be installed via PECL.
http://devel-m6w6.rhcloud.com/mdref/http/
Nowadays, it's better to use queues than threads (for those who don't use Laravel, there are tons of other implementations out there, like this).
The basic idea is: your original PHP script puts tasks or jobs into a queue. Then you have queue workers running elsewhere, taking jobs out of the queue and processing them independently of the original PHP.
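The producer side of that idea can be sketched with a flat file standing in for the queue (a real setup would use Gearman, Beanstalkd, Redis, a database table, etc.; the queue file path, the payload shape, and both function names here are assumptions for illustration):

```php
<?php
// Producer (the original request script): append a job instead of calling the
// slow API inline, so the request can return immediately.
function enqueueJob(string $queueFile, array $payload): void
{
    file_put_contents($queueFile, json_encode($payload) . "\n", FILE_APPEND | LOCK_EX);
}

// Worker (run separately, e.g. a CLI loop or cron): pop the oldest job.
// Returns null when the queue is empty.
function dequeueJob(string $queueFile): ?array
{
    $lines = file($queueFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    if (!$lines) {
        return null;
    }
    $job = json_decode(array_shift($lines), true);
    // Write the remaining jobs back.
    file_put_contents($queueFile, implode("\n", $lines) . ($lines ? "\n" : ""), LOCK_EX);
    return $job;
}
```

A flat file is only for illustration; it has no locking across the read-modify-write in dequeueJob(), which a real queue backend handles for you.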
The advantages are:
- the original request returns quickly instead of blocking on slow network calls;
- workers can run in parallel and be scaled independently of the web server;
- failed jobs can be retried without affecting the user-facing request.
cURL is going to be your only real choice here (either that, or using non-blocking sockets and some custom logic).
This link should send you in the right direction. There is no asynchronous processing in PHP, but if you're trying to make multiple simultaneous web requests, cURL multi will take care of that for you.
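The three sequential calls in the question could be collapsed into one concurrent batch along these lines (fetchAll() and the url/data keys are assumptions for illustration, not an existing API):

```php
<?php
// Run several HTTP requests concurrently with curl_multi. Each request is
// ['url' => ..., 'data' => ...]; 'data' (POST fields) is optional.
function fetchAll(array $requests): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($requests as $key => $req) {
        $ch = curl_init($req['url']);
        $opts = [CURLOPT_RETURNTRANSFER => true];
        if (isset($req['data'])) {
            $opts[CURLOPT_POST] = true;
            $opts[CURLOPT_POSTFIELDS] = $req['data'];
        }
        curl_setopt_array($ch, $opts);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for socket activity, don't busy-loop
        }
    } while ($running && $status === CURLM_OK);

    $responses = [];
    foreach ($handles as $key => $ch) {
        $responses[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $responses;
}
```

With the three ~5-second requests from the question running concurrently, the total wall time should be roughly that of the slowest request rather than the 15-second sum.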
I think if the HTML and other UI stuff needs the returned data, then there is not going to be a way to make it asynchronous.
I believe the only way to do this in PHP would be to log a request in a database and have a cron job check every minute, or use something like Gearman for queue processing, or maybe exec() a command-line process.
In the meantime, your PHP page would have to generate some HTML or JS that makes it reload every few seconds to check on progress, which is not ideal.
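The exec() route mentioned above can be sketched like this, assuming a POSIX shell; spawnWorker(), the worker script path, and its argument are placeholders, not an existing API:

```php
<?php
// Spawn a detached CLI PHP process so the web request can return immediately.
// Output is redirected and the command is backgrounded with '&' so exec()
// does not block waiting for the worker to finish.
function spawnWorker(string $script, string $arg): void
{
    $cmd = escapeshellarg(PHP_BINARY) . ' ' . escapeshellarg($script)
         . ' ' . escapeshellarg($arg)
         . ' > /dev/null 2>&1 &';
    exec($cmd);
}
```

The worker would then write its result somewhere (database, cache, file) for a later request to pick up, which is exactly the polling problem described above.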
To sidestep the issue: how many different requests are you expecting? Could you download them all automatically every hour or so and save them to a database?
I don't have a direct answer, but you might want to look into these things:
When the job is done, insert a new job describing the work that needs to be done in order to process the cached HTTP response body.