PHP Concurrent HTTP requests?

Published 2020-04-12 09:45

Question:

I was wondering what the best way to do concurrent HTTP requests in PHP is. I have a lot of data to fetch and I'd rather do multiple requests at once to retrieve it all.

Does anybody know how I can do this? Preferably in an anonymous/callback function manner...

Thanks,

Tom.

Answer 1:

You can use curl_multi, which lets you fire off multiple separate requests in parallel under a single multi handle.

But otherwise PHP itself is not in any way, shape, or form "multithreaded" and will not allow things to run in parallel, except via gross hacks (multiple parallel scripts, one script firing up multiple background tasks via exec(), etc...).
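For reference, a minimal curl_multi sketch (the URLs are placeholders and error handling is omitted):

<?php
$urls = array(
    'http://www.example.com/a',
    'http://www.example.com/b',
    'http://www.example.com/c',
);

$mh = curl_multi_init();
$handles = array();

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Run all handles until every transfer has finished.
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for activity instead of busy-looping
} while ($running > 0);

// Collect the responses and clean up.
foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);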



Answer 2:

You can try curl_multi(), or use the lower-level function socket_select().
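Here is a rough sketch of the socket_select() approach: the hosts are placeholders, the connects are done blocking for brevity, and only the reads are multiplexed.

<?php
$hosts = array('www.example.com', 'www.example.org');
$sockets = array();

// Open a socket per host and send a plain HTTP/1.0 request on each.
foreach ($hosts as $host) {
    $sock = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
    socket_connect($sock, gethostbyname($host), 80);
    socket_write($sock, "GET / HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");
    $sockets[] = $sock;
}

// Wait on all sockets at once and read from whichever has data, until all are closed.
while (count($sockets) > 0) {
    $read = $sockets;
    $write = null;
    $except = null;
    if (socket_select($read, $write, $except, 5) < 1) {
        break; // timeout or error
    }
    foreach ($read as $sock) {
        $chunk = socket_read($sock, 8192);
        if ($chunk === '' || $chunk === false) {
            // The server closed the connection; stop watching this socket.
            socket_close($sock);
            $sockets = array_values(array_filter($sockets, function ($s) use ($sock) {
                return $s !== $sock;
            }));
        } else {
            echo $chunk;
        }
    }
}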



Answer 3:

Alternatively, you can send your data as JSON. In PHP you can then decode it back into its individual values. For example:

xhttp.open("GET", "gotoChatRoomorNot.php?q=[{"+str+"},{"+user1+"},{"+user2+"}]", true);

and in PHP you can follow this to get your data back: How do I extract data from JSON with PHP?

So build a string in JSON format and send the entire thing over HTTP. You can probably do the same kind of thing with XML, but I am not familiar with it.
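As a minimal sketch, assuming the "q" parameter from the snippet above arrives as valid JSON (e.g. '[{"a":1},{"b":2}]'), the PHP side could decode it like this:

<?php
// Decode the JSON sent in the "q" query parameter; true => associative arrays.
$data = json_decode($_GET['q'], true);
var_dump($data);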



Answer 4:

You can use HttpRequestPool (part of the pecl_http extension): http://www.php.net/manual/de/httprequestpool.construct.php

$multiRequests = array(
  new HttpRequest('http://www.google.com', HttpRequest::METH_GET),
  new HttpRequest('http://www.yahoo.com', HttpRequest::METH_GET),
  new HttpRequest('http://www.bing.com', HttpRequest::METH_GET)
);

$pool = new HttpRequestPool();
foreach ($multiRequests as $request)
{
  $pool->attach($request);
}

// Send all attached requests in parallel.
$pool->send();

// Iterate over the finished requests and print each response body.
foreach($pool as $request) 
{
  echo $request->getResponseBody();
}