I'm using Guzzle to fetch a large number of URLs in parallel (or asynchronously) via a pool:
$client = new GuzzleHttp\Client([
    'base_url' => 'http://httpbin.org',
]);

$requests = [];
for ($i = 0; $i < 8; ++$i) {
    $requests[] = $client->createRequest('GET', '/get');
}

$pool = new GuzzleHttp\Pool($client, $requests, [
    'pool_size' => 4,
    'complete' => function (GuzzleHttp\Event\CompleteEvent $event) {
        var_dump($event->getRequest()->getUrl());
    },
]);
$pool->wait();

var_dump(count($requests));
If I run the above in the console, it displays the expected output:
string(22) "http://httpbin.org/get"
string(22) "http://httpbin.org/get"
string(22) "http://httpbin.org/get"
string(22) "http://httpbin.org/get"
string(22) "http://httpbin.org/get"
string(22) "http://httpbin.org/get"
string(22) "http://httpbin.org/get"
string(22) "http://httpbin.org/get"
int(8)
Now I would like to add further requests to the same pool based on some condition. I believe this behavior is usually known as rolling [parallel] requests, but after reading and re-reading the documentation I haven't managed to figure out how to do it. Here's what I tried:
$client = new GuzzleHttp\Client([
    'base_url' => 'http://httpbin.org',
]);

$requests = [];
for ($i = 0; $i < 8; ++$i) {
    $requests[] = $client->createRequest('GET', '/get');
}

$i = 0;
$pool = new GuzzleHttp\Pool($client, $requests, [
    'pool_size' => 4,
    'complete' => function (GuzzleHttp\Event\CompleteEvent $event) use (&$i, $client, &$requests) {
        var_dump($event->getRequest()->getUrl());
        if (++$i % 3 == 0) {
            // Every third completed request should queue an extra /ip request.
            $requests[] = $client->createRequest('GET', '/ip');
        }
    },
]);
$pool->wait();

var_dump(count($requests));
Every third request to /get should add a new request to /ip. The $requests array does grow, but only to 10 elements rather than the 11 I would expect: the 8 /get requests plus the 2 /ip requests queued at $i = 3 and $i = 6. If the added requests were actually executed, their complete events would fire too, and the ninth completion would queue an eleventh request. As it stands, the added requests are never sent. Is there a way to make a Guzzle pool execute requests added after initialization?
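For what it's worth, one idea I'm considering is to feed the pool a generator backed by an SplQueue instead of a plain array. This is an untested sketch: it assumes the pool snapshots a plain array when constructed (which would explain why my appends are invisible to it) and that, given an iterator, it keeps pulling requests lazily as slots free up. The /ip condition is the same as above:

$client = new GuzzleHttp\Client([
    'base_url' => 'http://httpbin.org',
]);

// A queue that can still be appended to while the pool is running.
$queue = new SplQueue();
for ($i = 0; $i < 8; ++$i) {
    $queue->enqueue($client->createRequest('GET', '/get'));
}

// Generator that keeps yielding requests for as long as the queue is non-empty.
$generator = function () use ($queue) {
    while (!$queue->isEmpty()) {
        yield $queue->dequeue();
    }
};

$i = 0;
$pool = new GuzzleHttp\Pool($client, $generator(), [
    'pool_size' => 4,
    'complete' => function (GuzzleHttp\Event\CompleteEvent $event) use (&$i, $client, $queue) {
        var_dump($event->getRequest()->getUrl());
        if (++$i % 3 == 0) {
            // Enqueue onto the live queue instead of appending to an array
            // the pool no longer looks at.
            $queue->enqueue($client->createRequest('GET', '/ip'));
        }
    },
]);
$pool->wait();

Even if that works, I can see a caveat: once the queue drains, the generator returns, so anything enqueued after that point (e.g. from the completion of a late /ip request) would still be stranded. Is this the right direction, or is there a proper way to do rolling requests with a Guzzle pool?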