Code:
try {
    $documentsFind = $client->$db->$collection->find([
        // query
    ]);
    if ($documentsFind) {
        foreach ($documentsFind as $product) {
            // code...
        }
    }
} catch (MongoCursorException $e) {
    echo "error message: " . $e->getMessage() . "\n";
    echo "error code: " . $e->getCode() . "\n";
}
Error:
Fatal error: Uncaught MongoDB\Driver\Exception\RuntimeException:
Cursor not found, cursor id: 31837896248 in ...
It seems that the cursor does exist but then times out on the server side. How can I prevent that from happening?
Edited to add: I tried doing:
if ($documentsFind) {
    $documentsFind->immortal(true); // keep alive
    foreach ($documentsFind as $product) {
        // code...
    }
}
But that results in Call to undefined method MongoDB\Driver\Cursor::immortal().
Try querying like this:
$documentsFind = $client->$db->$collection->find([
    // query
], ['noCursorTimeout' => true]);
The find() method passes its 2nd argument to the MongoDB\Operation\Find class constructor, so that class is where you can see all the available options.
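For example, several of those options can be combined in one call; a minimal sketch, assuming the mongodb/mongodb library and placeholder $client, $db, $collection and field names:
$documentsFind = $client->$db->$collection->find(
    [
        // query filter
    ],
    [
        'noCursorTimeout' => true,          // ask the server not to reap an idle cursor
        'batchSize'       => 1000,          // documents fetched per getMore round trip
        'projection'      => ['name' => 1], // placeholder field; return only what you need
        'sort'            => ['_id' => 1],
    ]
);

foreach ($documentsFind as $product) {
    // process each document promptly so the cursor does not sit idle
}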
The cursor exception documentation says:
The driver was trying to fetch more results from the database, but the database did not have a record of the query. This usually means that the cursor timed out on the server side: after a few minutes of inactivity, the database will kill a cursor.
The MongoDB PHP driver has two different timeouts:
- Connection timeout
- Cursor timeout
Make sure you use timeout or immortal on the cursor (note that these methods belong to the legacy MongoCursor class, not to MongoDB\Driver\Cursor):
$cursor = $collection->find();
$cursor->immortal(true);
$cursor->timeout(-1);
Note: Timeout indicates the time to wait on the client side while immortal sets the cursor on the server side.
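If you are using the low-level driver directly (which is where MongoDB\Driver\Cursor comes from), the equivalent of immortal(true) is the noCursorTimeout query option; a minimal sketch, assuming a local server and a hypothetical mydb.products namespace:
$manager = new MongoDB\Driver\Manager('mongodb://localhost:27017');
$query = new MongoDB\Driver\Query(
    [
        // query filter
    ],
    ['noCursorTimeout' => true] // replaces the legacy immortal(true)
);
$cursor = $manager->executeQuery('mydb.products', $query);
foreach ($cursor as $product) {
    // code...
}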
But if the cursor covers a large amount of data, I would suggest fetching it in chunks instead:
Get the first 1000 documents from the collection and process them, then get the next 1000, and so on. You can do this with skip and limit, as sketched below.
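A minimal sketch of that chunked approach, again assuming the mongodb/mongodb library with placeholder $client, $db and $collection variables (the chunk size of 1000 is illustrative):
$chunkSize = 1000;
$skip = 0;

do {
    $documents = $client->$db->$collection->find(
        [
            // query filter
        ],
        [
            'sort'  => ['_id' => 1], // stable order so the skip/limit pages do not overlap
            'skip'  => $skip,
            'limit' => $chunkSize,
        ]
    );

    $count = 0;
    foreach ($documents as $product) {
        // code... (each chunk is consumed immediately on a fresh cursor)
        $count++;
    }

    $skip += $chunkSize;
} while ($count === $chunkSize); // a short final page means the collection is exhausted
Because each chunk opens a new cursor that is read to completion right away, no cursor stays idle long enough for the server to kill it.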