I'm working with Node.js and MongoDB through the Node MongoDB native driver. I need to retrieve some documents, modify them, and then save them right back. This is an example:
db.open(function (err, db) {
    db.collection('foo', function (err, collection) {
        var cursor = collection.find({});
        cursor.each(function (err, doc) {
            if (doc != null) {
                doc.newkey = 'foo'; // Make some changes
                collection.save(doc); // Update the document
            } else {
                db.close(); // Closing the connection
            }
        });
    });
});
Because of the asynchronous nature of the driver, if updating a document takes a while, the cursor can reach the end of the documents and close the database connection before all the updates have finished, so not all of them are saved to the database.
If db.close() is omitted, all the documents are correctly updated, but the application hangs and never exits.
I saw a post suggesting using a counter to track the number of pending updates and closing the db when it falls back to zero. But am I doing anything wrong here? What is the best way to handle this kind of situation? Does db.close() have to be used to free up resources? Or does a new db connection need to be opened?
Here's a potential solution based on the counting approach (I haven't tested it and there's no error trapping, but it should convey the idea).
The basic strategy is: acquire the count of how many records need to be updated, save each record asynchronously with a success callback that decrements the count, and close the DB when the count reaches 0 (i.e. when the last update finishes). By using {safe:true} we can ensure that each update succeeds. The mongo server uses one thread per connection, so it's good to either a) close unused connections, or b) pool/reuse them.
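Roughly like this (a sketch only; I'm using collection.count and collection.save from the old driver, and the remaining variable is just an illustrative name):

db.open(function (err, db) {
    db.collection('foo', function (err, collection) {
        // First find out how many documents will be updated.
        collection.count(function (err, count) {
            if (count === 0) return db.close();
            var remaining = count; // number of saves still outstanding

            collection.find({}).each(function (err, doc) {
                if (doc == null) return; // cursor exhausted; saves may still be in flight

                doc.newkey = 'foo'; // Make some changes
                collection.save(doc, { safe: true }, function (err) {
                    remaining--;
                    if (remaining === 0) {
                        db.close(); // the last update has finished
                    }
                });
            });
        });
    });
});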
I found that using a counter may work for simple scenarios, but can be hard to get right in more complicated situations. Here is a solution I came up with that closes the database connection when the connection is idle:
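Something along these lines (a sketch only; the markDbUse() helper and the one-second check interval are my own naming and choices, and db is assumed to be an already-opened connection that you touch via markDbUse() on every query):

var maxDbIdleTime = 30 * 1000; // e.g. 30 seconds; see the note below
var lastUseTime = Date.now();

// Call this whenever the db is used, so the connection counts as active.
function markDbUse() {
    lastUseTime = Date.now();
}

// Periodically check for inactivity and close the connection once it has been idle long enough.
var idleChecker = setInterval(function () {
    if (Date.now() - lastUseTime > maxDbIdleTime) {
        clearInterval(idleChecker);
        db.close();
    }
}, 1000);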
This can be a general solution for any database connection. maxDbIdleTime can be set to the same value as the db query timeout, or longer.
This is not very elegant, but I can't think of a better way to do it. I use Node.js to run a script that queries MongoDB and MySQL, and the script hangs forever if the database connections are not closed properly.
Based on the suggestion from @mpobrien above, I've found the async module to be incredibly helpful in this regard. Here's an example pattern that I've come to adopt:
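For example (a sketch of one such pattern, not the original code; it uses async.eachSeries together with collection.save):

var async = require('async');

db.open(function (err, db) {
    db.collection('foo', function (err, collection) {
        collection.find({}).toArray(function (err, docs) {
            // Process the documents one at a time; each iteration waits for its save to finish.
            async.eachSeries(docs, function (doc, done) {
                doc.newkey = 'foo'; // Make some changes
                collection.save(doc, { safe: true }, function (err) {
                    done(err);
                });
            }, function (err) {
                // All updates have completed (or one of them failed), so it is safe to close.
                db.close();
            });
        });
    });
});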
It's best to use a pooled connection and then call db.close() in a cleanup function at the end of your application's life:
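For example (a sketch; the connection string, pool size, and signal handlers are placeholders for whatever your application uses):

var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/mydb', { server: { poolSize: 5 } }, function (err, db) {
    // Reuse this single db object for all queries; the driver manages the pool.

    function cleanup() {
        db.close(); // release the pooled connections once, at shutdown
        process.exit();
    }

    process.on('SIGINT', cleanup);
    process.on('SIGTERM', cleanup);
});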
See http://mongodb.github.io/node-mongodb-native/driver-articles/mongoclient.html
A bit of an old thread, but anyway.
Here's a solution I came up with. It avoids using toArray and it's pretty short and sweet:
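Something along these lines (a sketch in the same spirit rather than the exact original code; it iterates the cursor with nextObject so the next document is only fetched after the previous save completes):

db.open(function (err, db) {
    db.collection('foo', function (err, collection) {
        var cursor = collection.find({});

        (function processNext() {
            cursor.nextObject(function (err, doc) {
                if (doc == null) {
                    return db.close(); // cursor exhausted and every save has already finished
                }
                doc.newkey = 'foo'; // Make some changes
                collection.save(doc, { safe: true }, processNext); // fetch the next doc only after this save completes
            });
        })();
    });
});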
I came up with a solution that involves a counter like this. It does not depend on a count() call, nor does it wait for a timeout. It will close the db after all the documents in each() have been exhausted.
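The wrapper looks roughly like this (a sketch of the idea; the cnt field and its exact bookkeeping are illustrative):

var mydb = {};  // small helper that counts how many db operations are still outstanding
mydb.cnt = 0;

mydb.open = function (db) {   // call this just before starting a db operation
    mydb.cnt++;
};

mydb.close = function (db) {  // call this when that operation's callback has finished
    mydb.cnt--;
    if (mydb.cnt <= 0) {
        db.close();           // nothing outstanding any more, so really close
    }
};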
So each time you are going to make a call like db.each() or db.save(), you would use these methods to ensure the db stays open while working and is closed when done.
Example from OP:
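(A sketch of how the loop from the question would look with these wrappers; the initial mydb.open(db) is balanced by the mydb.close(db) in the null branch.)

mydb.open(db); // count the connection itself as in use
db.open(function (err, db) {
    db.collection('foo', function (err, collection) {
        var cursor = collection.find({});
        cursor.each(function (err, doc) {
            if (doc != null) {
                mydb.open(db); // one more save is now in flight
                doc.newkey = 'foo'; // Make some changes
                collection.save(doc, function (err) {
                    mydb.close(db); // this save has finished
                });
            } else {
                mydb.close(db); // cursor exhausted; balances the initial open
            }
        });
    });
});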
Now, this assumes that the second-to-last callback from each() makes it through mydb.open() before the last callback from each() gets to mydb.close(), so, of course, let me know if this is an issue.
So: put a mydb.open(db) before a db call and put a mydb.close(db) at the return point of the callback or after the db call (depending on the call type).
It seems to me that this kind of counter should be maintained within the db object, but this is my current workaround. Maybe we could create a new object that takes a db in the constructor and wraps the mongodb functions to handle closing better.