I'm writing a Node.js web server that uses a Postgres database. I used to connect on each new request like this:
app.get('/', function (req, res) {
  pg.connect(pgconnstring, function (err, client) {
    // ...
  });
});
But after a few requests, I noticed 'out of memory' errors on Heroku when trying to connect. My database has only 10 rows, so I don't see how this could be happening. All of my database access is of this form:
client.query('SELECT * FROM table', function (err, result) {
  if (err) {
    res.send(500, 'database error');
    return;
  }
  res.set('Content-Type', 'application/json');
  res.send(JSON.stringify({ data: result.rows.map(makeJSON) }));
});
Assuming the memory error was caused by accumulating persistent connections to the database, I switched to a style I saw in several node-postgres examples: connecting only once at the top of the file:
var client = new pg.Client(pgconnstring);
client.connect();

app.get('/', function (req, res) {
  // ...
});
But now my requests hang, apparently indefinitely, when I try to execute a query after the connection is disrupted. (I simulated this by killing the Postgres server and bringing it back up.)
So how do I do one of the following?
- Properly pool Postgres connections so that I can 'reconnect' every time without running out of memory.
- Have the global client automatically reconnect after a network failure.
I'm assuming you're using the latest version of node-postgres, in which connection pooling has been greatly improved. You must now check the connection back into the pool, or you'll leak connections:
As for error handling on a global connection (your option #2, though I'd use the pool):
The "missing" docs for all of this are on the GitHub wiki.