Load balance request traffic with multiple Node servers

Posted 2019-05-11 12:34

Question:

According to this answer:

You should run multiple Node servers on one box, 1 per core and split request traffic between them. This provides excellent CPU-affinity and will scale throughput nearly linearly with core count.

Got it, so let's say our box has 2 cores for simplicity.

I need a complete example of a Hello World app being load balanced between two Node servers using NGINX.

This should include any NGINX configuration as well.

Answer 1:

app.js

var http = require('http');

// The port to listen on is taken from the first command-line argument,
// e.g. `node app.js 8001`.
var port = parseInt(process.argv[2], 10);

http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Hello World\n');
}).listen(port);

console.log('Server running at http://localhost:' + port + '/');

nginx configuration

upstream app {
  # Round-robin (the default) across the two Node processes
  server localhost:8001;
  server localhost:8002;
}

server {
  listen 80;

  location / {
    proxy_pass  http://app;
  }
}
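
Note that these two blocks are not a standalone nginx.conf; they belong inside the http context, for example in a file included from conf.d (the exact path depends on your install). After editing, test and reload nginx, e.g.:

sudo nginx -t
sudo nginx -s reload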

Launch your app

node app.js 8001
node app.js 8002
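
Run each command in its own terminal (or background them). To confirm the setup, you can hit each backend directly and then the proxy; with the listen 80 shown above, all three requests should return Hello World:

curl http://localhost:8001/
curl http://localhost:8002/
curl http://localhost/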

HttpUpstreamModule documentation

Additional reading material

  • cluster module - still experimental, but you don't need nginx (a minimal sketch follows after this list)
  • forever module - in case your app crashes
  • nginx and websockets - how to proxy websockets in the new nginx version
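
For comparison, here is a minimal sketch of the cluster-module approach mentioned above, using the standard Node cluster API (the worker port 8001 and the per-worker greeting are just illustrative): the master forks one worker per CPU core and the workers share a single listening port, so nginx isn't needed for the load-balancing part itself.

var cluster = require('cluster');
var http = require('http');
var os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU core; incoming connections are shared among them.
  var cpuCount = os.cpus().length;
  for (var i = 0; i < cpuCount; i++) {
    cluster.fork();
  }
  // Replace any worker that dies so capacity stays constant.
  cluster.on('exit', function () {
    cluster.fork();
  });
} else {
  // Each worker runs its own HTTP server on the same shared port.
  http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Hello World from worker ' + cluster.worker.id + '\n');
  }).listen(8001);
}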


Tags: node.js nginx