According to this answer:

> You should run multiple Node servers on one box, 1 per core and split request traffic between them. This provides excellent CPU-affinity and will scale throughput nearly linearly with core count.

Got it, so let's say our box has 2 cores for simplicity.
I need a complete example: a Hello World application that uses NGINX to load-balance between two Node servers.
This should include any NGINX configuration.
app.js

// Minimal HTTP server; the listening port is taken from the command line
var http = require('http');
var port = parseInt(process.argv[2], 10);

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(port);

console.log('Server running at http://localhost:' + port + '/');
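If you want to see which backend served each request, a small variation of app.js can echo its port in the response body (purely illustrative; not part of the original answer):

var http = require('http');
var port = parseInt(process.argv[2], 10);

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  // Echo the port so you can tell which of the two Node servers answered
  res.end('Hello World from port ' + port + '\n');
}).listen(port);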
nginx configuration
upstream app {
    server localhost:8001;
    server localhost:8002;
}

server {
    location / {
        proxy_pass http://app;
    }
}
Start your app
node app.js 8001
node app.js 8002
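To sanity-check the setup, here is a small sketch that fires a few requests through the proxy. It assumes the upstream/server snippet above sits inside the http block of your nginx configuration, that nginx has been reloaded, and that it is listening on its default port 80; with the port-echoing variation of app.js shown earlier, you should see the responses alternate between 8001 and 8002 under the default round-robin balancing.

var http = require('http');

// Send five requests through nginx on localhost:80 and print each response body
for (var i = 0; i < 5; i++) {
  http.get('http://localhost/', function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () { console.log(body.trim()); });
  });
}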
HttpUpstreamModule documentation
Additional reading material
> cluster module – still experimental, but with it you don't need nginx (see the sketch after this list)
> forever module – in case your application crashes
> nginx and websockets – how to proxy websockets in newer nginx versions
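For completeness, here is a minimal sketch of what the cluster-module alternative might look like, using Node's built-in cluster API. The shared port 8000 and the one-worker-per-core layout are illustrative assumptions, not part of the original answer:

var cluster = require('cluster');
var http = require('http');
var numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
  // Master process: fork one worker per CPU core
  for (var i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
} else {
  // Worker processes all share the same listening port
  http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Hello World\n');
  }).listen(8000);
}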
