
javascript - Lots of parallel http requests in node.js - Stack Overflow


I've created a node.js script that scans the network for available HTTP pages, so there are a lot of connections I want to run in parallel, but it seems that some of the requests wait for previous ones to complete.

Following is the code fragment:

    // Note: options, callback and errCallback are defined in the enclosing function.
    var reply = { };
    reply.started = new Date().getTime();
    var req = http.request(options, function(res) {
        reply.status = res.statusCode;
        reply.rawHeaders = res.headers;
        reply.headers = JSON.stringify(res.headers);
        reply.body = '';
        res.setEncoding('utf8');
        res.on('data', function (chunk) {
            reply.body += chunk;
        });
        res.on('end', function () {
            reply.finished = new Date().getTime();
            reply.time = reply.finished - reply.started;
            callback(reply);
        });
    });
    req.on('error', function(e) {
        if(e.message == 'socket hang up') {
            return;
        }
        errCallback(e.message);
    });
    req.end();

This code performs only 10-20 requests per second, but I need 500-1k requests per second. Every queued request is made to a different HTTP server.

I've tried something like this, but it didn't help:

    http.globalAgent.maxSockets = 500;
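
For reference, a fuller version of that tweak would look roughly like this. It is only a sketch, assuming the same options object as in the fragment above; with agent: false a request also opts out of keep-alive pooling entirely:

    var http = require('http');

    // Raise the pooled-connection cap before any request is issued;
    // in Node versions of that era the default was 5 sockets per host.
    http.globalAgent.maxSockets = 500;

    // Or bypass the shared pool for an individual request: a scanner
    // talking to many different hosts gains little from keep-alive.
    options.agent = false; // assumption: options is the object passed to http.request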

asked Jun 28, 2013 at 19:39 by druidvav; edited Aug 17, 2016 at 6:35 by Bilesh Ganguly
  • It looks like you're making HTTP requests. Is it even possible to get that many requests that fast over an internet connection? I have an extremely fast connection here, but my ping to the nearest server is about 52ms, which I think means that I could make about 20 HTTP requests per second. – Robert Harvey, Jun 28, 2013 at 19:45
  • I am running this script on a machine that I'm sure can handle this many requests. To be precise: it is a Hetzner 6s server. – druidvav, Jun 28, 2013 at 19:48
  • Sure, but did you read what I said? I don't think you can make that many requests over HTTP with a single internet connection, no matter how powerful your machine is. When you make an HTTP request, you have to wait for a response from the other end. You can certainly service more requests than that, but that's because you would be servicing requests from many browsers, each with their own internet connection. – Robert Harvey, Jun 28, 2013 at 19:49
  • Here is the output of a popular HTTP server benchmarking tool (# ab -n 10000 -c 1000 srv2.itrack.ru/): Requests per second: 914.94 [#/sec] (mean); Time per request: 1092.968 [ms] (mean). – druidvav, Jun 28, 2013 at 19:54
  • So you're queuing up requests in node.js then? Waiting for the responses? You'd have to be, and since it takes 1 second to process each request, you'd need 914 live threads in node.js to make it work. – Robert Harvey, Jun 28, 2013 at 19:55

2 Answers


Something else must be going on with your code. Node can comfortably handle 1k+ requests per second.

I tested with the following simple code:

    var http = require('http');

    var results = [];
    var j = 0;

    // Make 1000 parallel requests:
    for (var i = 0; i < 1000; i++) {
        http.request({
            host: '127.0.0.1',
            path: '/'
        }, function (res) {
            results.push(res.statusCode);
            j++;

            if (j === 1000) { // last response has arrived
                console.log(JSON.stringify(results));
            }
        }).end();
    }

To test purely what node is capable of, and not my home broadband connection, the code requests pages from a local Nginx server. I also avoid calling console.log until all the requests have returned, because console.log is implemented as a synchronous function (so that debugging messages are not lost when a program crashes) and would otherwise distort the measurement.

Running the code using time I get the following results:

    real    0m1.093s
    user    0m0.595s
    sys     0m0.154s

That's 1.093 seconds for 1000 requests which makes it very close to 1k requests per second.


The simple code above will generate OS errors if you try to make a lot of requests (say 10000 or more), because node will happily try to open all of those sockets in the for loop (remember: the requests don't start until the for loop ends; inside the loop they are only created). You mentioned that your solution runs into the same errors. To avoid this you should limit the number of parallel requests you make.
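
To make the mechanics concrete before reaching for a library, here is roughly what such a limit looks like hand-rolled. This is a minimal sketch, not code from the answer; the urls list and launch helper are hypothetical names:

    var http = require('http');

    var urls = [];
    for (var n = 0; n < 10000; n++) urls.push('127.0.0.1');

    var limit = 100;   // maximum number of requests in flight at once
    var inFlight = 0;
    var next = 0;

    function launch() {
        // Top up the pool until the limit is hit or the list is exhausted.
        while (inFlight < limit && next < urls.length) {
            inFlight++;
            http.request({
                host: urls[next++],
                path: '/'
            }, function (res) {
                res.resume();  // drain the body so the socket is freed
                inFlight--;
                launch();      // a slot opened up; start another request
            }).on('error', function () {
                inFlight--;
                launch();
            }).end();
        }
    }

    launch();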

The simplest way of limiting the number of parallel requests is to use one of the Limit functions from the async.js library:

    var http = require('http');
    var async = require('async');

    var requests = [];

    // Build a large list of request tasks:
    for (var i = 0; i < 10000; i++) {
        requests.push(function (callback) {
            http.request({
                host: '127.0.0.1',
                path: '/'
            }, function (res) {
                callback(null, res.statusCode);
            }).on('error', callback) // without this, a failed request would hang the batch
              .end();
        });
    }

    // Make the requests, 100 at a time
    async.parallelLimit(requests, 100, function (err, results) {
        console.log(JSON.stringify(results));
    });

Running this with time on my machine I get:

    real    0m8.882s
    user    0m4.036s
    sys     0m1.569s

So that's 10k requests in around 9 seconds, or roughly 1.1k requests per second.

Look at the functions available from async.js.
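
For example, async.eachLimit covers the same ground without pre-building ten thousand closures. This is a sketch in the same spirit as the code above; the hosts list is a stand-in for real scan targets:

    var http = require('http');
    var async = require('async');

    var hosts = [];
    for (var n = 0; n < 10000; n++) hosts.push('127.0.0.1');

    // Visit each host with at most 100 requests in flight at a time.
    async.eachLimit(hosts, 100, function (host, done) {
        http.request({
            host: host,
            path: '/'
        }, function (res) {
            res.resume(); // discard the body so the socket is released
            done();
        }).on('error', done).end();
    }, function (err) {
        console.log(err ? err.message : 'all done');
    });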

I've found a solution for myself; it is not very good, but it works:

    var childProcess = require('child_process');

I'm using curl:

    childProcess.exec('curl --max-time 20 --connect-timeout 10 -iSs "' + options.url + '"', function (error, stdout, stderr) {
        // stdout contains the response headers (-i) followed by the body
    });

This allows me to run 800-1000 curl processes simultaneously. Of course, this solution has its weaknesses, such as requiring a lot of open file descriptors, but it works.

I've tried the node-curl bindings, but they were very slow too.
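
As a side note on this approach: when the URL may contain characters that are special to the shell, child_process.execFile is a safer way to spawn curl, because the arguments are passed as an array rather than spliced into a shell string. A minimal sketch (the fetchWithCurl helper name is made up for illustration):

    var childProcess = require('child_process');

    function fetchWithCurl(url, callback) {
        // Arguments go in an array, so no shell quoting or escaping is needed.
        childProcess.execFile('curl',
            ['--max-time', '20', '--connect-timeout', '10', '-iSs', url],
            function (error, stdout, stderr) {
                callback(error, stdout); // stdout: headers (-i) followed by body
            });
    }

    // Usage:
    fetchWithCurl('http://127.0.0.1/', function (err, raw) {
        if (err) return console.error(err.message);
        console.log(raw.slice(0, 200));
    });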
