Found this issue debugging code where the following did not work:

var req = http.request(options, function(res) {
  // res.on('error', cb(err));
  res.on('end', function() {
    cb();
  });
});

However the following did work:

var req = http.request(options, function(res) {
  // res.on('error', cb(err));
  res.on('data', function(chunk) {
    // why do we need this?
  });
  res.on('end', function() {
    cb();
  });
});
asked Mar 17, 2015 at 23:42 by RichardB
- because of what the documentation tells you over at nodejs/api/http.html#http_class_http_clientrequest -- that said, using the base Node.js http and https modules is generally far less useful than using one of the better engineered web server APIs built on top of them, like Express, Hapi, and many others. – Mike 'Pomax' Kamermans, Mar 17, 2015 at 23:47
- one handy workaround: res.on('data', Boolean) – dandavis, Mar 18, 2015 at 0:12
3 Answers
The res variable is a Readable Stream. If you click the link and scroll down to the 'end' event, you will find the following:

Note that the 'end' event will not fire unless the data is completely consumed.

By adding the 'data' event handler, you consume the data.
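For example, a minimal sketch of the working pattern that also keeps the body (the options object and cb below are placeholders standing in for those in the question):

var http = require('http');

// placeholder request options and callback, standing in for the question's
var options = { hostname: 'example.com', path: '/' };
function cb(err, body) {
  if (err) return console.error(err);
  console.log('response length:', body.length);
}

var req = http.request(options, function(res) {
  var body = '';
  res.on('data', function(chunk) {
    body += chunk; // consuming each chunk drains the stream
  });
  res.on('end', function() {
    cb(null, body); // fires only once all the data has been consumed
  });
});
req.on('error', cb);
req.end();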
It's how node's backpressure mechanism works. If the response stream's buffer fills up, it tells the server to stop sending data (this is handled at the TCP layer). So once you start reading data (via res.read(), by attaching a 'data' handler, or simply by calling res.resume()), more data from the server is transferred until some point at which there is no more data. Only once the server has no more data to send will you get an 'end' event. I typically use res.resume(); since it's a lot shorter.

This behavior has existed since node v0.10. Before that, you could actually lose data if you did not attach a 'data' handler right away, so as you can imagine that caused a problem for a lot of people. With node v0.10+ the default behavior is to pause until you start reading (this is at the node streams layer, separate from the network).
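A sketch of that shorter pattern, draining and discarding the body with res.resume() (options and cb are again placeholders, not names from the original post):

var http = require('http');

// placeholder options and callback
var options = { hostname: 'example.com', path: '/' };
function cb(err) {
  if (err) return console.error(err);
  console.log('done');
}

var req = http.request(options, function(res) {
  res.resume(); // drain the response without keeping the data
  res.on('end', function() {
    cb(); // fires because the stream is being consumed
  });
});
req.on('error', cb);
req.end();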
In this implementation, function(res) is a callback which must register listeners for at least the 'data' and 'end' events, even if they don't actually do anything.

From the official documentation:
During the 'response' event, one can add listeners to the response object; particularly to listen for the 'data' event.
If no 'response' handler is added, then the response will be entirely discarded. However, if you add a 'response' event handler, then you must consume the data from the response object, either by calling response.read() whenever there is a 'readable' event, or by adding a 'data' handler, or by calling the .resume() method. Until the data is consumed, the 'end' event will not fire. Also, until the data is read it will consume memory that can eventually lead to a 'process out of memory' error.