
javascript - Repetitive Node.js requests to MongoDB slow down eventually - Stack Overflow


I have data going into a MongoDB collection rmc and it's being upserted, i.e. there is a single document holding the latest latitude and longitude for my device.

From Node.js, I'd like to query that collection every 100 ms (to simulate real time) and update a map with the latest latitude / longitude.

Performance is good at first, but shortly after updating data in my collection, or simply after a while, it degrades badly.

What am I doing wrong? Could I do things in a better way? I can't tell whether the problem is MongoDB, Node, or Mongoose.

The user goes to index.html, which serves an HTML page. Within that page, I request /data every 100 ms:

function updateData() {
  $.getJSON("/data", function(json) {
    dosomething();                          // update the map with the new position
    interval = setTimeout(updateData, 100); // schedule the next poll
  });
}

updateData();

And my index.js:

/* GET home page. */
router.get('/', function(req, res, next) {
  res.render('index', { title: "test" });
});

router.get('/data', function(req, res, next) {
  var db = req.db;
  Json.find({}).select({}).lean().exec(function(e,docs){
    res.json(docs);
  });
});

Things are good and all of a sudden, there is huge delay being experienced:

GET /data 304 1.644 ms - -
GET /data 304 1.738 ms - -
GET /data 304 1.685 ms - -
GET /data 304 1.693 ms - -
GET /data 304 1.624 ms - -
GET /data 304 1.645 ms - -
GET /data 304 1.873 ms - -
GET /data 304 1.607 ms - -
GET /data 304 1.638 ms - -
GET /data 304 1.610 ms - -
GET /data 304 1.734 ms - -
GET /data 304 1.736 ms - -
GET /data 304 1.660 ms - -
GET /data 304 1.634 ms - -
GET /data 304 15.265 ms - -
GET /data 304 10.535 ms - -
GET /data 304 1.740 ms - -
GET /data 304 70.184 ms - -
GET /data 304 69.037 ms - -
GET /data 304 58.620 ms - -
GET /data 304 75.053 ms - -
GET /data 304 72.292 ms - -
GET /data 304 92.447 ms - -
GET /data 304 95.270 ms - -
GET /data 304 448.057 ms - -
GET /data 304 567.309 ms - -
GET /data 304 683.199 ms - -
GET /data 304 731.952 ms - -
GET /data 304 1102.502 ms - -
GET /data 304 1770.029 ms - -
GET /data 304 1051.307 ms - -
GET /data 304 1059.791 ms - -

asked May 30, 2016 at 3:41 by Stephane Maarek; edited Jun 26, 2018 at 13:18 by user7637745
  • So I think Node.js / Mongoose keeps some kind of artifacts in memory after each call and therefore my server quickly saturates. Is there any way I could clean the memory while the script is executing? – Stephane Maarek Commented May 30, 2016 at 4:34
  • I'm getting much better performance with mongoskin, but still, after a bit, something fills up and the behaviour above is found. FYI I completely commented out my code and only trigger calls to the /data route to troubleshoot – Stephane Maarek Commented May 30, 2016 at 4:56
  • 5 Not an answer to your question, but you should consider using a Redis cache instead of querying the same data over and over again; you'll get much better performance. I don't see anything wrong with your code however, and that slowdown shouldn't happen. – NGPixel Commented Oct 15, 2016 at 21:44
  • 2 Could you post CPU and RAM usage over time as well? For the Node and MongoDB processes. – Prashanth Chandra Commented Nov 15, 2016 at 0:12
  • 3 You should try and replicate the problem with the plain Node.js MongoDB driver... if the problem doesn't happen, at least you will know that the problem is related to Mongoose. – Ashley Davis Commented Mar 18, 2018 at 22:22

4 Answers

There are different ways to profile a Node application and find out where the issue comes from. Using native Node tooling, you can run your app with:

 node --prof <entrypoint>.js

Then run your test to let Node gather some data. When finished, a tick log file is generated, which has to be converted into a more human-friendly output:

node --prof-process <file_generated>.log > processed.txt

Looking at processed.txt you can find which parts of your code spend the most CPU time, which can point you to where the issue comes from.

For more detail, there is quite a good entry in the Node docs about analyzing profiling output: https://nodejs.org/en/docs/guides/simple-profiling/
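As a minimal sketch of that workflow (using a throwaway busy loop in place of your app, run in a fresh directory so the log glob matches a single file):

```shell
# Profile a short CPU-bound script; this writes an isolate-*.log tick file.
node --prof -e 'let s = 0; for (let i = 0; i < 1e6; i++) s += Math.sqrt(i);'

# Post-process the tick log into a readable summary of where CPU time went.
node --prof-process isolate-*.log > processed.txt

# The [Summary] and heavy-profile sections of processed.txt show the hot spots.
grep -m 1 "ticks" processed.txt
```

In a real run you would replace the inline script with your app's entry point and drive traffic at it (e.g. the 100 ms polling loop) while the profiler collects ticks.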

I think you are piping the output through several functions which don't do much. Also, if the output is small, it seems fine to send all the data as the response. Try modifying your code to:

const docs = await Json.find({})
res.json(docs)

You don't need lean() or select(); called like this they are doing nothing for you.
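A fuller sketch of that suggestion as a route handler. The Json model is stubbed here so the snippet runs standalone; in the real app it is your Mongoose model, and with Express 4 you need the try/catch so a failed query reaches the error middleware instead of hanging the request:

```javascript
// Stand-in for the Mongoose model so this sketch runs without a database.
const Json = { find: async () => [{ lat: 48.85, lon: 2.35 }] };

// The suggested handler: one awaited query, result sent straight back.
async function dataHandler(req, res, next) {
  try {
    const docs = await Json.find({});
    res.json(docs);
  } catch (err) {
    next(err); // hand query failures to Express's error middleware
  }
}

// Exercise it with a minimal fake response object.
dataHandler({}, { json: (d) => console.log(JSON.stringify(d)) }, console.error);
```

In the real index.js this would be registered as `router.get('/data', dataHandler)`.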

You should measure the execution time of the query in your code, together with the full request handling (you can use console statements for this), and see whether it's the query that takes up the time or some other piece of code. At the same time, monitor CPU usage and see which process is taking how many resources. You can see the execution stats of any MongoDB query with explain: https://docs.mongodb.com/manual/reference/operator/meta/explain/

Additionally, run Mongoose in debug mode in development:

mongoose.set('debug', true);
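One way to get those per-query timings is a small timing wrapper; this is a sketch where `fakeQuery` (a 20 ms stub) stands in for the real `Json.find({}).exec()`:

```javascript
// Wrap any promise-returning call with a high-resolution timer.
async function timed(label, fn) {
  const start = process.hrtime.bigint();
  const result = await fn();
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${ms.toFixed(3)} ms`);
  return result;
}

// Stand-in for the real query; replace with () => Json.find({}).exec().
const fakeQuery = () =>
  new Promise((resolve) => setTimeout(() => resolve([{ lat: 1, lon: 2 }]), 20));

timed("Json.find", fakeQuery).then((docs) => console.log("docs:", docs.length));
```

Comparing this number against the per-request time Express logs tells you whether the query or the surrounding code is the slow part.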

To increase your MongoDB performance you can do these 2 things:

  1. Adjust the connection pool size; this way your code will re-use existing connections rather than creating a new connection every time you do an operation. By default it's 5; try scaling it up.

  2. Create indexes on your collection; create a unique / compound index, and this will for sure increase your query performance.

Additionally you can run the explain command to see your entire query execution plan and profiling.

Check this out: MongoDB Performance
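A sketch of both knobs. The option names are from recent MongoDB Node driver / Mongoose versions and may differ in older ones (older Mongoose exposed `poolSize`, default 5; newer drivers use `maxPoolSize`), and `deviceId` is a hypothetical field name; the comments show where each piece would be applied in a real app:

```javascript
// 1. Connection pool size; applied as: mongoose.connect(uri, connectOptions)
const connectOptions = { maxPoolSize: 10 };

// 2. A unique index on the field identifying the device, so both the upsert
//    and the 100 ms read hit the index instead of scanning the collection.
//    Applied as: schema.index(indexSpec, indexOptions)
const indexSpec = { deviceId: 1 };
const indexOptions = { unique: true };

console.log(JSON.stringify({ connectOptions, indexSpec, indexOptions }));
```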
