javascript - What happens when Postgresql Connection Pool is exhausted? - Stack Overflow


I'm looking at using pooled connections from NodeJS to Postgresql. I'll be using the Pool class in the pg library, along with async / await.

I've read that Postgresql by default has a limit of 100 concurrent connections and the Pool has a default of 10 pooled connections.

My app will scale up new instances as it comes under heavy load, so I could theoretically end up with more than 10 instances, which would then exceed the 100 Postgresql max connections.

What I'd like to know is, what will happen when I execute await pool.query(...) under the following circumstances.

  1. All 10 pooled connections are currently in use - will it await one to become available or throw an exception?
  2. All 100 connections to the DB server are in use and NodeJS tries to create a new connection from a pool.

Also, how can I write some NodeJS code that will attempt to make 101 pooled connections in order to prove this behaviour?


Asked Jan 18, 2022 at 23:29 by Peter Morris; edited Jan 19, 2022 at 0:07 by Bergi. 1 comment:
  • for (let i=0; i<11; i++) { const p = new Pool(); for (let j=0; j<11; j++) { p.query('SELECT true') } }? – Bergi Commented Jan 19, 2022 at 0:02

1 Answer


When all connections in a pool are in use, a new requestor will simply block until another client releases a connection, unless connectionTimeoutMillis is set, in which case it will get a synthetic error after the specified timeout. This is documented.

When all of PostgreSQL's max_connections are exhausted (by multiple pools, for example), then attempts to get a connection will fail, and the error will propagate back to the requestor. This does not seem to be documented, and one could imagine the Pool being more clever, for example by intercepting the error and making the client wait as if max for that pool had been reached (but in that case, what if it were the first connection that pool tried to make? What would it be waiting for?), or by periodically retrying until a connection becomes available.

So you would be well advised not to let this happen: limit how far the app server can scale, increase max_connections in Postgres, or lower max in each pool.

Which of these makes sense depends on the circumstances. There is no point scaling the app server at all if the bottleneck is entirely in the database.
