When using Promise.all with asynchronous code (with synchronous code there is nothing to worry about), you can suffer from severe performance (if not other kinds of) issues when you want to send out a whole bunch of requests (be it tens, hundreds, thousands or even millions), given that the receiving end of your asynchronous operations (e.g. the local filesystem, an HTTP server, a database, etc.) does not gracefully handle that many parallel requests.
For that case, it would be perfect if we could tell Promise.all up to how many promises we want to have in-flight simultaneously. However, since Promises/A+ is supposed to be lean, adding these sorts of fancy features would certainly not make sense.
So what would be a better way of achieving this?
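For illustration, a minimal sketch of the naive pattern in question (fetchItem is a hypothetical promise-returning request function, not part of any particular library):

var ids = [/* thousands of ids */];
// This starts *every* request at once; the receiving end gets no breathing room.
Promise.all(ids.map(function(id) {
    return fetchItem(id);
})).then(function(results) {
    console.log('All', results.length, 'requests finished');
});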
3 Answers
Well, first of all, it's impossible to give a concurrency argument to Promise.all, since promises represent already started operations; you cannot queue them or make them wait before executing.
What you want to run with limited concurrency is promise-returning functions. Luckily for you, Bluebird ships with this feature (as of version 2.x) via Promise.map:
Promise.map(largeArray, promiseReturningFunction, {concurrency: 16});
The concurrency parameter decides how many operations may happen at once. Note that this is not a global value; it applies only to this chain. For example:
Promise.map([1,2,3,4,5,6,7,8,9,10], function(i){
    console.log("Shooting operation", i);
    return Promise.delay(1000);
}, {concurrency: 2});
Fiddle
Note that execution order is not guaranteed.
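For completeness, consuming the mapped results might look something like this (yourAsyncOperation is a placeholder for your own promise-returning function):

Promise.map(largeArray, yourAsyncOperation, {concurrency: 16})
    .then(function(results) {
        // results is an array of the resolved values
        console.log('Processed', results.length, 'items');
    });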
Since I was not able to find a pre-existing library to take care of promise batching, I wrote a simple primitive myself. It is a function that wraps an array of functions to be executed and partitions it into batches of a given size. It waits for each batch to finish before running the next (note that it relies on Bluebird's .return helper to hand back the collected results). It is a rather naive implementation; a full-blown throttling mechanism would probably be desirable in some networking scenarios.
Fiddle.
Code:
/**
 * Executes any number of functions, one batch at a time.
 *
 * @see http://jsfiddle/93z8L6sw/2
 */
var Promise_allBatched = (function() {
    var Promise_allBatched = function(arr, batchSize) {
        if (arr.length == 0) return Promise.resolve([]);

        batchSize = batchSize || 10;
        var results = [];
        return _runBatch(arr, batchSize, results, 0)
            .return(results);               // return all results
    };

    function _runBatch(arr, batchSize, results, iFrom) {
        // run next batch
        var requests = [];
        var iTo = Math.min(arr.length, iFrom + batchSize);
        for (var i = iFrom; i < iTo; ++i) {
            var fn = arr[i];
            var request;
            if (fn instanceof Function) {
                request = fn();
            }
            else {
                request = fn;
            }
            requests.push(request);         // start promise
        }

        return Promise.all(requests)        // run batch
            .then(function(batchResults) {
                results.push.apply(results, batchResults);  // store all results in one array
                console.log('Finished batch: ' + results.length + '/' + arr.length);
            })
            .then(function() {
                if (iTo < arr.length) {
                    // keep recursing
                    return _runBatch(arr, batchSize, results, iTo);
                }
            });
    }

    return Promise_allBatched;
})();
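Usage would look something like the sketch below (loadUser is a hypothetical promise-returning function, not part of the implementation above):

var tasks = [];
for (var i = 0; i < 100; ++i) {
    (function(id) {
        tasks.push(function() {
            return loadUser(id);    // hypothetical promise-returning call
        });
    })(i);
}

Promise_allBatched(tasks, 10)       // at most 10 in-flight operations per batch
    .then(function(allResults) {
        console.log('Finished all', allResults.length, 'tasks');
    });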
// a large array of inputs
let arry = [1, 2, 4 /* , ...etc */];

// await each batch before starting the next, so at most 10 requests are in flight at a time
(async () => {
    while (arry.length) {
        // splice off the next 10 elements per batch; adjust the limit to whatever your server can handle
        await Promise.all(arry.splice(0, 10).map(async (eachArryElement) => {
            let res = await yourAsyncMethod(eachArryElement);
        }));
    }
})();

function yourAsyncMethod(data) {
    return new Promise((resolve, reject) => {
        // your logic
        resolve('your output');
    });
}