
javascript - Limited parallelism with async/await in Typescript/ES7 - Stack Overflow


I've been experimenting a bit with Typescript, but I'm now a bit stuck on how to use async/await effectively.

I'm inserting a bunch of records into a database, and I need to get the list of IDs that are returned by each insert. The following simplified example works in general, but it is not quite as elegant as I'd like, and it is completely sequential.

async function generatePersons() {
    const names = generateNames(firstNames, lastNames);
    let ids = [];
    for (let name of names) {
        // each insert awaits the previous one, so this runs strictly sequentially
        const id = await db("persons").insert({
            first_name: name.firstName,
            last_name: name.lastName,
        }).returning('id');
        // insert() resolves to an array of ids; keep the first element
        ids.push(id[0]);
    }
    return ids;
}

I tried to use map to avoid creating the ids list manually, but I couldn't get this to work.
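
For reference, a map-based attempt looks roughly like this sketch (same db and names as above); the catch is that map with an async callback produces an array of Promises rather than the ids themselves:

const idPromises = names.map(async (name) => {
    const rows = await db("persons").insert({
        first_name: name.firstName,
        last_name: name.lastName,
    }).returning('id');
    return rows[0];
});
// this yields Promise<number>[] instead of number[]; the promises
// still need to be awaited somehow, e.g. via Promise.all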

What I'd also like is a limited amount of parallelism: my asynchronous calls should happen in parallel up to a certain limit, e.g. I'd only ever want to have 10 open requests at a time, but no more.

Is there a reasonably elegant way of achieving this kind of limited parallelism with async/await in Typescript or Javascript ES7? Or am I trying to get this feature to do something it was not intended for?

PS: I know there are bulk insert methods for databases; this example is a bit artificial, as I could use those to work around this specific problem. But it made me wonder about the general case where I don't have predefined bulk methods available, e.g. with network requests.

asked Aug 28, 2016 at 20:20 by Mad Scientist
  • "Parallelism" is when 2 execution context run simultaneously (possibly on 2 different putation units). You cannot get that in JS. – zerkms Commented Aug 29, 2016 at 0:53
  • possible duplicate of Slowdown due to non-parallel awaiting of promises? – Bergi Commented Aug 29, 2016 at 1:01
  • Have a look at Using async/await with a forEach loop – Bergi Commented Aug 29, 2016 at 1:02
  • Regarding limiting concurrency, that should be a separate question. Have a look at this though (hint: there's no elegant way in native promises) – Bergi Commented Aug 29, 2016 at 1:04
  • Are any of the answers below acceptable solutions? – Daniel Rosenwasser Commented Nov 25, 2016 at 14:36

4 Answers


Psst, there's a package that does this for you on npm called p-map
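
For illustration, a minimal sketch of the question's inserts using p-map (assuming the db and generateNames helpers from the question; pMap takes an iterable, an async mapper, and a concurrency option, and preserves input order):

import pMap from 'p-map';

async function generatePersons(): Promise<number[]> {
    const names = generateNames(firstNames, lastNames);
    // run at most 10 inserts at any one time
    return pMap(names, async (name) => {
        const rows = await db("persons").insert({
            first_name: name.firstName,
            last_name: name.lastName,
        }).returning('id');
        return rows[0];
    }, { concurrency: 10 });
}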


Promise.all will allow you to wait for all requests to finish, without blocking their creation.

However, it does sound like you want to block sometimes. Specifically, it sounded like you wanted to throttle the number of requests in flight at any given time. Here's something I whipped up (but haven't fully tested!)

async function asyncThrottledMap<T, U>(maxCount: number, array: T[], f: (x: T) => Promise<U>) {
    let inFlight = new Set<Promise<U>>();
    const result: Promise<U>[] = [];

    // Sequentially add a Promise for each operation.
    for (let elem of array) {

        // Wait for any one of the promises to complete if there are too many running.
        if (inFlight.size >= maxCount) {
            await Promise.race(inFlight);
        }

        // This is the Promise for the operation itself.
        const origPromise = f(elem);
        // This is a Promise that adds/removes from the set of in-flight promises.
        const handledPromise = wrap(origPromise);
        result.push(handledPromise);
    }

    return Promise.all(result);

    async function wrap(p: Promise<U>) {
        inFlight.add(p);
        try {
            return await p;
        } finally {
            // Remove the operation whether it resolved or rejected,
            // so a failure doesn't permanently occupy a slot.
            inFlight.delete(p);
        }
    }
}

Above, inFlight is a set of operations that are currently taking place.

The result is an array of wrapped Promises. Each wrapped promise adds its operation to the set of inFlight operations and removes it on completion. If there are too many in-flight operations, the loop uses Promise.race to wait for any one of them to complete.
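
For instance, applied to the question's inserts it might be used like this (a sketch reusing db and generateNames from the question):

const names = generateNames(firstNames, lastNames);
// at most 10 inserts in flight at any given time
const ids = await asyncThrottledMap(10, names, async (name) => {
    const rows = await db("persons").insert({
        first_name: name.firstName,
        last_name: name.lastName,
    }).returning('id');
    return rows[0];
});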

Hopefully that helps.

Check out the async-parallel library, which offers various helper functions that make it easy to perform parallel operations. Using this library, your code could look something like this...

async function generatePersons(): Promise<number[]> {
    const names = generateNames(firstNames, lastNames);
    return await Parallel.map(names, async (name) => {
        // insert() resolves to an array of ids; return the first, as in the original loop
        const id = await db("persons").insert({
            first_name: name.firstName,
            last_name: name.lastName,
        }).returning('id');
        return id[0];
    });
}

If you want to limit the number of concurrent operations to, say, four at a time, you can simply do the following...

Parallel.concurrency = 4;

I wrote the @watchable/nevermore package because I needed concurrency-control like this.

It turns your async function into a function that schedules its execution within limits (and optionally retries on failure).

For the simplest case you could just configure concurrency...

import { createExecutorStrategy } from "@watchable/nevermore";

const { createExecutor } = createExecutorStrategy({
    concurrency: 10
});

async function myFn(myArgs) {
    // do something async here
}

// does everything myFn does, but wrapped in scheduling
const myExecutor = createExecutor(myFn);

If your implementation uses Promise.all around calls to myExecutor, your requests will still be run in order and within the concurrency limits, but of course you'll have to wait for them all to resolve before getting the results, because of the behaviour of Promise.all.
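
For example, a sketch of that pattern (myArgsList is a hypothetical array of inputs for the hypothetical myFn above):

// each call is queued by the executor, so at most 10 run concurrently
const results = await Promise.all(
    myArgsList.map((args) => myExecutor(args))
);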

The package also has support for rate limits, retry, backoff and timeout, so you could go down the route of respecting upstream load (no more than 20 per second) and adding some robustness against failures, like...

// add rate limits, retries and exponential backoff
const { createExecutor } = createExecutorStrategy({
    concurrency: 10,
    intervalMs: 50,
    backoffMs: 100,
    retries: 3,
});

There is API documentation at https://watchable.dev/api/modules/_watchable_nevermore.html

Is there a reasonably elegant way of achieving this kind of limited parallelism with async/await in Typescript or Javascript ES7?

You will have to use Promise.all, i.e. collect all the promises in an array and await Promise.all([all, the, stuff]).

More: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/all
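
Applied to the question, that approach might look like the sketch below; note that on its own it starts every insert at once, so it does not enforce the limit of 10 concurrent requests:

async function generatePersons(): Promise<number[]> {
    const names = generateNames(firstNames, lastNames);
    // kick off all inserts at once, then wait for every result
    const rows = await Promise.all(names.map((name) =>
        db("persons").insert({
            first_name: name.firstName,
            last_name: name.lastName,
        }).returning('id')));
    // each insert resolves to an array of ids; keep the first of each
    return rows.map((row) => row[0]);
}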
