
javascript - wait for one fetch to finish before starting the next - Stack Overflow


I have a list of data that I am sending to google cloud. My current code looks like this:

const teams = ['LFC', 'MUFC', 'CFC'];

teams.forEach(team => {
    fetch(URL, {
      method: 'PUT',
      body: team
    });
})

This works with one team, but it times out when sending multiple, larger files (I am sending images, not strings). To solve this I need to upload the files one by one, waiting for the previous request to complete before starting the next. Can anyone advise the best way of doing this?

Worth noting that I don't have any control over the number of files that are uploaded.


Asked Oct 21, 2019 by peter flanagan; edited Mar 17, 2022 by VLAZ.
  • If you have access to npm, you could install bluebird and use Promise.reduce; it executes an array of promises one by one as they resolve and lets you "reduce" the previous results into one final result. – gabriel.hayes, Oct 21, 2019
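For readers without Bluebird, the behavior the comment describes can be sketched with plain promises (`promiseReduce` here is a hypothetical helper illustrating the idea, not Bluebird's actual implementation):

```javascript
// A minimal plain-Promise sketch of the idea behind Bluebird's
// Promise.reduce: run an async reducer over the items strictly one
// at a time, threading an accumulator through each step.
function promiseReduce(items, reducer, initial) {
  return items.reduce(
    (acc, item) => acc.then(result => reducer(result, item)),
    Promise.resolve(initial)
  );
}

// Example: count simulated uploads; they resolve in array order.
promiseReduce(['LFC', 'MUFC', 'CFC'], (count, team) => {
  return new Promise(res => setTimeout(res, 10)).then(() => count + 1);
}, 0).then(total => console.log(`${total} uploads finished`)); // → 3 uploads finished
```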

3 Answers

Use reduce instead of forEach, with .then().

The following stores the promise of the last fetch in acc (the accumulator parameter of reduce), and chains each new fetch inside a .then() callback, ensuring the previous fetch has finished:

// Simulated fetch for the demo (resolves after 1 s and logs the body):
const fetch = (url, { body }) => new Promise(rs => setTimeout(() => { console.log(body); rs(); }, 1000));

const teams = ['LFC', 'MUFC', 'CFC'];

teams.reduce((acc, team) => {
  return acc.then(() => {
    return fetch(URL, {
      method: 'PUT',
      body: team
    });
  });
}, Promise.resolve())
  .then(() => console.log("Everything's finished"))
  .catch(err => console.error("Something failed:", err));

You can even write a general helper function for it:

// Simulated fetch for the demo (resolves after 1 s and logs the body):
const fetch = (url, { body }) => new Promise(rs => setTimeout(() => { console.log(body); rs(); }, 1000));

const teams = ['LFC', 'MUFC', 'CFC'];

const promiseSeries = (arr, cb) =>
  arr.reduce((acc, elem) => acc.then(() => cb(elem)), Promise.resolve());

promiseSeries(teams, team => {
  return fetch(URL, {
    method: 'PUT',
    body: team
  });
})
  .then(() => console.log("Everything's finished"))
  .catch(err => console.error("Something failed:", err));

Or, even better, if you can use ES2017, use async/await (it's more readable):

// Simulated fetch for the demo (resolves after 1 s and logs the body):
const fetch = (url, { body }) => new Promise(rs => setTimeout(() => { console.log(body); rs(); }, 1000));

const teams = ['LFC', 'MUFC', 'CFC'];

async function upload(teams) {
  for (const team of teams) {
    await fetch(URL, {
      method: 'PUT',
      body: team
    });
  }
}

upload(teams)
  .then(() => console.log("Everything's finished"))
  .catch(err => console.error("Something failed:", err));

You can use async/await with a for...of loop. Each call "holds" the loop until it's done, and then the loop continues with the next call:

const teams = ['LFC', 'MUFC', 'CFC'];

async function send(teams) {
  for (const team of teams) {
    await fetch(URL, {
      method: 'PUT',
      body: team
    });
  }
}
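One caveat when waiting on fetch: its promise resolves even for 4xx/5xx responses (it rejects only on network errors). If a failed upload should stop the remaining ones, res.ok has to be checked explicitly. A sketch under that assumption (the url and doFetch parameters are placeholders so the logic can run without a real server):

```javascript
// Sequential upload that treats HTTP error statuses as failures.
// `doFetch` is injected so the logic can be exercised with a stub.
async function uploadAll(teams, url, doFetch) {
  for (const team of teams) {
    const res = await doFetch(url, { method: 'PUT', body: team });
    if (!res.ok) {
      throw new Error(`upload of ${team} failed with status ${res.status}`);
    }
  }
}

// Stubbed usage: the second upload fails, so the third is never started.
const attempted = [];
uploadAll(['LFC', 'MUFC', 'CFC'], 'https://example.invalid/upload', (url, opts) => {
  attempted.push(opts.body);
  return Promise.resolve(opts.body === 'MUFC' ? { ok: false, status: 500 } : { ok: true });
}).catch(err => console.log(err.message, '| attempted:', attempted.join(',')));
```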

You can make use of async/await, as follows:

const teams = ['LFC', 'MUFC', 'CFC'];

teams.forEach(async (team) => {
    await fetch(URL, {
      method: 'PUT',
      body: team
    });
})

Note: the await here only pauses each individual callback, not the loop. forEach fires all the callbacks straight away, so the requests still run concurrently; this does not serialize the uploads the way the question asks.
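The concurrency of the forEach approach can be seen directly (a sketch; the timeout stands in for the network call):

```javascript
// forEach invokes the async callback for every item right away; the
// `await` pauses only that callback, not the loop itself.
const started = [];
['LFC', 'MUFC', 'CFC'].forEach(async team => {
  started.push(team);                            // runs synchronously for every item
  await new Promise(res => setTimeout(res, 10)); // pauses only this callback
});
// By the time forEach returns, every request has already been started:
console.log(started.join(',')); // → LFC,MUFC,CFC
```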