Execute many promises sequentially (Concept)

Posted 2019-08-25 03:02

Question:

(My goal is to clarify my understanding of the problem, not the code.)

I want to execute an array of promises sequentially, but Node.js throws a strange error about too many promises executing in parallel. (I say this because if I limit that array to 20 promises it works, 50 promises works too, but 9000 promises and it blows up.)

  • I know that there are solutions like array.reduce(), loops, etc. (see the sketch after this list)
  • I know about promise states (my array initially contains pending promises)
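
For reference, this is the kind of sequential pattern I mean by array.reduce() (a minimal sketch; runSequentially is just a name I made up, and the tasks are assumed to be functions that return promises, not promises that already exist):

const runSequentially = tasks =>
  tasks.reduce(
    (chain, task) => chain.then(results =>
      task().then(result => [...results, result])
    ),
    Promise.resolve([])
  )

// hypothetical usage: each download starts only after the previous one resolved
// runSequentially([() => axios.get(url1), () => axios.get(url2)])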

My question: I can execute 20 promises, then another 20, and so on, but... if I'm executing my promises sequentially, shouldn't Node.js be able to handle 9k promises without a problem? Do I have the wrong concept? Is my code wrong?

(I'm doubting this because Node.js waits some time before it begins to resolve the promises.)

My case: I'm trying to download 9k+ images (with axios), then save each one, and then wait 5 seconds, all sequentially: [download 1 image, save that image, wait 5 seconds, then download the next image, save it, wait, etc.]. Is that possible?
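
A minimal sketch of what I'm describing, assuming axios and Node's built-in fs/path (downloadAll and the buffer-based saving are just my placeholders, not real code from my project):

const fs = require('fs')
const path = require('path')
const axios = require('axios')

const sleep = ms => new Promise(resolve => setTimeout(resolve, ms))

// strictly one image at a time: download → save → wait 5 s → next
// (the images/ directory is assumed to exist)
async function downloadAll(urls) {
  for (const [index, url] of urls.entries()) {
    const res = await axios.get(url, { responseType: 'arraybuffer' })
    await fs.promises.writeFile(
      path.resolve(__dirname, 'images', index + '.jpg'),
      res.data
    )
    await sleep(5000)
  }
}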

Answer 1:

I would use something like a worker pool instead of executing things in batches of 20 at a time. With batches you always end up waiting for the last one in the batch to finish before you can start the next 20; instead, you should set a limit on how many concurrent downloads you want, so that you never have more than 20 promises in flight and not one long chain of 9000.

The same thing can be accomplished with iterators as well (the same iterator can be passed to different workers, and while one worker takes the current item, the next worker will always get the next one).
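
As a tiny illustration of that idea (worker and the fake URLs below are just placeholders), two consumers pulling from the same iterator never receive the same item:

const urls = ['a.jpg', 'b.jpg', 'c.jpg']
const shared = urls.entries() // one iterator, shared by every worker

async function worker(name, iterator) {
  // each next() call hands out the next unclaimed [index, url] pair
  for (const [index, url] of iterator) {
    console.log(`${name} took #${index} (${url})`)
    await new Promise(r => setTimeout(r, 100)) // pretend to do some async work
  }
}

Promise.all([worker('A', shared), worker('B', shared)])
  .then(() => console.log('all items consumed'))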

So, with zero extra dependencies (besides axios itself), I would do something like this:

const fs = require('fs')
const path = require('path')
const axios = require('axios')

const sleep = n => new Promise(rs => setTimeout(rs, n))

async function sequentialDownload(iterator) {
  for (let [index, url] of iterator) {
    // figure out where to save the file
    const filePath = path.resolve(__dirname, 'images', index + '.jpg')
    // download the image as a stream
    const res = await axios.get(url, { responseType: 'stream' })

    // pipe the stream to disc
    const writer = fs.createWriteStream(filePath)
    res.data.pipe(writer)

    // wait for the download to complete
    await new Promise(resolve => writer.on('finish', resolve))
    // wait an extra 5 sec
    await sleep(5000)
  }
}

const arr = [url1, url2, url3] // to be downloaded
const workers = new Array(20) // create 20 "workers"
  .fill(arr.entries()) // fill it with same iterator
  .map(sequentialDownload) // start working

Promise.all(workers).then(() => {
  console.log('done downloading everything')
})