I have an array of arrays of promises; each inner array may hold 4k, 2k, or 500 promises. In total there are around 60k promises, and I may test it with other values as well.
Now I need to execute Promise.all(BigArray[0]) first. Once the first inner array is done, I need to execute the next Promise.all(BigArray[1]), and so on.
If I try to execute Promise.all(BigArray), it throws:

fatal error call_and_retry_2 allocation failed - process out of memory
I need to execute each array of promises in series, not in parallel, which I think is what Node is doing. I'd rather not add new libraries, but I'm willing to consider any answer!
Edit:
Here is an example piece of code:
function getInfoForEveryInnerArgument(InnerArray) {
    const CPTPromises = _.map(InnerArray, (argument) => getDBInfo(argument));
    return Promise.all(CPTPromises)
        .then((results) => doSomethingWithResults(results));
}
function mainFunction() {
    const BigArray = [[argument1, argument2, argument3, argument4], [argument5, argument6, argument7, argument8], ....];
    // the sum of all arguments is over 60k...
    const promiseArrayCombination = _.map(BigArray, (InnerArray, key) => getInfoForEveryInnerArgument(InnerArray));
    return Promise.all(promiseArrayCombination).then((fullResults) => {
        console.log(fullResults);
        return fullResults;
    });
}
Promise.all will not work here; you could use Array.prototype.reduce to process the BigArray elements one by one:
BigArray.reduce((promiseChain, currentArray) => {
    return promiseChain.then(chainResults =>
        // Wait for the current batch, then append its results to the accumulator
        Promise.all(currentArray).then(currentResult =>
            [...chainResults, currentResult]
        )
    );
}, Promise.resolve([])).then(arrayOfArraysOfResults => {
    // Do something with all results
});
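Each step of the chain waits for the previous batch's Promise.all to settle before moving on to the next inner array, and the accumulator builds up one result array per batch, so arrayOfArraysOfResults keeps the same order as BigArray.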
Promise.all() runs the promises you pass it in parallel; it rejects upon the first rejection, or resolves once every promise has resolved.
From the MDN:
Promise.all passes an array of values from all the promises in the iterable object that it was passed. The array of values maintains the order of the original iterable object, not the order that the promises were resolved in. If something passed in the iterable array is not a promise, it's converted to one by Promise.resolve.
If any of the passed in promises rejects, the all Promise immediately rejects with the value of the promise that rejected, discarding all the other promises whether or not they have resolved. If an empty array is passed, then this method resolves immediately.
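To make that behavior concrete, here is a minimal, self-contained sketch (the names, values, and delays are made up for illustration):

const slow = new Promise((resolve) => setTimeout(() => resolve('slow'), 200));
const fast = new Promise((resolve) => setTimeout(() => resolve('fast'), 50));
const failing = new Promise((resolve, reject) => setTimeout(() => reject(new Error('boom')), 100));

// Resolves after 200ms with ['slow', 'fast']: order follows the input array, not completion order
Promise.all([slow, fast]).then((results) => console.log(results));

// Rejects after 100ms with 'boom', even though `slow` is still pending
Promise.all([slow, failing]).catch((err) => console.log(err.message));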
If you need to execute all of your promises in series, then the Promise.all() method will not work for your application. Instead, you need an iterative approach to resolving your promises; one such pattern is sketched below. This can be awkward: Node.js is asynchronous by nature, and a plain loop (to my knowledge and experience) will not block until a promise inside it has settled.
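One iterative pattern that needs no library is to chain the batches recursively. runInSeries below is a hypothetical helper, sketched under the assumption that each inner array can be handed to Promise.all:

function runInSeries(bigArray, index = 0, results = []) {
    // Base case: every inner array has been processed
    if (index >= bigArray.length) {
        return Promise.resolve(results);
    }
    // Wait for the current batch to settle, then recurse into the next one
    return Promise.all(bigArray[index]).then((batchResult) => {
        results.push(batchResult);
        return runInSeries(bigArray, index + 1, results);
    });
}

runInSeries(BigArray).then((allResults) => console.log(allResults));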
Edit:
A library exists called promise-series-node, which I think may help you out quite a bit here. Since you already have the promises created, you could just pass it your BigArray:
promiseSeries(BigArray).then((results) => {
    console.log(results);
});
In my personal opinion, starting out with 60k+ promises will not only take a substantial amount of time, but also a substantial amount of resources on the system executing them (which is why you are running out of memory). You may want to consider a better architecture for the application.
Edit 2: What is a promise?
A promise represents the result of an asynchronous operation, which can be in one of three states:
- Pending: the initial state of the promise
- Fulfilled: the state of a promise whose operation succeeded
- Rejected: the state of a promise whose operation failed
Promises are immutable once they reach the fulfilled or rejected state. You can chain promises (great for avoiding repeated callbacks), as well as nest them (when closure is a concern). There are many great articles on the web about this; here is one I found informative.
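As a quick illustration of those states and of chaining (the value and delay here are arbitrary):

// Starts out pending, then settles exactly once
const p = new Promise((resolve, reject) => {
    setTimeout(() => resolve(42), 100); // fulfilled with 42
});

p.then((value) => value + 1)             // each .then returns a new promise, enabling chains
    .then((value) => console.log(value)) // logs 43
    .catch((err) => console.error(err)); // would run if any step had rejected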
Pretty simple to accomplish with async/await in ES2017:

(async () => {
    for (let i = 0; i < BigArray.length; i++) {
        await Promise.all(BigArray[i]);
    }
})();
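If you also need the results of each batch, a small variation of the same loop (a sketch; it simply collects each batch's result array) would be:

(async () => {
    const allResults = [];
    for (const innerArray of BigArray) {
        // Each batch finishes before the next one is awaited
        allResults.push(await Promise.all(innerArray));
    }
    console.log(allResults);
})();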
The promise library bluebird offers a helper method called Promise.map, which takes an array (or a promise of an array) as its first argument and maps all of its elements to a result array, which in turn also gets promisified. Maybe you could try something like this:
const Promise = require("bluebird");

Promise.map(BigArray, function (innerArray) {
    return Promise.all(innerArray);
})
.then(function (finishedArray) {
    // code after all 60k promises have been resolved comes here
    console.log("done");
});
But as already stated, this is still a very resource-intensive task that may consume all available memory.
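If memory remains a concern, note that bluebird's Promise.map also accepts a concurrency option that caps how many mapper calls run at once; a limit of 1 (chosen here just as an example) processes one inner array at a time:

const Promise = require("bluebird");

Promise.map(BigArray, function (innerArray) {
    return Promise.all(innerArray);
}, { concurrency: 1 }) // at most one inner array in flight
.then(function (finishedArray) {
    console.log("done");
});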
There is a good answer here: Callback after all asynchronous forEach callbacks are completed.
function asyncFunction(item, cb) {
    setTimeout(() => {
        console.log('done with', item);
        cb(item * 10);
    }, 1000 * item);
}

// Wrap each callback-style call in a promise that resolves via the callback
const requests = [1, 2, 3].map((item) => {
    return new Promise((resolve) => {
        asyncFunction(item, resolve);
    });
});

Promise.all(requests).then((arr) => {
    console.log(arr); // [10, 20, 30]
    console.log('done');
});