I am new to JavaScript and I really got confused by the documentation about promises.
I have the following case: I have a bunch of users, and for each user I execute an async function that does some calculations on that user and adds the user together with the result to an array. From what I understood from the documentation, I need to collect a promise for each invocation of the async function into a list of promises, and resolve once all of them are done, like this:
someFunction = () => {
  const promises = [];
  const usersArray = [];
  users.forEach(user => {
    const promise = asyncFunction(user).then(callBackValue => {
      // Run some checks and add the user to an array with the result
      if (checksAreGood) {
        usersArray.push({ user: user, result: callBackValue });
      }
    });
    promises.push(promise);
  });
  return Promise.all(promises).then(() => usersArray);
};
Question is: the number of users I am traversing is unknown, and I want to limit the number of users added to the array to 20 if and only if there are more than 20 users; otherwise add them all. In other words, how do I resolve the promise once the array holds 20 users or fewer?
The purpose of this is to avoid executing the async function for the entire set of users, to optimize performance. That means, if I have 1000 users, I want the async function to be executed only until the array is filled with 20 results.
The first solution, which only searches until 20 users were found, would be to traverse one user after another:
async function someFunction() {
  const results = [];
  for (const user of users) {
    const result = await asyncFunction(user);
    // Run some checks and add the user to an array with the result
    if (!someChecksGood) continue;
    results.push(result);
    if (results.length >= 20) break;
  }
  return results;
}
While this works "perfectly", it's quite slow, as it only processes one request at a time. The opposite solution would be to start all requests at once and simply ignore their results once the array is full (promises cannot actually be cancelled):
async function someFunction() {
  const results = [];
  async function process(user) {
    const result = await asyncFunction(user);
    if (!someChecksGood || results.length >= 20) return;
    results.push(result);
  }
  await Promise.all(users.map(process));
  return results;
}
But now there is a high number of unnecessary requests, whose results are discarded afterwards. To improve this, one can combine both approaches above by "chunking" the requests. This should not increase request time that much, since databases can only process a certain number of requests at a time anyway, but it lets us stop as soon as the array is full; only the remainder of the current chunk is processed unnecessarily. On average it should therefore perform better than both solutions above:
async function someFunction() {
  // Chunk the users
  const chunks = [], size = 5;
  for (let i = 0; i < users.length; i += size)
    chunks.push(users.slice(i, i + size));

  // The method to create the results:
  const results = [];
  async function process(user) {
    const result = await asyncFunction(user);
    if (!someChecksGood || results.length >= 20) return;
    results.push(result);
  }

  // Iterate over the chunks:
  for (const chunk of chunks) {
    await Promise.all(chunk.map(process));
    if (results.length >= 20) break;
  }
  return results;
}
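For reference, here is a self-contained sketch of the chunked version that you can run as-is. The mocked asyncFunction (a 10 ms timeout), the user shape, and the always-true check are assumptions for the demo only; swap in your real function and checks:

```javascript
// 100 dummy users (placeholder shape for the demo)
const users = Array.from({ length: 100 }, (_, i) => ({ id: i }));

// Mock async function: resolves after a short delay with a result for the user
const asyncFunction = user =>
  new Promise(resolve => setTimeout(() => resolve({ user }), 10));

// Mock check: accept every user
const someChecksGood = true;

async function someFunction() {
  // Chunk the users
  const chunks = [], size = 5;
  for (let i = 0; i < users.length; i += size)
    chunks.push(users.slice(i, i + size));

  const results = [];
  async function process(user) {
    const result = await asyncFunction(user);
    if (!someChecksGood || results.length >= 20) return;
    results.push(result);
  }

  // Process one chunk at a time, stopping once 20 results are collected
  for (const chunk of chunks) {
    await Promise.all(chunk.map(process));
    if (results.length >= 20) break;
  }
  return results;
}

someFunction().then(results => console.log(results.length)); // logs 20
```

With 100 users and a chunk size of 5, only four chunks (20 requests) are actually awaited; the remaining 80 requests are never started.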