I'm in a situation where I need to execute async functions in "parallel" and continue program execution with the best result, so I wrote something like this:
var p = [];
for (var i = 0; i < 10; ++i) (function (index) {
    p.push(new Promise(function (resolve, reject) {
        setTimeout(function () {
            var success = Math.random() > 0.7;
            console.log("Resolving", index, "as", success ? "success" : "failure");
            success && resolve(index);
        }, Math.random() * 5000 + 200);
    }));
})(i);

Promise.race(p).then(function (res) {
    console.log("FOUND", res);
}).catch(function (err) {
    console.log("ERROR", err);
});
Now, I'm wondering: is this good practice when working with promises? Does not resolving or rejecting them, more often than not, create memory leaks? Are they all eventually GC'ed every time?
The only reason this will leak is because `p` is a global. Set `p = null;` at the end, or avoid using a global variable: `Promise.race` lets go of `p` as soon as one entry in `p` succeeds or something fails, whichever is sooner, and `setTimeout` will have let go of everything after 5.2 seconds. JavaScript will then happily garbage collect the promises whether they've been resolved, rejected or neither. No harm.

The only thing you want to avoid is garbage collecting rejected promises, as that is likely to trigger a browser warning, because it is indicative of a web programming bug.
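For example, here is a minimal sketch of your own code with the array kept in function scope instead of a global (the `raceRandomTimers` name is just illustrative):

// Illustrative helper: same logic as above, but p lives in function scope.
function raceRandomTimers() {
    var p = [];
    for (var i = 0; i < 10; ++i) (function (index) {
        p.push(new Promise(function (resolve) {
            setTimeout(function () {
                var success = Math.random() > 0.7;
                console.log("Resolving", index, "as", success ? "success" : "failure");
                success && resolve(index);
            }, Math.random() * 5000 + 200);
        }));
    })(i);
    return Promise.race(p); // p is unreachable from outside once this returns
}

raceRandomTimers().then(function (res) {
    console.log("FOUND", res);
}).catch(function (err) {
    console.log("ERROR", err);
});

Once `raceRandomTimers` returns, nothing outside references `p`, so after the race settles and the timers have fired, the array and its promises can all be collected.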
Of course, there's a 3% chance none of them resolve, in which case this will leak (until you close the tab).
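If you want to keep even that last case from dangling, one option is to reject the failures instead of leaving them pending. A sketch, assuming a runtime with `Promise.any` (ES2021+); the `attempts` name is illustrative:

var attempts = [];
for (var i = 0; i < 10; ++i) (function (index) {
    attempts.push(new Promise(function (resolve, reject) {
        setTimeout(function () {
            var success = Math.random() > 0.7;
            if (success) resolve(index);
            else reject(new Error("attempt " + index + " failed")); // settles either way
        }, Math.random() * 5000 + 200);
    }));
})(i);

Promise.any(attempts).then(function (res) {
    console.log("FOUND", res); // first attempt to succeed
}).catch(function (err) {
    console.log("ERROR", err); // AggregateError when every attempt rejected
});

Because every promise settles and `Promise.any` attaches a handler to each one, nothing stays pending and the rejections don't trigger unhandled-rejection warnings.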
Is this good design?
I think it depends on what the functions are doing. If they are lightweight, then I don't see a problem. If they are doing heavy computations or have side-effects, then they hopefully come with some API to cancel the operations, to help save on resources.
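For timer-based work like the example above, a lightweight cancel hook is easy to sketch. The task shape and `cancel` function below are illustrative, not part of any standard API:

// Illustrative: each task exposes its promise plus a cancel() that clears the timer.
function makeAttempt(index) {
    var timer;
    var promise = new Promise(function (resolve) {
        timer = setTimeout(function () {
            if (Math.random() > 0.7) resolve(index);
        }, Math.random() * 5000 + 200);
    });
    return {
        promise: promise,
        cancel: function () { clearTimeout(timer); } // stop work that can no longer win
    };
}

var tasks = [];
for (var i = 0; i < 10; ++i) tasks.push(makeAttempt(i));

Promise.race(tasks.map(function (t) { return t.promise; })).then(function (res) {
    tasks.forEach(function (t) { t.cancel(); }); // free the losers' resources
    console.log("FOUND", res);
});

For real async work (network requests, workers, heavy computation) the same idea applies; you would just swap `clearTimeout` for whatever cancellation mechanism the underlying API provides.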