This question already has an answer here: Wait until all jQuery Ajax requests are done? (22 answers)
I have a page that makes 100 jQuery $.ajax calls (async: true). The problem is that while they are all loading, I need the system to wait for ALL 100 calls to return before continuing. How would I go about this?
Thanks in advance! :)
Update:
These calls are made in a for() loop (there are 100 of them :))
The nice way to do this is with $.when. You can use it as follows:
$.when(
    $.ajax({/*settings*/}),
    $.ajax({/*settings*/}),
    $.ajax({/*settings*/}),
    $.ajax({/*settings*/})
).then(function() {
    // when all AJAX requests are complete
});
Alternatively, if you have all the AJAX calls in an array, you could use apply:
$.when.apply($, ajaxReqs);
Note that this requires at least jQuery 1.5.
To add the AJAX requests to an array, do something like this:
var ajaxReqs = [];
for (var i = 0; i < 100; i++) {
    ajaxReqs.push($.ajax({
        /* AJAX settings */
    }));
}
$.when.apply($, ajaxReqs).then(function() {
    // all requests are complete
});
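If you also need the individual responses inside the callback, $.when passes one argument per Deferred, so with the apply form you can read them from arguments; for $.ajax calls each argument is an array of [data, statusText, jqXHR]. A minimal sketch along those lines:
$.when.apply($, ajaxReqs).then(function() {
    // one argument per request, in the same order as ajaxReqs
    $.each(arguments, function(index, result) {
        var data = result[0]; // result is [data, statusText, jqXHR]
        console.log('response ' + index, data);
    });
});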
This kind of issue crops up everywhere in database SQL design, and although it's not precisely the same, the issue is exactly the thing that'll get you: network communications.
You need to watch out for it because it'll come back to bite you if you don't do it correctly, i.e. users will get severely annoyed at HAVING to wait. I can't emphasize this enough.
Here's the scenario:
You want to transfer many small snippets of information from your server upon request.
Each request depends on a number of factors operating efficiently, all of which are OUT OF YOUR CONTROL:
Wide area network response time (anywhere in the world, right?)
Local area network response time (anywhere in the building)
WebServer Load
WebServer Response time
Database response time
Backend Script run time
Javascript run time to process the result
The fact that browsers are generally limited to 6-8 parallel AJAX requests at once (I think; someone correct me on the exact number)
Multiply that by the number of requests (erm... in your case, x 100)
Get the picture?
It might work blissfully well in testing on a local machine. You might even be running your own db and webserver on the exact same machine... but try that in the wild and before long unreliability will become an issue.
Listen, the simplest thing to do is wrap up ALL your parameters into ONE JS array and send that in ONE POST request. Then on the server do all your database selects and roll the responses up into ONE JSON/XML response.
At that point you are only ever waiting for ONE AJAX response. You can find all your data in the JSON/XML result.
Given that you are working with 100 requests you would probably be able to actually measure the time saving with a stopwatch!
Take it from me - do as few network requests as possible.
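As a rough sketch of that idea (the batch.php endpoint and the shape of the parameters are just assumptions for illustration), the client side could look like this:
// collect all the parameters you would have sent in 100 separate requests
var params = [];
for (var i = 0; i < 100; i++) {
    params.push({ id: i }); // whatever each individual request needed
}

// send them in ONE POST; the server loops over them and
// returns ONE JSON array with one entry per item
$.ajax({
    url: 'batch.php',        // hypothetical batch endpoint
    type: 'POST',
    data: { items: JSON.stringify(params) },
    dataType: 'json',
    success: function(results) {
        // results[i] corresponds to params[i]
        $.each(results, function(index, result) {
            console.log(index, result);
        });
    }
});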
You can use the Deferred object API. Since $.ajax returns a Deferred object, you can try the following code to build the requests in a loop:
var ajaxes = [];
for (var i = 0; i < 100; i++) {
    ajaxes[i] = $.ajax({/*data*/});
}
$.when.apply($, ajaxes)
    .then(function() {
        console.log('I fire once all ajax requests have completed!');
    })
    .fail(function() {
        console.log('I fire if one or more requests failed.');
    });
P.S. There is a great article by Eric Hynds on using deferred objects: Using Deferreds in jQuery 1.5.
100 Ajax calls? That seems a very odd requirement, and what you're trying to achieve could quite possibly be done another way:
var i = 0;

function myCallback() {
    alert('Completed 100 times');
}

// note: this fires the requests one at a time; each new request
// starts only after the previous one has completed
function doAjax() {
    $.ajax({
        url: 'blah.php',
        data: 'hello=world',
        success: function(response) {
            i++;
        },
        complete: function() {
            if (i < 100) {
                doAjax();
            } else {
                myCallback();
            }
        }
    });
}

doAjax(); // start
I'm not a JS guy, but I would create a global variable to act as the counter, and define a timed function to periodically check whether this global variable has reached 100.
EDIT: setTimeout()
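A rough sketch of that polling idea (the names and the 100 ms interval are just illustrative): each callback increments a shared counter, and a setTimeout loop checks it until all 100 requests have returned:
var completed = 0; // shared counter, incremented by each AJAX callback

for (var i = 0; i < 100; i++) {
    $.ajax({
        url: 'blah.php',
        complete: function() {
            completed++;
        }
    });
}

// poll the counter until all 100 requests have finished
function checkDone() {
    if (completed >= 100) {
        alert('Completed 100 times');
    } else {
        setTimeout(checkDone, 100); // check again in 100 ms
    }
}
checkDone();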