I have an application which has to make 50 to 100 API calls in a loop. Rotten Tomatoes has a limit of 10 calls per second. As a result, my requests fail intermittently and I get different results every time. What is an effective way to make these 50 requests without exceeding the 10 req/sec limit? Here's my code:
    $.each(elem, function (index, item) {
        var $placeholder = $('<div>').appendTo("div.content");
        $.ajax({
            type: 'post',
            url: moviesSearchUrl + '&q=' + encodeURI(item) + '&page_limit=1',
            dataType: "jsonp",
            async: false, // note: jQuery ignores async:false for cross-domain JSONP requests
            success: searchCallback
        });
        function searchCallback(data) {
            var movies = data.movies;
            var markup = index + ': ' + movies[0].title + '<img class="bord" src="' + movies[0].posters.thumbnail + '" /><br/>';
            $placeholder.replaceWith(markup);
        }
    });
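One way to stay under the limit is to space the requests out so that at most 10 start in any one-second window. This is a minimal sketch (the function names `delayFor` and `throttledEach` are illustrative, not part of jQuery or any library):

```javascript
// Compute the delay (in ms) for the index-th request so that at most
// `perSecond` requests start in any given one-second window.
function delayFor(index, perSecond) {
    return Math.floor(index / perSecond) * 1000;
}

// Iterate over `items`, invoking `worker(item, index)` for each one,
// but staggered so the rate limit is respected.
function throttledEach(items, perSecond, worker) {
    items.forEach(function (item, index) {
        setTimeout(function () {
            worker(item, index);
        }, delayFor(index, perSecond));
    });
}
```

You could then replace the `$.each` loop with `throttledEach(elem, 10, function (item, index) { /* the $.ajax call */ });`, and drop `async: false` so the staggered requests run asynchronously.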
It depends on what you're trying to do. If the results go straight (inline) into a user-facing page you're rendering, and there are no bulk calls you can make instead of the individual ones, then there's little you can do: at 10 requests per second, 50 requests will take at least 5 seconds to render that page.
If you're reusing the same content often, and the service provider's terms allow it, it may be worth caching the results of the calls for a short period to avoid having to make all the calls again and again.
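The short-lived caching idea can be sketched as a tiny in-memory store with a time-to-live; the names (`makeCache`, `ttlMs`) and the optional `now` parameter (used here to make expiry testable) are my own, not from any API:

```javascript
// Build a simple in-memory cache whose entries expire after `ttlMs` ms.
function makeCache(ttlMs) {
    var store = {};
    return {
        // Return the cached value, or undefined if missing or expired.
        get: function (key, now) {
            now = (now === undefined) ? Date.now() : now;
            var entry = store[key];
            return (entry && now - entry.time < ttlMs) ? entry.value : undefined;
        },
        // Store a value along with the time it was cached.
        set: function (key, value, now) {
            store[key] = { value: value, time: (now === undefined) ? Date.now() : now };
        }
    };
}
```

Before issuing a request you would check `cache.get(query)` and only hit the API on a miss, calling `cache.set(query, data)` in the success callback.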
I agree with the point above: if you're rendering directly to a web page with multiple users, you'll suffer badly, so it's best to think about a short-lived caching strategy.