I'm using mikeal/request to make API calls. One of the APIs I use most frequently (the Shopify API) recently put out a new call limit, and I'm seeing errors like:
Exceeded 6.0 calls per second for api client. Slow your requests or contact support for higher limits.
I've already gotten an upgrade, but regardless of how much bandwidth I get, I have to account for this. A large majority of the requests to the Shopify API are made inside async.map() functions, which loop over asynchronous requests and gather the response bodies.
I'm looking for any help, perhaps a library that already exists, that would wrap around the request module and actually block, sleep, throttle, allocate, and manage the many simultaneous requests that fire off asynchronously, limiting them to, say, 6 requests at a time. I have no problem working on such a project if it doesn't exist. I just don't know how to handle this kind of situation, and I'm hoping for some kind of standard.
I made a ticket with mikeal/request.
I've run into the same issue with various APIs. AWS is famous for throttling as well.
A couple of approaches can be used. You mentioned the async.map() function. Have you tried async.queue()? The queue method lets you set a hard limit (like 6), and anything over that amount waits in the queue.
Another helpful tool is oibackoff. That library lets you back off and retry a request if you get an error back from the server.
It can be useful to wrap the two libraries to make sure both your bases are covered: async.queue to ensure you don't go over the limit, and oibackoff to ensure you get another shot at getting your request in if the server tells you there was an error. A sketch of the queue part is below.
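Roughly like this (a minimal sketch of the async.queue side; the URL list and handler are just placeholders, and oibackoff would be layered inside the worker for retries):

```js
var async = require('async');
var request = require('request');

// Worker that performs one API call; the queue ensures at most
// 6 of these are in flight at any given time.
var q = async.queue(function (task, done) {
  request(task.url, function (err, res, body) {
    done(err, body);
  });
}, 6); // concurrency limit

// Placeholder list of URLs to fetch.
var urls = ['https://example.myshopify.com/admin/products.json'];

// Push work onto the queue instead of firing requests directly.
urls.forEach(function (url) {
  q.push({ url: url }, function (err, body) {
    if (err) return console.error(err);
    // handle body here
  });
});
```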
For an alternative solution, I used the node-rate-limiter to wrap the request function like this:
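Something along these lines (a sketch assuming the limiter package's callback-style RateLimiter; the 6-per-second figure is just an example):

```js
var request = require('request');
var RateLimiter = require('limiter').RateLimiter;

// Allow at most 6 requests per second; extra calls wait for a token.
var limiter = new RateLimiter(6, 'second');

// Drop-in replacement for request() that waits for a token first.
function throttledRequest() {
  var requestArgs = arguments;
  limiter.removeTokens(1, function () {
    request.apply(this, requestArgs);
  });
}
```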
The npm package simple-rate-limiter seems to be a very good solution to this problem. Moreover, it is easier to use than node-rate-limiter and async.queue. Here's a snippet that shows how to limit all requests to ten per second.
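A sketch of that snippet (assuming simple-rate-limiter's limit(fn).to(n).per(ms) chain; the URL is a placeholder):

```js
var limit = require('simple-rate-limiter');
// Wrap request so that, no matter how many times it's called,
// at most 10 requests actually go out per 1000 ms.
var request = limit(require('request')).to(10).per(1000);

// Use it exactly like the plain request module.
request('https://api.example.com/whatever', function (err, res, body) {
  if (err) return console.error(err);
  console.log(body);
});
```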
In the async module, this requested feature is closed as "won't fix".
There is a solution using the leaky bucket or token bucket model; it is implemented as RateLimiter in the "limiter" npm module, see the example here: https://github.com/caolan/async/issues/1314#issuecomment-263715550
Another way is to use PromiseThrottle. I used this; a working example is below:
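A sketch of that example (assuming the promise-throttle package and request-promise; the URL and the 20-call loop are illustrative, the 5-per-second rate matches the output described below):

```js
var PromiseThrottle = require('promise-throttle');
var rp = require('request-promise');

var promiseThrottle = new PromiseThrottle({
  requestsPerSecond: 5,            // at most 5 calls per second
  promiseImplementation: Promise
});

function fetchOnce(i) {
  return rp('https://api.example.com/items/' + i).then(function () {
    console.log(i, new Date().toISOString()); // log when each call fires
  });
}

// Queue 20 calls; promise-throttle spaces them out at 5 per second.
for (var i = 0; i < 20; i++) {
  promiseThrottle.add(fetchOnce.bind(null, i));
}
```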
From the logged output we can clearly see the rate, i.e. 5 calls per second.
Here's my solution: use a library such as request-promise or axios and wrap the call in a promise, as in the sketch below.
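A minimal sketch of that idea (the names throttledGet and wait, and the 1000 ms interval, are just placeholders; axios is assumed here, request-promise would work the same way):

```js
const axios = require('axios');

// Resolves after `ms` milliseconds; used to space calls apart.
const wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Chain every call onto the previous one so requests go out
// at most once per `interval` milliseconds.
let chain = Promise.resolve();
const interval = 1000;

function throttledGet(url) {
  const result = chain.then(() => axios.get(url));
  // Whatever happens to this request, the next one waits `interval` ms.
  chain = result.catch(() => {}).then(() => wait(interval));
  return result;
}
```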
The other solutions were not up to my taste. Researching further, I found promise-ratelimit, which gives you an API that you can simply await:
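A sketch of how that looks (assuming promise-ratelimit's factory API, where you pass the minimum gap in ms and get back a throttle function; the 2000 ms interval and api.example.com come from the text below, and axios is only used for illustration):

```js
const axios = require('axios');
// Pass the minimum gap between calls in ms; get back a function
// whose returned promise resolves when it's safe to proceed.
const throttle = require('promise-ratelimit')(2000);

async function getExample(path) {
  await throttle(); // waits only if the last call was < 2000 ms ago
  const response = await axios.get('https://api.example.com' + path);
  return response.data;
}
```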
The above example will ensure you only make queries to api.example.com once every 2000 ms at most. In other words, the very first request will not wait 2000 ms.