dynamic-throttled-queue
This project was forked from shaunpersad/throttled-queue
Dynamically throttles arbitrary code to execute between a minimum and maximum number of times per interval. Best for making throttled API requests.
For example, network calls to popular APIs such as Twitter are subject to rate limits. By wrapping all of your API calls in a throttle, it will automatically adjust your requests to stay within the acceptable rate limits.
Unlike the throttle
functions of popular libraries like lodash and underscore, dynamic-throttled-queue
will not prevent any executions. Instead, every execution is placed into a queue, which will be drained at the desired rate limit.
Release Notes
v1.1.1 - Add default for option object
v1.1.0 - Added retry ability: if the callback returns false, the function will be added back to the master queue to be retried.
v1.0.0 - Initial Release
Installation
Can be used in a Node.js environment, or directly in the browser.
Node.js
npm install dynamic-throttled-queue
Browser
<script src="dynamic-throttled-queue.min.js"></script>
Options
Param | Type | Description |
---|---|---|
min_rpi | {number} | Minimum requests per interval |
max_rpi | [number=min_rpi] | Maximum requests per interval |
interval | {number} | Number of milliseconds between each batch of requests |
evenly_spaced | [boolean=true] | If true, requests will be distributed evenly throughout the interval |
errors_per_second | [number=5] | Number of errors per second before deciding to either increase or decrease the current rpi |
back_off | [boolean=true] | If true and the errors_per_second watermark is hit, the queue will back off for 1 interval |
retry | [number=0] | If greater than 0, any failed callbacks will be put back onto the queue and retried up to X times |
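Putting the table together, a full options object might look like the following. The specific values here are illustrative only:

```javascript
// Illustrative options object covering every documented setting.
const options = {
    min_rpi: 2,           // minimum requests per interval
    max_rpi: 10,          // maximum requests per interval (defaults to min_rpi)
    interval: 1000,       // milliseconds between each batch of requests
    evenly_spaced: true,  // spread requests evenly across the interval
    errors_per_second: 5, // error watermark used to adjust the current rpi
    back_off: true,       // pause for one interval when the watermark is hit
    retry: 3,             // re-queue failed callbacks up to 3 times
};
```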
Usage
- If in Node.js, `require` the factory function:

```js
var throttledQueue = require('dynamic-throttled-queue');
```
Else, include it in a script tag in your browser and throttledQueue
will be globally available.
- Create an instance of a throttled queue by passing an options object (see Options above), specifying at least the requests per interval and the interval in milliseconds:

```js
const throttle = throttledQueue({ min_rpi: 5, interval: 1000 }); // at most 5 requests per second.
```
- Use the `throttle` instance as a function to enqueue actions:

```js
throttle(() => {
    // perform some type of activity in here.
});
```
Quick Examples
Basic
Rapidly assign network calls to be run; they will be limited to 1 request per second.

```js
var throttledQueue = require('dynamic-throttled-queue');
var throttle = throttledQueue({ min_rpi: 1, interval: 1000 }); // at most make 1 request every second.

for (let i = 0; i < 100; i++) {
    throttle(() => {
        // make a network request.
    });
}
```
Reusable
Wherever the throttle
instance is used, your action will be placed into the same queue,
and be subject to the same rate limits.
```js
const throttledQueue = require('dynamic-throttled-queue');
const throttle = throttledQueue({ min_rpi: 1, interval: 60000 }); // at most make 1 request every minute.

for (let x = 0; x < 50; x++) {
    throttle(() => {
        // make a network request.
    });
}
for (let y = 0; y < 50; y++) {
    throttle(() => {
        // make another type of network request.
    });
}
```
Bursts
By specifying a number higher than 1 for min_rpi, and setting evenly_spaced: false,
you can dequeue multiple actions within the given interval:

```js
var throttledQueue = require('dynamic-throttled-queue');
var throttle = throttledQueue({ min_rpi: 10, evenly_spaced: false, interval: 1000 }); // at most make 10 requests every second.

for (let x = 0; x < 100; x++) {
    throttle(() => {
        // make a network request.
    });
}
```
Evenly spaced
By default, your actions are evenly distributed over the interval (evenly_spaced: true):

```js
const throttledQueue = require('dynamic-throttled-queue');
const throttle = throttledQueue({ min_rpi: 10, interval: 1000 }); // at most make 10 requests every second, but evenly spaced.

for (let x = 0; x < 100; x++) {
    throttle(() => {
        // make a network request.
    });
}
```
Min & Max Requests Per Interval
By supplying both a min_rpi
and max_rpi
value in the options object, you get a dynamically adjusting queue. This works by having the function passed to throttle
return false
if there was an issue. The starting requests per interval is as close to halfway between the min_rpi
and max_rpi
as possible, rounded to the nearest whole number.
The second part of this is the errors_per_second
option, which defaults to 5 errors per second. Every X seconds, a check is made to see how many errors have occurred (signalled by callbacks returning false
). If X or more errors are seen, the current requests per interval decreases, down to the min_rpi
value. If between 0 and X errors are seen, the current requests per interval stays as-is. Finally, if there were 0 errors in the last check period, the current requests per interval increases, up to max_rpi
.
```js
const throttledQueue = require('dynamic-throttled-queue');
const throttle = throttledQueue({ min_rpi: 1, max_rpi: 5, interval: 1000 }); // at most make 5 requests every second.

for (let x = 0; x < 100; x++) {
    throttle(() => {
        // make a network request.
        // return false if the request failed (e.g. was rate limited).
    });
}
```
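As a rough mental model, the adjustment step described above can be sketched as follows. This is a simplified illustration only; `adjustRpi` and its parameters are hypothetical names, not the library's internals:

```javascript
// Simplified model of the dynamic rpi adjustment described above.
// Hypothetical helper, not part of the library's API.
function adjustRpi(currentRpi, errorCount, { min_rpi, max_rpi, errors_per_second = 5 }) {
    if (errorCount >= errors_per_second) {
        // Hit the error watermark: slow down, but never below min_rpi.
        return Math.max(min_rpi, currentRpi - 1);
    }
    if (errorCount === 0) {
        // A clean check period: speed up, but never above max_rpi.
        return Math.min(max_rpi, currentRpi + 1);
    }
    // Some errors, but under the watermark: hold the current rate.
    return currentRpi;
}

// The starting rate is roughly halfway between min and max:
const startRpi = Math.round((1 + 5) / 2); // 3 for min_rpi: 1, max_rpi: 5
```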
Backoff
By supplying back_off: true
in the options, every time the errors_per_second
mark is hit, the queue will back off from the next batch of calls for 1 interval:
```js
const throttledQueue = require('dynamic-throttled-queue');
const throttle = throttledQueue({ min_rpi: 10, interval: 1000, errors_per_second: 2, back_off: true }); // at most make 10 requests every second; if more than 2 errors per second, back off for 1 full interval of 1 second.

for (let x = 0; x < 100; x++) {
    throttle(() => {
        // make a network request.
        // return false if the request failed.
    });
}
```
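The retry option from v1.1.0 works alongside these features: a callback that returns false is re-queued up to the configured number of times. A self-contained sketch of that idea, with hypothetical names and not the library's actual code:

```javascript
// Simplified model of the retry behavior: a task that returns false is
// pushed back onto the queue until it has been attempted (1 + retry) times.
// Hypothetical sketch, not the library's internals.
function runWithRetry(task, retry) {
    const queue = [{ task, attemptsLeft: retry }];
    const results = [];
    while (queue.length > 0) {
        const item = queue.shift();
        const ok = item.task();
        results.push(ok);
        if (ok === false && item.attemptsLeft > 0) {
            // Failed and retries remain: put it back on the queue.
            queue.push({ task: item.task, attemptsLeft: item.attemptsLeft - 1 });
        }
    }
    return results;
}

// A task that fails twice, then succeeds:
let calls = 0;
const flaky = () => (++calls < 3 ? false : true);
```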
Tests
Note: The tests take a few minutes to run. Watch the console to see how closely the actual rate limit gets to the maximum.
Node.js
Run `npm test`.
Browser
Open test/index.html
in your browser.