Search results

15 packages found

Crawler is a ready-to-use web spider that supports proxies, asynchrony, rate limiting, configurable request pools, jQuery, and HTTP/2.

published 2.0.2 3 months ago

A set of functions for coding easy-to-read HTTP requests.

published 3.2.1 a year ago

A set of functions for coding easy-to-read HTTP requests.

published 1.0.5-beta 5 months ago

Make streaming HTTP requests.

published 2.1.3 7 years ago

Library for limiting the maximum number of requests.

published 1.2.6 2 years ago
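The core idea behind a request limiter like the one above can be sketched with plain Promises. Everything here (the `limitConcurrency` name and its internals) is illustrative and assumed, not the package's actual API:

```javascript
// Cap the number of concurrently running async tasks; extra tasks
// wait in a FIFO queue until a slot frees up.
function limitConcurrency(maxParallel) {
  let active = 0;
  const waiting = [];

  const next = () => {
    if (active >= maxParallel || waiting.length === 0) return;
    active++;
    const { fn, resolve, reject } = waiting.shift();
    Promise.resolve()
      .then(fn)                 // run the task
      .then(resolve, reject)    // settle the caller's promise
      .finally(() => {
        active--;               // free the slot and start the next task
        next();
      });
  };

  // Wrap a task so it runs only when a slot is available.
  return (fn) =>
    new Promise((resolve, reject) => {
      waiting.push({ fn, resolve, reject });
      next();
    });
}
```

With `limitConcurrency(2)`, mapping five request functions through the returned wrapper keeps at most two in flight at any moment.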

![logo](./.github/logo.png)

published 1.0.2 8 years ago

Make streaming HTTP requests.

published 1.0.1 7 years ago

A simple Node module to make HTTP requests using a pool.

published 1.6.1 9 years ago

A simple request queue with channels and a limit on parallel connections, for both the entire queue and a specific channel.

published 1.0.2 4 years ago
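The channel idea described above (a global cap plus a per-channel cap) can be sketched as follows; the `createChannelQueue` name and its internals are assumptions for illustration, not the package's documented API:

```javascript
// A queue that starts a pending job only when both the global limit
// and its channel's limit have a free slot.
function createChannelQueue(globalLimit, channelLimit) {
  let globalActive = 0;
  const channelActive = new Map(); // channel name -> running count
  const pending = [];

  function tryRun() {
    for (let i = 0; i < pending.length; i++) {
      const job = pending[i];
      const chActive = channelActive.get(job.channel) || 0;
      if (globalActive < globalLimit && chActive < channelLimit) {
        pending.splice(i, 1);
        i--; // re-check the same index after removal
        globalActive++;
        channelActive.set(job.channel, chActive + 1);
        Promise.resolve()
          .then(job.fn)
          .then(job.resolve, job.reject)
          .finally(() => {
            globalActive--;
            channelActive.set(job.channel, channelActive.get(job.channel) - 1);
            tryRun(); // a slot freed up; try to start more jobs
          });
      }
    }
  }

  // Enqueue a task on a named channel; resolves when the task settles.
  return (channel, fn) =>
    new Promise((resolve, reject) => {
      pending.push({ channel, fn, resolve, reject });
      tryRun();
    });
}
```

Jobs on a saturated channel wait even when the global limit has room, while jobs on other channels proceed.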

Make streaming HTTP requests.

published 2.1.2 7 years ago

Create a request pool queue for all types of HTTP requests.

published 0.2.5 5 years ago

An intelligent proxy and request manager.

published 0.1.4 7 years ago

An intelligent proxy and request manager.

published 0.1.5 7 years ago

A Node.js module that uses a pool of tokens to make authenticated requests to APIs and extend their rate limits.

published 0.1.4 10 years ago
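The token-pool idea reads as: cycle through several API tokens so each request draws on a different rate-limit bucket. The `TokenPool` class and the round-robin strategy below are assumptions for illustration, not the module's documented API:

```javascript
// Rotate through a fixed set of API tokens, round-robin.
class TokenPool {
  constructor(tokens) {
    this.tokens = tokens;
    this.index = 0;
  }

  // Hand out the next token in the pool, wrapping around at the end.
  next() {
    const token = this.tokens[this.index];
    this.index = (this.index + 1) % this.tokens.length;
    return token;
  }

  // Build the Authorization header for the next authenticated request.
  authHeader() {
    return { Authorization: `Bearer ${this.next()}` };
  }
}
```

With per-token rate limits, spreading requests across N tokens roughly multiplies the effective quota by N, which matches the "extend their limits" claim above.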

A Node-RED node for performing HTTP(S) requests using the Request library with optimized proxy support; updated to expose the pool value.

published 0.2.0 7 years ago