Point free async utilities
This library is designed to encourage composing a program from Node.js-style asynchronous primitives. It provides generally useful:
- decorators - wrappers that alter async function behavior in some way,
- combinators - things that combine several functions into one.
Installation
npm install point-free
API
Decorators | Combinators | Control Flow | Collections | Primitives
---|---|---|---|---
retry | waterfall | auto | each | noop
limit | serial | manual | map | sleep
fallback | parallel | while | chunk | clear
logCalls | | doWhile | |
logExits | | | |
logErrors | | | |
Decorators
A decorator wraps a function and either provides additional functionality or alters its semantics for a particular case. Common uses: logging, retrying or limiting operations.
retry([options | attempts = 5], func)
Makes a function that retries `func` up to `attempts` times, behaving the same otherwise. If the number of attempts is exceeded, the last error is returned.
Options:

- `attempts` - number of attempts to run `func`, defaults to 5.
- `timeout` or `timeout(failed)` - a number of milliseconds to wait between tries. If specified as a function, it is called with the number of failed attempts.
- `factor` - the timeout will be multiplied by this value for each failed attempt but the first. A shortcut to implement exponential backoff.
This way one can make `fetchURL` use 5 attempts with timeouts of 1, 2, 4, 8 and 16 seconds:

    var fetchURL = pf.retry({attempts: 5, timeout: 1000, factor: 2}, function (url, callback) {
        // ...
    });
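To make the retry semantics concrete, here is a minimal self-contained sketch of such a wrapper (immediate retries only, with the `timeout` and `factor` options omitted; this is an illustration, not the library's implementation):

```javascript
// Illustration only: retry a Node-style async function up to `attempts` times.
// Waiting between tries is omitted for brevity.
function retry(attempts, func) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    var callback = args.pop();
    var failed = 0;

    function attempt() {
      func.apply(null, args.concat([function (err) {
        if (err && ++failed < attempts) return attempt(); // try again
        callback.apply(null, arguments);                  // success or last error
      }]));
    }
    attempt();
  };
}
```

A function that fails twice and then succeeds is called three times, and the caller only sees the final success; once attempts run out, the last error is passed through.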
limit([options | limit], func)
Limits the number of concurrent executions of `func`. Excess calls are queued and executed in FIFO order.
Options:

- `limit` - number of concurrent executions allowed.
- `by` - limit only those calls clashing by values of `by(args..)`.
Here is how you can limit HTTP requests to 4 by domain and 50 overall:

    // getDomain(url) is a hypothetical helper extracting the domain part of a url
    var fetchURL = pf.limit(50, pf.limit({limit: 4, by: getDomain}, function (url, callback) {
        // ...
    }));
By specifying `limit` to be 1 you can force calls to be sequential, e.g. in map:

    pf.map(urls, pf.limit(1, fetchURL))(done);
TODO: document introspection and .emptyQueue()
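The queueing described above can be sketched like this — a minimal concurrency limiter without the `by` option, shown only to illustrate the semantics, not the library's implementation:

```javascript
// Illustration only: allow at most `n` concurrent calls, queue the rest (FIFO).
function limit(n, func) {
  var running = 0, queue = [];

  function next() {
    if (running >= n || queue.length === 0) return;
    running++;
    var call = queue.shift();
    func.apply(null, call.args.concat([function () {
      running--;
      call.callback.apply(null, arguments);
      next();                              // start the next queued call, if any
    }]));
  }

  return function () {
    var args = Array.prototype.slice.call(arguments);
    var callback = args.pop();
    queue.push({args: args, callback: callback});
    next();
  };
}
```

A call made while `n` calls are already in flight simply waits in the queue until a slot frees up.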
fallback(defaultValue, func)
Returns a version of `func` that never fails, but returns `defaultValue` instead.
E.g. this function returns `'unknown'` if any of the waterfall components fail:

    // detectLanguage is a hypothetical async step
    var detectPageLanguage = pf.fallback('unknown', pf.waterfall(fetchURL, detectLanguage));
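The semantics can be sketched in a few lines (an illustration, not the library's implementation):

```javascript
// Illustration only: swallow any error and return defaultValue instead.
function fallback(defaultValue, func) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    var callback = args.pop();
    func.apply(null, args.concat([function (err) {
      if (err) return callback(null, defaultValue); // never fail
      callback.apply(null, arguments);              // pass results through
    }]));
  };
}
```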
logCalls([logger = console.log], func)
On each function call, passes its arguments to `logger`. Aimed at logging and debugging in a way like:

    var fetchURL = pf.logCalls(fetchURL);
    // ... use fetchURL same as before, look at the urls passed.
logExits([logger = console.log], func)
On each function callback call, passes its arguments to `logger`. Useful to trace async function results.
logErrors([logger = console.error], func)
Passes all function errors to `logger`. They are still passed the normal way too. Can be used with a third-party logger utility like debug:

    var debug = require('debug')('my-module');
    var shakyFunc = pf.logErrors(debug, shakyFunc);
    // ... use shakyFunc as usual while seeing its errors.
Combinators
waterfall(funcs...)
Combines several functions to be executed serially, with the results of each function passed to the next one. Arguments to the resulting function before the callback are passed to the first step. Results of the last function are returned as the result of the combined action. Any error is passed out immediately, stopping the chain of execution.
    var pf = require('point-free');

    var displayFile = pf.waterfall(
        fs.readFile,                      // filename -> contents
        function (contents, callback) {   // hypothetical display step
            console.log(String(contents));
            callback(null);
        }
    );
When it's not possible to pass everything to the first function, `waterfall()` could be enclosed and either called immediately...:

    function fetchToFile(url, filename, callback) {
        pf.waterfall(
            function (callback) { fetchURL(url, callback) },
            function (data, callback) { fs.writeFile(filename, data, callback) }
        )(callback);
    }
... or passed to another combinator or decorator:

    pf.retry(pf.waterfall(
        function (callback) { /* ... */ },
        function (result, callback) { /* ... */ }
    ));
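To illustrate the chaining semantics, here is a minimal sketch of such a combinator (not the library's implementation):

```javascript
// Illustration only: chain async steps, feeding each step's results to the next.
function waterfall() {
  var funcs = Array.prototype.slice.call(arguments);
  return function () {
    var args = Array.prototype.slice.call(arguments);
    var callback = args.pop();
    var i = 0;

    function step(err) {
      if (err) return callback(err);                 // any error stops the chain
      var results = Array.prototype.slice.call(arguments, 1);
      if (i === funcs.length)
        return callback.apply(null, [null].concat(results));
      funcs[i++].apply(null, results.concat([step]));
    }
    step.apply(null, [null].concat(args));
  };
}
```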
serial(funcs... | funcs)
Combines several actions into one, executing them serially; the arguments to the combined action are passed to each subtask. Results of the subtasks are combined into an array, preserving order. If an error occurs, it's passed out immediately, stopping the chain of execution.
    // Note: the same arguments are passed to both subtasks
    var removeFiles = function (dir, callback) { /* ... */ };
    var removeDir = function (dir, callback) { /* ... */ };

    var cleanup = pf.serial([removeFiles, removeDir]);
Most commonly used to construct an operation from several async steps:

    pf.serial([
        function (callback) { /* first step */ },
        function (callback) { /* second step */ }
    ])(done);
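A minimal sketch of these semantics, for illustration only (not the library's implementation):

```javascript
// Illustration only: run subtasks one after another with the same arguments,
// collecting one result per subtask.
function serial(funcs) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    var callback = args.pop();
    var results = [], i = 0;

    function runNext() {
      if (i === funcs.length) return callback(null, results);
      funcs[i++].apply(null, args.concat([function (err, res) {
        if (err) return callback(err);   // stop the chain on error
        results.push(res);
        runNext();
      }]));
    }
    runNext();
  };
}
```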
parallel(funcs... | funcs)
Combines several actions into one, executing them in parallel. Arguments are passed to each subtask; results are collected into an array, preserving order. Any error is passed out immediately; all functions still running in parallel continue, but their results are ignored.
    // recalcUsers, recalcPosts and recalcComments are hypothetical async actions
    var recalcAll = pf.parallel([recalcUsers, recalcPosts, recalcComments]);
Can be used to create a function as above or as a substep in a bigger combinator:

    pf.parallel([
        recalcUsers,
        recalcPosts
    ])(next);
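The fan-out/fan-in behavior can be sketched as follows (an illustration of the semantics, not the library's implementation):

```javascript
// Illustration only: start all subtasks at once, collect results in order,
// fail fast on the first error.
function parallel(funcs) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    var callback = args.pop();
    var results = new Array(funcs.length), left = funcs.length, failed = false;
    if (left === 0) return callback(null, results);

    funcs.forEach(function (func, i) {
      func.apply(null, args.concat([function (err, res) {
        if (failed) return;                  // a previous subtask already errored
        if (err) { failed = true; return callback(err); }
        results[i] = res;
        if (--left === 0) callback(null, results);
      }]));
    });
  };
}
```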
auto(jobs)
Automatically resolves dependencies and executes subtasks in the appropriate order, in parallel where possible. Results of dependency calls are passed as parameters to dependent actions. In the end, all the subtask results are combined into an object with corresponding properties.
Here `jobs` and `stats` are executed in parallel, their results are passed to the `report` function, then its result is passed to the `send` function:

    // The dependency-list syntax below is an assumption; check the library source for the exact format
    pf.auto({
        jobs:   getJobs,
        stats:  getStats,
        report: ['jobs', 'stats', makeReport],
        send:   ['report', sendReport]
    })(done);
manual(states)
A way to create an asynchronous state machine. Accepts an object with steps; call `next.name()` or use it as a callback to progress to the next step. The `start` and `end` steps are special: execution always starts from `start`, and calling `next.end()` will stop the machine and pass a result out:
    // A sketch: serve from cache when possible, otherwise fetch and cache.
    // Passing extra arguments through next.fetch() is an assumption.
    var cachedFetch = pf.manual({
        start: function (url, next) {
            var filename = __dirname + '/cache/' + url;
            fs.readFile(filename, function (err, data) {
                if (err) next.fetch(url, filename);
                else next.end(null, data);      // next.end used as a callback
            });
        },
        fetch: function (url, filename, next) {
            // ... fetch url, save it to filename, then call next.end(err, data)
        }
    });
while(test, body)
Creates an asynchronous function that repeatedly calls `body` while the `test` condition holds:
    // locked is a hypothetical flag updated elsewhere
    var waitForLock = pf.while(
        function () { return locked },
        pf.sleep(1000)
    );

    pf.waterfall(waitForLock, doStuff)(done);
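The loop can be sketched as follows, assuming a synchronous `test` function (an illustration of the semantics, not the library's implementation; named `pfWhile` here because `while` is a reserved word):

```javascript
// Illustration only: keep calling body while test() returns true.
function pfWhile(test, body) {
  return function (callback) {
    function loop() {
      if (!test()) return callback(null);   // condition no longer holds
      body(function (err) {
        if (err) return callback(err);
        loop();
      });
    }
    loop();
  };
}
```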
doWhile(body, test)
Same as `while`, but the `test` condition is checked after `body` executes:
    var bytesReceived = 0;

    // the body and totalSize are hypothetical
    var readChunk = pf.doWhile(
        function (callback) {
            // ... read the next chunk, add its length to bytesReceived
        },
        function () { return bytesReceived < totalSize }
    );
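The difference from `while` is visible in a sketch: `body` always runs at least once before `test` is consulted (an illustration assuming a synchronous `test`, not the library's implementation):

```javascript
// Illustration only: run body, then repeat while test() returns true.
function doWhile(body, test) {
  return function (callback) {
    function loop() {
      body(function (err) {
        if (err) return callback(err);
        if (test()) return loop();
        callback(null);
      });
    }
    loop();
  };
}
```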
Primitives
noop
A nice thing when you want to do something conditionally:
    pf.serial([
        needsBackup ? makeBackup : pf.noop,   // hypothetical steps
        destructiveUpdate
    ])(done);
sleep(timeout)
Delays any subsequent actions in a pipeline. Could be used with serial and waterfall:
    var delayedHandler = pf.waterfall(pf.sleep(1000), handler);
clear
Ignores its arguments and just calls the callback. Intended to be used in a waterfall pipeline to ignore all results of the previous function:

    var acquireTask = pf.waterfall(
        lockNextTask,   // whatever lockNextTask returns is dropped...
        pf.clear,
        fetchTask       // ...so fetchTask receives no extra arguments
    );
Collections
each(seq, func)
Executes `func` for each item in `seq` in parallel and ignores the results.
If any sub-call fails, the entire call fails immediately.

    // ping(host, callback) is a hypothetical async check
    var pingHosts = pf.each(hosts, ping);
map(seq, func)
Executes `func` for each item in `seq` in parallel and collects the results into an array, preserving order.
If any sub-call fails, the entire call fails immediately.

    pf.map(urls, fetchURL)(done);
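A minimal sketch of these semantics (an illustration, not the library's implementation):

```javascript
// Illustration only: run func on every item in parallel, keep results in order,
// fail fast on the first error.
function map(seq, func) {
  return function (callback) {
    var results = new Array(seq.length), left = seq.length, failed = false;
    if (left === 0) return callback(null, results);

    seq.forEach(function (item, i) {
      func(item, function (err, res) {
        if (failed) return;
        if (err) { failed = true; return callback(err); }
        results[i] = res;
        if (--left === 0) callback(null, results);
      });
    });
  };
}
```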
chunk(size, seq, func)
Chunks `seq` and processes each chunk with `func` serially.
Chunks will be sized up to `size`.
Any arrays returned by the processing function are combined into a single resulting array; non-array results are ignored.

    // Insert links into database in chunks of 1000
    pf.chunk(1000, links, function (chunk, callback) {
        db.insert('links', chunk, callback);   // db.insert is a hypothetical helper
    })(done);
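The chunking behavior can be sketched as follows (an illustration of the described semantics, not the library's implementation):

```javascript
// Illustration only: split seq into pieces of up to `size`, process them serially,
// concatenate array results, ignore non-array results.
function chunk(size, seq, func) {
  return function (callback) {
    var results = [], i = 0;

    function next() {
      if (i >= seq.length) return callback(null, results);
      var piece = seq.slice(i, i + size);
      i += size;
      func(piece, function (err, res) {
        if (err) return callback(err);
        if (Array.isArray(res)) results = results.concat(res);
        next();
      });
    }
    next();
  };
}
```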