# memo-async
combine async / promise calls and cache result in memory with LRU
```js
import memoAsync from 'memo-async'

const getUserInfo = memoAsync(
  async (uid) => {
    // your async function, e.g. fetch user info from a remote server
  }
)

const user1 = await getUserInfo(12)   // send request
const user2 = await getUserInfo(12)   // (cached) re-use 1st request
const user3 = await getUserInfo(9)    // send request
const user4 = await getUserInfo(12)   // (cached) re-use 1st request

// in a short time...

const user5 = await getUserInfo(12)   // get cached result,
                                      // or send request if last request failed

// after seconds...

const user6 = await getUserInfo(12)   // send request (cache expired)
```
## API
this package provides `memoAsync`, which can be used as a utility function or as a decorator:
- `memoAsync(fn, opts)` returns a wrapped async function, which may combine calls and cache results in memory.
  - fn : `Function` - your async function
  - opts : `MemoOptions` - optional, see below

- `memoAsync(opts)` returns a class method decorator.
  - opts : `MemoOptions` - optional, see below
  Note: each instance gets its own LRU cache in memory by default.
  If you have many instances, consider sharing exactly one LRUCache by setting
  `opts.cache`. Meanwhile, do not forget to write an `opts.genKey` that tells
  instances apart (see the sketch after the decorator example below).
  decorator example:

  ```js
  class Reporter {
    // class name and parameter are illustrative; only readData is from the docs
    @memoAsync()
    async readData(date) {
      // some expensive requests
    }
  }

  const joe = new Reporter()
  // now joe.readData() may merge and cache requests
  ```
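  As a minimal sketch, sharing one cache across instances could look like this
  (assuming `opts.cache` accepts an instance of the `lru-cache` npm package;
  the `StatsReporter` class and its `id`-based key are illustrative):

  ```js
  import memoAsync from 'memo-async'
  import LRUCache from 'lru-cache' // assumption: a compatible cache class

  const sharedCache = new LRUCache({ max: 1000 }) // one cache for all instances

  class StatsReporter {
    constructor(id) {
      this.id = id
    }

    @memoAsync({
      cache: sharedCache,
      // keys must tell instances apart, or they would share each other's results
      genKey(date) { return `${this.id}:${date}` },
    })
    async readData(date) {
      // some expensive requests
    }
  }
  ```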
- opts : `MemoOptions` (see the usage sketch after this list)

  - genKey : `(...args) => string`

    compute the cache key from the arguments.
    default: treat the arguments as strings and concatenate them

    if you are using memoAsync within a class, you may use `this` while computing
  - duration : `number`

    duration of one batch, i.e. how long a result can be cached.
    default: 3000 (ms)
  - batchSize : `number`

    how many calls (invocations) can be merged into one request.
    default: 500 (# req)
  - cache : `LRUCache`

    use an existing lru-cache instance.
    if not set, memoAsync will create one.
  - cacheSize : `number`

    set the cache capacity. works when `cache` is not given.
    default: 1000
  - onHit : `(key, result, args) => void`

    callback invoked when the cache is hit.

    - key : `string` - the cache key
    - result : `Promise` - the cached Promise. you cannot change it
    - args : `any[]` - array of arguments

    Note: if you are using memoAsync within a class, `this` will be set.
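A minimal end-to-end sketch of these options together (the `fetch` call, the
endpoint and the concrete numbers are illustrative, not prescribed by this
package):

```js
import memoAsync from 'memo-async'

const getUser = memoAsync(
  async (uid) => {
    const res = await fetch(`/api/user/${uid}`) // illustrative endpoint
    return res.json()
  },
  {
    genKey: (uid) => String(uid), // one cache entry per uid
    duration: 5000,               // keep each result for 5 seconds
    batchSize: 100,               // merge at most 100 calls into one request
    cacheSize: 200,               // LRU capacity, used since `cache` is not given
  }
)
```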
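And a sketch of `onHit`, e.g. to measure cache effectiveness (the counter and
the trivial wrapped function are illustrative):

```js
let hits = 0

const getAnswer = memoAsync(async () => 42, {
  onHit(key, result, args) {
    hits += 1
    // `result` is the cached Promise; you may observe it, but not change it
    console.log(`cache hit #${hits} for key "${key}"`, args)
  },
})

await getAnswer() // miss: runs the function
await getAnswer() // hit: onHit fires, the cached Promise is re-used
```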