A simple caching library, inspired by the Play cache API and biased towards showing stale data instead of dog-piling. The interface exposes only very limited functionality; there is no multi-get or deletion of cached data. The library is designed to support different caching backends, though right now only memcached is implemented.
It supports both promise- and callback-based usage.
```
npm install --save cached
```
More detailed API docs are in the next section.
```js
var cached = require('cached');

var kittens = cached('kittens', {
  backend: {
    type: 'memcached',
    hosts: '127.0.0.1:11211',
  },
});

// Set a key using a plain value
kittens.set('my.key', 'some-value');

// Set a key using a lazily created promise (or value)
kittens.set('my.key', function () {
  return Promise.resolve('some-value');
});

// Set a key using a callback-style function
kittens.set('my.key', cached.deferred(function (cb) {
  cb(null, 'some-value');
}));

kittens.get('my.key', function (err, value) {
  /* ... */
});

// Handle it the promise way
kittens.get('my.key').then(function (value) {
  /* ... */
});
```
A thin wrapper around memcached.
You can either provide a readily configured client or a combination of hosts and additional options.
Without any additional options it will default to a local memcached on `127.0.0.1:11211`. Alternatively you can pass in a readily configured client:

```js
var Memcached = require('memcached');

var cache = cached('myStuff', {
  backend: {
    type: 'memcached',
    client: new Memcached('127.0.0.1:11211'),
  },
});
```

This will create the same cache as the default configuration above.
Stores all the data in an in-memory object.
get() will return a reference to the stored value. Mutating the returned value will affect the value in the cache.
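The reference semantics can be illustrated with a minimal sketch of such a store (hypothetical code; the actual backend's internals may differ):

```javascript
// Hypothetical minimal in-memory store illustrating the reference
// semantics described above; not the library's actual implementation.
function createMemoryStore() {
  var data = {};
  return {
    set: function (key, value) {
      data[key] = value;
    },
    // Returns the stored value itself, not a copy.
    get: function (key) {
      return data[key];
    },
  };
}

var store = createMemoryStore();
store.set('pet', { name: 'Rex' });

// Mutating the returned object also changes what the cache holds.
store.get('pet').name = 'Fido';
// A subsequent get() now sees { name: 'Fido' }.
```

If you need isolation between callers, clone the value after `get()` instead of mutating it in place.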
Doesn't store data at all. All `set` operations succeed and all `get` operations behave as if the value were not found in the cache.
Creates a new named cache or returns a previously initialized cache.
The name is also used as a key prefix: e.g. given the name "cars", all keys will be prefixed with "cars:". The backend is selected via the backend options' `type` property. If no backend is configured, the cache will run in "noop" mode, not caching anything. All other properties are forwarded to the backend; see using different backends for which backend types exist and what options they support.
This allows you to circumvent the global named caches. The options are the same as above, except that `name` is also part of the options object when using this function.
Drop the given named cache.
Drop all named caches.
Convert a node-style function that takes a callback as its first parameter into a parameterless function that generates a promise.
In other words: this is what you'd want to wrap your node-style functions in when using them as value arguments to `set` or `getOrElse`.
```js
var f = cached.deferred(function (cb) {
  // any node-style async operation
  someAsyncOperation(cb);
});

// f can now be called and the return value will be a promise
f().then(function (value) { /* ... */ });

// More importantly it can be passed into cache.set
cache.set('my.key', f);
```
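Conceptually, such a wrapper does something like this (an illustrative sketch, not the library's source):

```javascript
// Illustrative sketch: turn fn(cb) into a parameterless function
// that returns a promise.
function deferred(fn) {
  return function () {
    return new Promise(function (resolve, reject) {
      fn(function (err, value) {
        if (err) reject(err);
        else resolve(value);
      });
    });
  };
}

// readConfig is a hypothetical node-style operation.
var readConfig = deferred(function (cb) {
  setTimeout(function () {
    cb(null, { port: 8080 });
  }, 10);
});

readConfig().then(function (config) {
  console.log(config.port); // 8080
});
```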
Extends the current defaults with the provided defaults.
The most important ones are `expire`, `freshFor`, and `timeout`:

* `expire` is the time in seconds after which a value should be deleted from the cache (or whatever expiring natively means for the backend). Usually you'd want this to be higher than `freshFor` so stale data can still be served while a replacement is generated.
* `freshFor` is the time in seconds after which a value should be replaced. Replacing the value is done in the background; while the new value is generated (e.g. data is fetched from some service), the stale value is returned. Think of `freshFor` as a smarter `expire`.
* `timeout` is the maximum time in milliseconds to wait for cache operations to complete. Configuring a timeout ensures that all `get` and `set` operations fail fast. Otherwise there will be situations where one of the cache hosts goes down and reads hang for minutes while the memcached client retries to establish a connection. It's highly recommended to set a timeout. If `timeout` is left `undefined`, no timeout will be set and the operations will only fail once the underlying client, e.g. `memcached`, gives up.
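How `freshFor` and `expire` divide a value's lifetime can be sketched with a small hypothetical helper (not part of the library's API):

```javascript
// Hypothetical helper showing how freshFor and expire relate.
// A value starts out fresh, becomes stale after freshFor seconds
// (still served while a replacement is generated in the background),
// and is gone entirely after expire seconds.
function cacheState(ageInSeconds, freshFor, expire) {
  if (ageInSeconds >= expire) return 'expired'; // treated as a cache miss
  if (ageInSeconds >= freshFor) return 'stale'; // served, refreshed in background
  return 'fresh';                               // served as-is
}

console.log(cacheState(10, 60, 300));  // 'fresh'
console.log(cacheState(120, 60, 300)); // 'stale'
console.log(cacheState(400, 60, 300)); // 'expired'
```

This is why `expire` should usually be higher than `freshFor`: the window between the two is where stale-but-usable data shields you from latency spikes.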
Cache store operation.
`key` has to be a string; for possible options, see the defaults described above.
The value can be any of the following:
a. Anything that can be converted to JSON
b. A Promise of (a)
c. A function returning (a) or (b)
The callback will be called with the resolved value, following node conventions (error, value).
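A sketch of how these three forms can be reduced to a single promise (hypothetical code, not the library's implementation):

```javascript
// Hypothetical normalization of the three accepted value forms
// into one promise of a JSON-serializable value.
function resolveValue(value) {
  if (typeof value === 'function') {
    value = value(); // (c): call the function to obtain (a) or (b)
  }
  // (a) plain values and (b) promises both resolve here.
  return Promise.resolve(value);
}

resolveValue('plain').then(function (v) { console.log(v); });                 // 'plain'
resolveValue(Promise.resolve(42)).then(function (v) { console.log(v); });     // 42
resolveValue(function () { return 'lazy'; }).then(function (v) {
  console.log(v);                                                             // 'lazy'
});
```

Because the function form is only called when a value is actually needed, it is the natural fit for lazily generated data.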
Cache retrieve operation.
`key` has to be a string.
Cache misses are generally treated the same as retrieving `null`; errors should only be caused by transport errors and connection problems. If you want to cache `undefined` (e.g. 404 responses), you may want to wrap it or choose a different value, like `false`, to represent this condition.
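A small self-contained sketch of this pattern, using a plain `Map` and a hypothetical `lookupUser` as stand-ins for the cache and the data source:

```javascript
// Sketch of caching a "not found" result as false instead of undefined.
// lookupUser and the Map-based cache are hypothetical stand-ins.
var cache = new Map();

function lookupUser(id) {
  var users = { 1: 'alice' };
  return users[id]; // undefined when missing, like a 404
}

function cachedLookup(id) {
  if (cache.has(id)) return cache.get(id);
  var result = lookupUser(id);
  // Store `false` so the miss itself is cached and distinguishable
  // from "not in the cache yet".
  cache.set(id, result === undefined ? false : result);
  return cache.get(id);
}

console.log(cachedLookup(1)); // 'alice'
console.log(cachedLookup(2)); // false (the negative result is now cached)
```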
This is the function you'd want to use most of the time.
It takes the same arguments as `set` but it will check the cache first.
If a value is already cached, it will return it directly (respond as fast as possible).
If the value is marked as stale (generated `n` seconds ago with `n > freshFor`), it will replace the value in the cache. If multiple `getOrElse` calls concurrently encounter the same stale value, the value will only be replaced once.
This de-duplication is done on a per-instance level, so if you create many cache instances reading and writing the same keys, you are asking for trouble. If you don't, the worst case is every process in your system fetching the value at once, which in most cases should still be a far smaller number than the number of concurrent requests.
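The per-instance de-duplication can be pictured with the following illustrative sketch (not the library's actual implementation):

```javascript
// Illustrative sketch: at most one refresh per key is in flight
// per instance; concurrent callers share the same promise.
function createRefresher() {
  var pending = {}; // key -> in-flight refresh promise

  return function refresh(key, generate) {
    if (!pending[key]) {
      pending[key] = Promise.resolve()
        .then(generate)
        .then(function (value) {
          delete pending[key];
          return value;
        });
    }
    return pending[key];
  };
}

var refresh = createRefresher();
var calls = 0;
function generate() {
  calls += 1; // counts how often the expensive fetch actually runs
  return new Promise(function (resolve) {
    setTimeout(function () { resolve('new-value'); }, 10);
  });
}

// Two concurrent "stale" reads trigger only one generate() call.
Promise.all([refresh('k', generate), refresh('k', generate)]).then(function (values) {
  console.log(calls); // 1
});
```

With many independent instances, each keeps its own `pending` map, which is exactly why the worst case grows with the number of processes rather than the number of requests.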
Cache delete operation.
`key` has to be a string.