Memoizee
Complete memoize/cache solution for JavaScript
Originally derived from es5-ext package.
Memoization is the best technique to save on memory or CPU cycles when dealing with repeated operations. For detailed insight see: http://en.wikipedia.org/wiki/Memoization
Features
- Works with any type of function arguments – no serialization is needed
- Works with any length of function arguments. Length can be set as fixed or dynamic.
- One of the fastest available solutions.
- Support for promises and asynchronous functions
- Primitive mode which assures fast performance when arguments are convertible to strings.
- WeakMap based mode for garbage collection friendly configuration
- Can be configured for methods (when this counts in)
- Cache can be cleared manually or after a specified timeout
- Cache size can be limited on LRU basis
- Optionally accepts resolvers that normalize function arguments before passing them to underlying function.
- Optional reference counter mode that allows more sophisticated cache management
- Profile tool that provides valuable usage statistics
- Covered by over 500 unit tests
Installation
In your project path (note the two e's in memoizee):
$ npm install memoizee
The memoize name was already taken, therefore the project is published as memoizee on NPM.
To port it to a browser or any other (non-CJS) environment, use your favorite CJS bundler. No favorite yet? Try: Browserify, Webmake or Webpack.
Usage
var memoize = require("memoizee");

var fn = function (one, two, three) {
  /* ... */
};

memoized = memoize(fn);

memoized("foo", 3, "bar");
memoized("foo", 3, "bar"); // Cache hit
Configuration
All options below can be applied in any combination.
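For instance (a minimal sketch; the individual options are explained in the sections below), a string-keyed, expiring, size-limited cache could be configured as:

memoized = memoize(fn, { primitive: true, maxAge: 1000, max: 100 });
// Primitive mode, entries expire after 1 second, at most 100 results kept (LRU)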
Arguments length
By default, a fixed number of arguments is assumed (it's read from the function's length property). This can be overridden:
memoized = memoize(fn, { length: 2 });

memoized("foo"); // Assumed: 'foo', undefined
memoized("foo", undefined); // Cache hit
memoized("foo", 3, {}); // Third argument is ignored (but passed to underlying function)
memoized("foo", 3, 13); // Cache hit
Dynamic length behavior can be forced by setting length to false, which means memoize will work with any number of arguments.
memoized = memoize(fn, { length: false });

memoized("foo");
memoized("foo"); // Cache hit
memoized("foo", 3);
memoized("foo", 3); // Cache hit
memoized("foo", 3, 13);
memoized("foo", 3, 13); // Cache hit
Primitive mode
If we work with large result sets, or memoize hot functions, the default mode may not perform as fast as we expect. In that case it's good to run memoization in primitive mode. To provide fast access, results are saved in a hash instead of an array. Generated hash ids are the result of converting the arguments to strings. Mind that this mode will work correctly only if stringified arguments produce unique strings.
memoized = memoize(fn, { primitive: true });

memoized("/path/one");
memoized("/path/one"); // Cache hit
Cache id resolution (normalization)
By default, the cache id for a given call is resolved either by:
- Direct comparison of the values passed as arguments, as they are. In that case two different objects, even if their characteristics are exactly the same (e.g. var a = { foo: 'bar' }, b = { foo: 'bar' }), will be treated as two different values.
- Comparison of the stringified values of the given arguments (primitive mode), which serves well when arguments are expected to be primitive values, or objects that naturally stringify to unique values (e.g. arrays).
Still, the above two methods do not serve all cases, e.g. when we want to memoize a function whose arguments are hash objects that we want to compare not by instance but by content.
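To make the default (direct comparison) behavior concrete, here's a short sketch; the memoized function body is only an illustrative assumption:

var byInstance = memoize(function (obj) { return Object.keys(obj); });

var a = { foo: 'bar' }, b = { foo: 'bar' };
byInstance(a);
byInstance(b); // Not a cache hit: a and b are different object instances
byInstance(a); // Cache hit: same instance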
Writing custom cache id normalizers
There's a normalizer option through which we can pass a custom cache id normalization function.
E.g. if we want to memoize a function whose argument is a hash object that we want to compare not by instance but by content, we can achieve it as follows:
var mfn = memoize(function (hash) {
  // body of memoized function
}, {
  normalizer: function (args) {
    // args is the arguments object as accessible in the memoized function
    return JSON.stringify(args[0]);
  }
});

mfn({ foo: "bar" });
mfn({ foo: "bar" }); // Cache hit
Argument resolvers
When we're expecting arguments of a certain type, it's good to coerce them before doing memoization. We can do that by passing an additional resolvers array:
memoized = memoize(fn, { length: 2, resolvers: [String, Boolean] });

memoized(12, [1, 2, 3].length);
memoized("12", true); // Cache hit
memoized({ toString: function () { return "12"; } }, {}); // Cache hit (stringifies to '12', second argument is truthy)
Note: if your arguments are collections (arrays or hashes) that you want to memoize by content (not by object identity), you need to cast them to strings; for that it's best to just use primitive mode. Arrays have a standard string representation and work with primitive mode out of the box; for hashes you need to define a toString method that produces unique string descriptions, or rely on JSON.stringify.
Similarly, if you want to memoize functions by their code representation rather than by their object identity, you should use primitive mode.
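For example (a sketch of the approach suggested above; the memoized function is a hypothetical illustration), a hash given a toString method can be memoized by content in primitive mode:

var byContent = memoize(function (query) { /* ... */ }, { primitive: true });

var query = { foo: 'bar', toString: function () { return JSON.stringify(this); } };
byContent(query);
byContent({ foo: 'bar', toString: query.toString }); // Cache hit: both stringify to '{"foo":"bar"}'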
Memoizing asynchronous functions
Promise returning functions
With the promise option we indicate that we memoize a function that returns a promise.
The difference from natural behavior is that when the promise is rejected, the result is immediately removed from the memoize cache instead of being kept as a further reusable result.
var afn = function (a, b) {
  return new Promise(function (res) {
    setTimeout(function () {
      res(a + b);
    }, 500);
  });
};
memoized = memoize(afn, { promise: true });

memoized(3, 7);
memoized(3, 7); // Cache hit
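A sketch of the rejection behavior described above (failingAfn is a hypothetical always-rejecting function):

var failingAfn = function (a, b) {
  return new Promise(function (res, rej) {
    setTimeout(function () { rej(new Error("Failed")); }, 100);
  });
};
var memoizedFailing = memoize(failingAfn, { promise: true });

memoizedFailing(3, 7).catch(function (err) {
  // The rejected result was removed from the cache, so this call runs failingAfn again
  memoizedFailing(3, 7).catch(function (err) { /* ... */ });
});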
Important notice on internal promises handling
To avoid error swallowing and unintended registration of error handlers, done and finally (if implemented) are preferred over then.
Still, relying on the done & finally pair may cause trouble if the promise implementation in use throws rejection reasons when done is called with no onFail callback, even though an error handler might have been registered through some other then or done call.
If that's the case for you, you can force memoizee not to use finally or done (even if implemented) by providing one of the following values for the promise option:
- 'done' - If done is implemented, purely done will be used to register internal callbacks, and not finally (even if it's implemented). If done is not implemented, this setting has no effect and callbacks are registered via then. This mode comes with the side effect of silencing eventual 'Unhandled errors' on the returned promise.
- 'then' - No matter whether done and finally are implemented, internal callbacks will be registered via then. This mode comes with the side effect of silencing eventual 'Unhandled errors' on the returned promise.
memoized = memoize(afn, { promise: "then" });
Node.js callback style functions
With the async option we indicate that we memoize an asynchronous (Node.js callback style) function. Operations that result in an error are not cached.
afn = function (a, b, cb) {
  setTimeout(function () {
    cb(null, a + b);
  }, 200);
};
memoized = memoize(afn, { async: true });

memoized(3, 7, function (err, res) {
  memoized(3, 7, function (err, res) {
    // Cache hit
  });
});
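A sketch illustrating that errored results are not cached (failingFn is a hypothetical function that always errors):

var failingFn = function (a, b, cb) {
  setTimeout(function () { cb(new Error("Failed")); }, 100);
};
var memoizedFailingCb = memoize(failingFn, { async: true });

memoizedFailingCb(3, 7, function (err, res) {
  // err is set; the errored result was not cached
  memoizedFailingCb(3, 7, function (err, res) {
    // failingFn was invoked again
  });
});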
Memoizing methods
When we are defining a prototype, we may want to define a method that will memoize its results in relation to each instance. A basic way to obtain that would be:
var Foo = function () {
  this.bar = memoize(this.bar.bind(this));
  // ... constructor logic
};
Foo.prototype.bar = function () {
  // ... method logic
};
There's a lazy methods descriptor generator provided:
var d = require("d");
var memoizeMethods = require("memoizee/methods");

var Foo = function () {
  // ... constructor logic
};
Object.defineProperties(Foo.prototype, memoizeMethods({
  bar: d(function () {
    // ... method logic
  })
}));
WeakMap based configurations
In this case the memoization cache is not bound to the memoized function (which we may want to keep forever), but to the objects for which the given results were generated.
This mode works only for functions whose first argument is expected to be an object.
It can be combined with other options mentioned across this documentation. However, due to WeakMap specifics, a global clear is not possible.
var memoize = require("memoizee/weak");

var memoized = memoize(function (obj) { /* ... */ });

var obj = { foo: true, bar: false };
memoized(obj);
memoized(obj); // Cache hit
Cache handling
Manual clean up:
Delete data for a particular call:
memoized.delete("foo", true);
Arguments passed to delete are treated with the same rules as input arguments passed to the function.
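For instance (a sketch assuming a memoization configured with resolvers: [String]), delete applies the same resolvers before looking up the entry:

var memoizedStr = memoize(fn, { length: 1, resolvers: [String] });

memoizedStr(12); // Cached under '12'
memoizedStr.delete("12"); // Removes that same entry, as 12 and "12" both resolve to '12'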
Clear all cached data:
memoized.clear();
Expire cache after given period of time
With the maxAge option we can ensure that the cache for a given call is cleared after a predefined period of time (in milliseconds):
memoized = memoize(fn, { maxAge: 1000 }); // 1 second

memoized("foo", 3);
memoized("foo", 3); // Cache hit
setTimeout(function () {
  memoized("foo", 3); // No longer in cache, re-executed
  memoized("foo", 3); // Cache hit
}, 2000);
Additionally, we may ask to pre-fetch, in the background, a value that is about to expire. Pre-fetch is invoked only if the value is accessed close to its expiry date. By default the access needs to occur within the last 33% of the maxAge timespan before expiry:
memoized = memoize(fn, { maxAge: 1000, preFetch: true }); // Defaults to 0.33

memoized("foo", 3);
memoized("foo", 3); // Cache hit

setTimeout(function () {
  memoized("foo", 3); // Cache hit
}, 500);

setTimeout(function () {
  memoized("foo", 3); // Cache hit, silently pre-fetched in next tick
}, 800);

setTimeout(function () {
  memoized("foo", 3); // Cache hit
}, 1300);
Pre-fetch timespan can be customized:
memoized = memoize(fn, { maxAge: 1000, preFetch: 0.6 });

memoized("foo", 3);
memoized("foo", 3); // Cache hit

setTimeout(function () {
  memoized("foo", 3); // Cache hit, silently pre-fetched in next tick
}, 500);

setTimeout(function () {
  memoized("foo", 3); // Cache hit
}, 1300);
Thanks @puzrin for helpful suggestions concerning this functionality
Reference counter
We can track the number of references returned from the cache and delete them manually. When the last reference is cleared, the cache is purged automatically:
memoized = memoize(fn, { refCounter: true });

memoized("foo", 3); // refs: 1
memoized("foo", 3); // Cache hit, refs: 2
memoized("foo", 3); // Cache hit, refs: 3
memoized.deleteRef("foo", 3); // refs: 2
memoized.deleteRef("foo", 3); // refs: 1
memoized.deleteRef("foo", 3); // refs: 0, Cache purged for 'foo', 3
memoized("foo", 3); // Re-executed, refs: 1
Limiting cache size
With the max option you can limit the cache size; it's backed by an LRU algorithm provided by the low-level lru-queue utility.
The size relates purely to the count of results we want to keep in the cache; it doesn't relate to the memory cost associated with cached values (but such a feature is likely to be introduced in a next version of memoizee).
memoized = memoize(fn, { max: 2 });

memoized("foo", 3);
memoized("bar", 7);
memoized("foo", 3); // Cache hit
memoized("bar", 7); // Cache hit
memoized("lorem", 11); // Cache cleared for 'foo', 3
memoized("bar", 7); // Cache hit
memoized("foo", 3); // Re-executed, Cache cleared for 'lorem', 11
memoized("lorem", 11); // Re-executed, Cache cleared for 'bar', 7
memoized("foo", 3); // Cache hit
memoized("bar", 7); // Re-executed, Cache cleared for 'lorem', 11
Registering dispose callback
You can register a callback to be called on each value removed from the cache:
memoized = memoize(fn, { dispose: function (value) { /* ... */ } });

var foo3 = memoized("foo", 3);
var bar7 = memoized("bar", 7);
memoized.delete("foo", 3); // Dispose called with foo3 value
memoized.delete("bar", 7); // Dispose called with bar7 value
Benchmarks
Simple benchmark tests can be found in the benchmark folder. Currently it's just a plain simple calculation of Fibonacci sequences. To run it you need to install the other test candidates:
$ npm install underscore lodash lru-cache secondary-cache
Example output taken under Node v0.10.35 on 2011 MBP Pro:
Fibonacci 3000 x10:
1: 15ms Memoizee (primitive mode)
2: 15ms Underscore
3: 18ms lru-cache LRU (max: 1000)
4: 21ms secondary-cache LRU (max: 1000)
5: 37ms Lo-dash
6: 62ms Memoizee (primitive mode) LRU (max: 1000)
7: 163ms Memoizee (object mode) LRU (max: 1000)
8: 195ms Memoizee (object mode)
Profiling & Statistics
If you want to see how much you benefit from memoization, or just check that memoization works as expected, loading the profile module gives access to all the valuable information.
The module needs to be imported before any memoization (that we want to track) is configured. Mind also that running the profile module affects performance; it's best not to use it in a production environment.
var memProfile = require("memoizee/profile");
Access statistics at any time:
memProfile.statistics;         // Statistics accessible for programmatic use
console.log(memProfile.log()); // Output statistics data in readable form
Example console output:
------------------------------------------------------------
Memoize statistics:
Init Cache %Cache Source location
11604 35682 75.46 (all)
2112 19901 90.41 at /Users/medikoo/Projects/_packages/next/lib/fs/is-ignored.js:276:12
2108 9087 81.17 at /Users/medikoo/Projects/_packages/next/lib/fs/is-ignored.js:293:10
6687 2772 29.31 at /Users/medikoo/Projects/_packages/next/lib/fs/watch.js:125:9
697 3922 84.91 at /Users/medikoo/Projects/_packages/next/lib/fs/is-ignored.js:277:15
------------------------------------------------------------
- Init – Initial hits
- Cache – Cache hits
- %Cache – Percentage of cache hits (out of all function calls)
- Source location – Where in the source code given memoization was initialized
Tests
$ npm test