uber-cache

Async in-memory cache that sets the interface for a number of uber-* caching engines: Memcached, Redis, MongoDB, LevelDB.

uber-memoize - Async memoize for uber-* engines

If you want to know more about memoization, read http://en.wikipedia.org/wiki/Memoization

npm install uber-memoize

Most of the useful caching engines have an async interface, due to the evented I/O they require, so it is necessary to use a callback style when manipulating the cache.

 
var ttl = 1000 // one second, since set() takes the TTL in milliseconds
  , someData = { some: 'data' }
  ;

cache.set('some-key', someData, ttl, function(error, cachedItem) {
  if (error) {
    // Handle the error 
    return false
  }
 
  console.log('Cache written key:' + cachedItem.key + ' value:' + cachedItem.value)
})
 
// Later that day, but before the TTL. 
cache.get('some-key', function(error, cachedItem) {
  if (error) {
   // Handle the error 
    return false
  }
 
  console.log('Cache from key:' + cachedItem.key + ' value:' + cachedItem.value)
})
 
  • set(key, value, ttl, callback)

    ttl milliseconds until expiry. Optional

  • get(key, callback)

  • del(key, callback)

  • clear(callback)

  • size(callback)

  • memoize(id, fn, ttl)

    Returns a function that will cache the results of a slow function for ttl milliseconds, or until the LRU clears it out (see the sketch after this list).

  • miss(key)

    Emitted when a get(key) fails to find a valid cached item with that key.

  • hit(key, value, ttl)

    Emitted when a get(key) finds a valid item in the cache.

  • stale(key, value, ttl)

    Emitted when a get(key) can’t find a valid item but a stale item still exists.
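
For example, a slow lookup can be wrapped with memoize() and the cache events observed like this. This is a minimal sketch: slowQuery, the callback shape of the memoized function, and the use of the standard EventEmitter on() method are assumptions for illustration; only memoize(id, fn, ttl) and the event names come from the list above.

// A sketch, not part of the documented API: slowQuery stands in for any
// slow async function that takes a callback as its last argument.
function slowQuery(userId, callback) {
  setTimeout(function() {
    callback(null, { id: userId, name: 'Example User' })
  }, 500)
}

// Cache results under the id 'user-query' for 10 seconds.
var cachedQuery = cache.memoize('user-query', slowQuery, 10000)

// Assuming the cache is an EventEmitter, as the emitted events above imply.
cache.on('miss', function(key) {
  console.log('Cache miss for key: ' + key)
})

cache.on('hit', function(key, value, ttl) {
  console.log('Cache hit for key: ' + key)
})

cachedQuery(42, function(error, user) {
  if (error) {
    // Handle the error
    return false
  }
  console.log('First call (slow): ' + user.name)

  // A second call within the TTL should be answered from the cache.
  cachedQuery(42, function(error, user) {
    console.log('Second call (cached): ' + user.name)
  })
})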

The uber-cache engines are decoupled from the main project. Unlike other modules that force you to install dependencies for things you're not going to use, Uber Cache engines are self-contained modules that you include in your project and pass to Uber Cache on instantiation (see the sketch after the engine list below).

Currently the following engines are available:

  • Memory - This special case is the base cache class and is included in the main module. It stores your cache in the memory of the current process. This is suitable for small applications, but may cause cache invalidation problems if you start using cluster.
  • Redis - http://github.com/serby/uber-cache-redis - Redis backed cache. TTL can't be less than 1 second due to a limitation of Redis TTLs.
  • MongoDB - http://github.com/serby/uber-cache-mongodb - MongoDB backed cache.
  • LevelDB - http://github.com/serby/uber-cache-leveldb - LevelDB backed cache.
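
As a rough sketch of what that looks like in practice (the constructor names and the exact way an engine is passed in are assumptions for illustration, not the documented API; see each engine's README linked above for its real interface):

// A sketch only - constructor names and options are assumptions.
var UberCache = require('uber-cache')          // includes the memory engine
var RedisEngine = require('uber-cache-redis')  // installed separately

// With no engine, the in-process memory store is used.
var memoryCache = new UberCache()

// A backed engine is created in your project and handed to Uber Cache
// on instantiation, then used through the same interface shown above.
var redisCache = new UberCache(new RedisEngine())

redisCache.set('some-key', { some: 'data' }, 1000, function(error, cachedItem) {
  // ...
})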

Paul Serby - follow me on Twitter

Licensed under the New BSD License