Install: @travetto/cache
npm install @travetto/cache
# or
yarn add @travetto/cache
Provides a foundational structure for integrating caching at the method level. This allows for easy extension with a variety of providers, and is usable with or without Dependency Injection. The code aims to handle common, basic use cases.
The cache module requires an Expiry-capable model to provide functionality for reading and writing expirable content. You can use any of the existing providers to serve as your Expiry, or you can roll your own.
Install: provider
npm install @travetto/model-{provider}
# or
yarn add @travetto/model-{provider}
Currently, the following packages provide Expiry support:
- DynamoDB Model Support - @travetto/model-dynamodb
- Elasticsearch Model Source - @travetto/model-elasticsearch
- MongoDB Model Support - @travetto/model-mongo
- Redis Model Support - @travetto/model-redis
- S3 Model Support - @travetto/model-s3
- PostgreSQL Model Service - @travetto/model-postgres
- MySQL Model Service - @travetto/model-mysql
- SQLite Model Service - @travetto/model-sqlite
- Memory Model Support - @travetto/model-memory
- File Model Support - @travetto/model-file
The caching framework provides method decorators that enable simple use cases. One requirement of the caching decorators is that method arguments and return values need to be serializable into JSON. Other data types are not currently supported and would require either using the caching services directly, or specifying serialization/deserialization routines in the cache config.
Additionally, to use the decorators you will need a CacheService object accessible on the class instance. This can be dependency injected or manually constructed. The decorators resolve the field at the time of method execution, which decouples construction of your class from construction of the cache.
@Cache is a decorator that will cache all successful results, keyed by a computation based on the method arguments. Given the desire for supporting remote caches (e.g. redis, memcached), only asynchronous methods are supported.
Code: Using decorators to cache expensive async call
import { MemoryModelService } from '@travetto/model-memory';
import { Cache, CacheService } from '@travetto/cache';
async function request(url: string): Promise<string> {
let value: string;
// ...fetch content
return value!;
}
export class Worker {
myCache = new CacheService(
new MemoryModelService({ namespace: '' })
);
@Cache('myCache', '1s')
async calculateExpensiveResult(expression: string): Promise<string> {
const value = await request(`https://google.com?q=${expression}`);
return value;
}
}
The @Cache decorator supports configurations on:
- name - the field name of the current class which points to the desired cache source.
- config - the additional/optional config options, on a per invocation basis.
- keySpace - the key space within the cache. Defaults to class name plus method name.
- key - the function used to compute the cache key from the inputs. Defaults to all params, JSON.stringify-ied.
- params - the function used to determine the inputs for computing the cache key. This is an easier place to start when defining which parameters matter for caching. Defaults to all inputs.
- maxAge - the number of milliseconds the value will be held before the cache entry is considered invalid. By default, values live infinitely.
- extendOnAccess - determines if the cache timeout should be extended on access. This only applies to cache values that have specified a maxAge.
- serialize - the function to execute before storing a cacheable value. This allows for any custom data modification needed to persist as a string properly.
- reinstate - the function to execute on return of a cached value. This allows for any necessary operations to conform to expected output (e.g. re-establishing class instances). This should rarely be needed, as method return values should naturally serialize to/from JSON and be usable either way.
Additionally, there is support for planned eviction via the @EvictCache decorator. On successful execution of a method with this decorator, the matching keySpace/key value will be evicted from the cache. This requires coordination between multiple methods, which must use the same keySpace and key to compute the expected key.
Code: Using decorators to cache/evict user access
import { MemoryModelService } from '@travetto/model-memory';
import { Cache, EvictCache, CacheService } from '@travetto/cache';
class User { }
export class UserService {
myCache = new CacheService(new MemoryModelService({ namespace: '' }));
database: {
lookupUser(id: string): Promise<User>;
deleteUser(id: string): Promise<void>;
updateUser(user: User): Promise<User>;
};
@Cache('myCache', '5m', { keySpace: 'user.id' })
async getUser(id: string): Promise<User> {
return this.database.lookupUser(id);
}
@EvictCache('myCache', { keySpace: 'user.id', params: user => [user.id] })
async updateUser(user: User): Promise<void> {
await this.database.updateUser(user);
}
@EvictCache('myCache', { keySpace: 'user.id' })
async deleteUser(userId: string): Promise<void> {
await this.database.deleteUser(userId);
}
}
By design, the CacheService relies solely on the Data Modeling Support module, specifically on the Expiry contract. This combines basic support for CRUD with knowledge of how to manage expirable content. Any model service that honors these contracts is a valid candidate to power the CacheService. The CacheService expects the model service to be registered using the CacheModelⲐ symbol:
Code: Registering a Custom Model Source
import { InjectableFactory } from '@travetto/di';
import { ModelExpirySupport } from '@travetto/model';
import { MemoryModelService } from '@travetto/model-memory';
import { CacheModelⲐ } from '@travetto/cache';
class CustomAwesomeModelService extends MemoryModelService {
// Implement all the things
}
class Config {
@InjectableFactory(CacheModelⲐ)
static getModel(): ModelExpirySupport {
return new CustomAwesomeModelService({});
}
}