# Cache Cow
All the caching modules I found on npm were for caching things in memory, or in an external service like Redis. None of them fit my needs, so I wrote this one.
## Motivation
- Do you have extra hard drive space?
- Store some of your files in the cloud or whatever?
- Want to save bandwidth or time by making fewer fetches to that cloud?
That is the use case this module is designed for. I wanted to use the hard drive to cache files that are in remote storage like Amazon S3.
## Features
- When the cache reaches 90% of the defined `maxsize`, it deletes files until the cache is 70% of `maxsize`.
- Returns streams, not buffers, for faster requests.
- How it chooses which files to delete can be customized.
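The 90%/70% watermark behavior can be sketched roughly like this. This is a simplified model I wrote to illustrate the idea, not the module's actual code; the `files` array and its `size`/`lastAccess` fields are hypothetical:

```javascript
// Simplified model of high/low watermark eviction (a sketch, not CacheCow's
// actual implementation). `files` is a hypothetical in-memory listing of
// cached files, each with a size in bytes and a last-access timestamp.
function evict(files, maxsize) {
  const total = (fs) => fs.reduce((sum, f) => sum + f.size, 0);
  if (total(files) < maxsize * 0.9) return files; // under the high watermark: do nothing
  // Put the best deletion candidates first; the defaults favor
  // least-recently-fetched files.
  const sorted = [...files].sort((a, b) => a.lastAccess - b.lastAccess);
  while (sorted.length > 0 && total(sorted) > maxsize * 0.7) {
    sorted.shift(); // evict the best candidate
  }
  return sorted;
}
```

The two watermarks exist so the cache evicts in batches instead of deleting one file on every write once it fills up.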
## Free support
Email me or create an issue if you need help! I'm throwing this up without a lot of documentation, and I'm very open to feedback and suggestions.
## Example
```coffeescript
# Hypothetical example using CacheCow with an HTTP proxy server
express  = require 'express'
request  = require 'request'
CacheCow = require 'cachecow'

app = express()

cache = new CacheCow
  maxsize: 1024*1024*1024*10 # 10GB
  # Define a function to deal with cache misses
  getter: (filepath, callback) ->
    console.log "Not found in cache"
    stream = request "http://httpstat.us/"
    callback null, stream

# Retrieve something from cache
app.get "/:filepath", (req, res) ->
  cache.get req.params.filepath, (err, stream) ->
    return res.status(404).send(err.message) if err?
    stream.pipe res

# This scans the cache directory to gather file size information for any files
# already present in the cache directory.
cache.init (err) ->
  console.log err if err?
  app.listen 8080, (err) ->
    console.log err if err?
    console.log "started"
```
## Options
```coffeescript
# Default values
cache = new CacheCow
  maxsize: 1024*1024*1 # 1MB
  dir: upath.join os.tmpdir(), 'cachecow'
  getter: -> console.log 'You must define a getter function for cache misses!'
  # Weights used in garbage collection equation
  DeleteByFileSize:   0 # Delete big files first, keep as many files as possible
  DeleteByTotalReads: 0 # Delete rarely used files, keep files that are frequently fetched
  DeleteByLastAccess: 1 # Delete old files, keep files that have been recently fetched
```
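The three `DeleteBy*` weights suggest that each cached file gets a deletion score, and the highest-scoring files are deleted first. Here is one hypothetical way such a score could combine the weights. This is my guess at the shape of the "garbage collection equation", not the module's actual formula, and the `size`, `totalReads`, and `lastAccess` field names are assumptions:

```javascript
// Hypothetical weighted deletion score (an illustration, not CacheCow's
// documented equation). Higher score = better candidate for deletion.
function deletionScore(file, weights, now) {
  return (
    weights.DeleteByFileSize   * file.size +                   // prefer deleting big files
    weights.DeleteByTotalReads * (1 / (1 + file.totalReads)) + // prefer rarely read files
    weights.DeleteByLastAccess * (now - file.lastAccess)       // prefer stale files
  );
}

// With the defaults (0, 0, 1) the score reduces to the age since last access,
// so the least-recently-fetched file is deleted first.
const now = 1000;
const defaults = { DeleteByFileSize: 0, DeleteByTotalReads: 0, DeleteByLastAccess: 1 };
const a = { size: 10, totalReads: 5, lastAccess: 900 }; // recently fetched
const b = { size: 99, totalReads: 1, lastAccess: 100 }; // stale
console.log(deletionScore(a, defaults, now)); // 100
console.log(deletionScore(b, defaults, now)); // 900
```

Setting all three weights to zero-one combinations lets you pick a single eviction policy, while fractional weights would blend them.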