file backed local work queues
```js
var filepile = require('filepile');

// create a new pile called 'emails'
var pile = filepile('emails', function(details, done) {
    // details is whatever job data you provide
    // call done() when done
    done();
});

// add things to the pile
// they will be processed by the function above
pile({ to: 'firstname.lastname@example.org', body: 'hello' });
```
Each filepile allows multiple producers but only one consumer. If a named pile already has a consumer on the same system, additional consumers will not process jobs. If the active consumer dies, one of the other consumers will take over.
```js
var cluster = require('cluster');

if (cluster.isMaster) {
    for (var i = 0; i < 4; ++i) {
        cluster.fork();
    }
    return;
}

var filepile = require('filepile');

// only one of the processes will invoke this function
var pile = filepile('emails', function(details, done) {
    console.log(details);
    done();
});

// all of the worker processes will generate "work"
setInterval(function() {
    pile({ foo: 'bar' });
}, 1000);
```
All producers write JSON files to a folder under `/tmp`. These files are read and processed by the single consumer. A lockfile ensures that there is only one consumer at a time. The consumer listens for new files with `fs.watch` and processes them as they appear.
filepile is not meant to work across machines at this time; it only guarantees a single consumer among multiple processes on the same machine.
```
npm install filepile
```