# pushup - copy files to S3
pushup is a Transform stream to which you write the filenames of files you want copied to S3. If you export your S3 credentials to your process environment, you can omit the options object.
```js
var pushup = require('pushup')
var push = pushup()

push.write('/some/file')
push.write('/some/other/file')
push.end()
```
Options let you control compression, caching, and the root of the path in the bucket.
```js
function gzip () {
  return { '.xml': false }
}
function ttl () {
  return { 'file': 3600 * 24 * 30 }
}
function opts () {
  return { gzip: gzip(), ttl: ttl(), root: '/some' }
}

var pushup = require('pushup')
var push = pushup(opts())

push.write('/some/file')
push.write('/some/other/file')
push.end()
```
The `gzip` option is an optional bag of settings to toggle gzip compression by filename or extension, as `Boolean()` (defaults to `true`). pushup compresses text files before they get uploaded and sets the proper `Content-Encoding` headers. The following would compress all but XML:
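A settings object matching that description might look like this (the variable name `gzip` here simply mirrors the option key):

```javascript
// Extensions default to true, so marking '.xml' as false
// compresses every text file except XML.
var gzip = { '.xml': false }
```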
The `ttl` option holds optional settings to configure `Cache-Control` headers by filename or extension, in (max-age) delta-seconds. For example:

```js
{ '.html': 3600 * 24 * 30
, '.css': 3600 * 24 * 30
, 'hot.html': 3600
}
```

A ttl of `3600 * 24 * 30` seconds, for instance, results in a `Cache-Control: max-age=2592000` header.
With the `root` option you can control the root of the replicated file tree in your bucket. For example:
```js
var pushup = require('pushup')
var push = pushup({ root: '/some' })

push.write('/some/file')
push.write('/some/other/file')
push.end()
```
This would copy the files to `/file` and `/other/file` in your S3 bucket. If `root` is `undefined`, or your defined `root` is not part of the given file path, the entire path will be replicated.
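A minimal sketch of the path mapping described above (illustrative only, not pushup's actual implementation; the helper name `key` is made up):

```javascript
// Map a local file path to a bucket key by stripping the root
// prefix, falling back to the full path when root does not apply.
function key (root, file) {
  // undefined root, or root not a prefix: replicate the entire path
  if (!root || file.indexOf(root) !== 0) return file
  return file.slice(root.length)
}

console.log(key('/some', '/some/other/file')) // '/other/file'
console.log(key(undefined, '/some/file'))     // '/some/file'
```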
The options object is also passed on to the underlying Transform stream constructor. pushup itself is a Transform stream that consumes filenames and emits the paths of files copied to S3 using knox.
With npm do:

```
$ npm install pushup
```