
Cloud Storage Uploader

Uploads (large) data through a pipe to Google Drive.

Very basic; it only works with Google Drive for now.

You can only use stdin / a pipe as the source. There won't be a feature to pass a file as a parameter in the near future.

  $ (sudo) npm install -g csup

Take a quick look at Configuration and Authentication (below) before using it.

  $ csup setup
  $ csup help
  Usage: csup (switch) (-option|--option)
    auth        receives `accessToken` from Google API (interactive)
    setup       setups `clientID` + `clientSecret` (interactive)
    -h --help   displays help
    -n --name   filename for cloud storage    e.g. -n filename.txt
    -t --type   force a specific filetype     e.g. -t 'application/zip'
    -v -vv -vvv verbosity

Process (large) data, pipe it to cloud storage, and get back the downloadUrl on success:

  $ do_something | do_some_other_stuff | … | csup -n output.txt

Filtering a log file, zipping it, and uploading it:

  $ cat /var/log/service.log | grep error | gzip | csup -n "log.gz"
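You can sanity-check the grep | gzip stage locally before involving csup at all (the log contents below are invented for the demo):

```shell
# Build a tiny stand-in log and run the same filter + compress stage:
printf 'all good\nerror: disk full\n' > service.log
grep error service.log | gzip > log.gz
gzip -dc log.gz
# prints: error: disk full
```

Whatever `gzip -dc` shows here is exactly what ends up in the uploaded log.gz.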

If you prefer more verbosity:

  $ cat /var/log/service.log | grep error | gzip | csup -v -n "log.gz"
  0B_aNw316e3FwdXEwXEdCMnlVaW8  log.gz  1.2mb

If you prefer even more verbosity:

  $ cat /var/log/service.log | grep error | gzip | csup -vv -n "log.gz"
  { id: '0B_aNw316e3FwdXEwXEdCMnlVaW8',
    filename: 'log.gz',
    mimeType: 'application/x-gzip; charset=UTF-8',
    downloadUrl: '…?h=…&e=download&gd=true',
    createdDate: '2014-04-14T12:36:40.200Z',
    modifiedDate: '2014-04-14T12:36:40.021Z',
    md5Checksum: '92e4e5e7834dc754186f07c8e868dbf9',
    fileSize: 1234567,
    originalFilename: 'Untitled',
    ownerNames: [ 'OwnerName' ] }
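The `-vv` dump is Node's object-inspect format, not strict JSON, so one quick way to pull out just the downloadUrl is plain sed (the URL below is a made-up stand-in):

```shell
# Extract the downloadUrl field from an inspect-style line (sample input):
line="downloadUrl: 'https://example.com/uc?id=abc&e=download',"
echo "$line" | sed -n "s/.*downloadUrl: '\([^']*\)'.*/\1/p"
# prints: https://example.com/uc?id=abc&e=download
```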

Sending a large video file could look like this:

  $ cat video.mkv | csup

Giving a filename (recommended):

  $ cat james_bond.mkv | csup -n JamesBond.mkv

Forcing a specific filetype (`-n` takes the filename as its argument):

  $ cat manpage.1 | csup -n manpage.1 -t text/troff

With the downloadUrl you can download the file:

  $ csup <downloadUrl> > myfile.txt

Up- and download a file in one step:

  $ cat file.json | csup -n file.json | xargs -0 -I url csup url > downloaded_file.json
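The `xargs -0 -I url` part is what splices the printed downloadUrl into the second csup call: `-I` replaces every occurrence of the token `url` in the command with the incoming line. With `echo` standing in for csup and a made-up URL:

```shell
# -I url: substitute the piped-in line wherever "url" appears in the command
printf 'https://example.com/file\0' | xargs -0 -I url echo "GET url"
# prints: GET https://example.com/file
```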

Tar and compress a folder, encrypt it and send it directly to your Google Drive:

  $ tar cz folder_to_encrypt | openssl enc -aes-256-cbc -e -pass pass:mypass | csup -n backup_$(date +"%Y-%m-%d_%H:%M:%S_%Z").tar.gz.enc
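The `$(date …)` suffix just timestamps the backup name; the format expands like this (the actual values depend on when and where you run it):

```shell
# Timestamp format used in the backup filename above:
date +"%Y-%m-%d_%H:%M:%S_%Z"
# e.g. 2014-04-14_12:36:40_UTC
```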

Decrypting and extracting it again would be:

  $ openssl enc -d -aes-256-cbc -in out.tar.gz.enc -pass pass:mypass | tar xz
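The whole encrypt-then-decrypt cycle can be checked locally before involving Google Drive (folder name and passphrase match the examples above; csup itself is left out of the loop):

```shell
# Local round trip of the tar | openssl pipeline, no upload involved:
mkdir -p folder_to_encrypt restored
printf 'secret\n' > folder_to_encrypt/data.txt
tar cz folder_to_encrypt | openssl enc -aes-256-cbc -e -pass pass:mypass > out.tar.gz.enc
openssl enc -d -aes-256-cbc -in out.tar.gz.enc -pass pass:mypass | tar xz -C restored
cat restored/folder_to_encrypt/data.txt
# prints: secret
```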

I'll figure out a better example in the next few weeks.

According to Google Drive support, you can upload files up to 1 TB in size if you have that much storage space.

MIT License