punkave/s3-upload-missing

Name: s3-upload-missing

Owner: P'unk Avenue

Description: Upload contents of given folder to given s3 path, recursively. If a file already exists, do not upload it.

Created: 2015-12-11 20:01:50.0

Updated: 2015-12-14 02:43:49.0

Pushed: 2015-12-14 01:01:04.0

Homepage: none

Size: 30

Language: JavaScript


README

# Shovel this folder's contents, recursively, up to mybucket.
# If a file is already there, it is not sent again
s3-upload-missing . mybucketname .
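The skip-if-present rule is a set difference: walk the folder recursively, list what the bucket already has, and upload only the keys that are missing. A minimal sketch of that decision, using hypothetical names rather than the tool's actual code:

```javascript
// Hypothetical sketch of the "upload only what is missing" decision.
// localFiles: relative paths found by walking the folder recursively.
// remoteKeys: keys already present in the bucket (e.g. from a list call).
function filesToUpload(localFiles, remoteKeys) {
  const existing = new Set(remoteKeys);
  return localFiles.filter((file) => !existing.has(file));
}
```

Everything returned by this filter gets sent; everything else is skipped without touching S3 again.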

# Do it with verbose output
s3-upload-missing --verbose . mybucketname .

# Specify a prefix in the destination bucket.
# A slash at the end is implied if not provided
s3-upload-missing --verbose . mybucketname uploads
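The implied trailing slash means `uploads` and `uploads/` produce the same keys. A hypothetical helper illustrating how a destination prefix might map to object keys (not the tool's actual code):

```javascript
// Hypothetical mapping from (prefix, relative path) to an S3 key.
// "uploads" and "uploads/" both yield keys like "uploads/file.txt";
// a bare "." destination maps to the bucket root.
function destinationKey(prefix, relativePath) {
  if (prefix === '.' || prefix === '') {
    return relativePath;
  }
  return prefix.replace(/\/$/, '') + '/' + relativePath;
}
```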

# Allow the public to read the files (web-accessible)
s3-upload-missing --verbose . mybucketname uploads --acl=public-read

# If a file is unreadable, chmod it momentarily so we
# can read it, then send it to S3 with "private" permissions.
# Then chmod it back to 000
s3-upload-missing --verbose . mybucketname uploads --acl=public-read --chmod-if-needed

# Also remove any remote files that do not exist locally.
# Use with care
s3-upload-missing . mybucketname . --delete
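`--delete` is the mirror image of the upload decision: remote keys with no matching local file are removed. A hypothetical sketch of what gets selected for deletion (the real tool then issues the deletes, which is why the warning above applies):

```javascript
// Hypothetical sketch of the --delete selection: any key in the bucket
// that no longer corresponds to a local file is a candidate for removal.
function keysToDelete(remoteKeys, localFiles) {
  const local = new Set(localFiles);
  return remoteKeys.filter((key) => !local.has(key));
}
```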

You must populate ~/.aws/credentials with your key and secret, like this:

[default]
aws_access_key_id = xxx
aws_secret_access_key = yyyyyy

TODO: support command line arguments for these as well.

“Why not use s3cmd?” s3cmd works fine, but we have a peculiar need to successfully upload files with permissions 000 and give them the private acl on s3 (the --chmod-if-needed option). This is very useful when transitioning from local files to s3 with uploadfs. Also, s3-upload-missing may be faster.

