https://github.com/signiant/s3-sync
Syncs multiple paths from S3 with the local container (or host)
- Host: GitHub
- URL: https://github.com/signiant/s3-sync
- Owner: Signiant
- License: MIT
- Created: 2015-06-23T19:48:25.000Z (almost 11 years ago)
- Default Branch: master
- Last Pushed: 2015-12-07T14:12:28.000Z (over 10 years ago)
- Last Synced: 2025-01-16T02:23:40.552Z (over 1 year ago)
- Language: Shell
- Size: 2.93 KB
- Stars: 1
- Watchers: 7
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# s3-sync
Syncs multiple paths from S3 with the local container (or host datavolume)
## Variables
- VERBOSE - enable more logging if set to 1
- FREQUENCY - How often to sync from S3 (in seconds). Default is 300 seconds.
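The entrypoint script itself is not reproduced here, but given these two variables it presumably runs a sync pass, sleeps for `FREQUENCY` seconds, and repeats, emitting extra output only when `VERBOSE=1`. A minimal sketch of that shape (the `log` and `one_pass` names are hypothetical, not from the actual script):

```shell
#!/bin/sh
# Default FREQUENCY to 300 seconds when unset; VERBOSE off by default.
FREQUENCY="${FREQUENCY:-300}"
VERBOSE="${VERBOSE:-0}"

log() {
  # Emit debug output only when VERBOSE=1
  if [ "$VERBOSE" = "1" ]; then echo "s3-sync: $*"; fi
}

one_pass() {
  log "starting sync pass"
  # aws s3 sync for each configured S3/local pair would go here
  log "sync pass complete"
}

# The container would then loop forever:
#   while true; do one_pass; sleep "$FREQUENCY"; done
```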
## Config format
The sync tool reads a config file listing each S3 folder and the local path to sync it to. The format is as follows:
````
( [s3]=S3BUCKET/folder [local]=/my/local/path1 )
( [s3]=S3BUCKET/folder2 [local]=/my/local/path2 )
````
Each S3/local path pair should be on a separate line.
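A config in this shape could be consumed line by line, pulling out the `[s3]=` and `[local]=` fields and handing each pair to `aws s3 sync`. This is a hedged sketch of that idea, not the repository's actual parsing code; the `sync_from_config` function name is made up for illustration:

```shell
#!/bin/sh
# Hypothetical parser sketch: turn each config line of the form
#   ( [s3]=BUCKET/folder [local]=/some/path )
# into an aws s3 sync invocation. The real entrypoint may parse differently.
sync_from_config() {
  config="$1"
  while IFS= read -r line; do
    # Extract the [s3]=... and [local]=... fields with sed
    s3=$(printf '%s\n' "$line" | sed -n 's/.*\[s3\]=\([^ ]*\).*/\1/p')
    dest=$(printf '%s\n' "$line" | sed -n 's/.*\[local\]=\([^ )]*\).*/\1/p')
    [ -n "$s3" ] && [ -n "$dest" ] || continue   # skip malformed lines
    # The real tool would run: aws s3 sync "s3://$s3" "$dest"
    echo "s3://$s3 -> $dest"
  done < "$config"
}
```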
## Example Docker runs
### On AWS (uses role credentials)
This example mounts the file `paths.dat` from the Docker host at `/paths.dat` inside the container and passes that path as the argument to the sync tool. Running on AWS, the container picks up credentials from the instance role.
````
docker run -d -e "FREQUENCY=600" \
-e "VERBOSE=1" \
-v mylocaldir/paths.dat:/paths.dat \
signiant/s3-sync \
/paths.dat
````
### Outside AWS (needs access/secret key)
As above, this mounts `paths.dat` from the Docker host at `/paths.dat` in the container, but because no instance role is available outside AWS, it also supplies an access key and secret key explicitly.
````
docker run -d -e "FREQUENCY=600" \
-e "VERBOSE=1" \
-e "AWS_ACCESS_KEY_ID=MY_ACCESS_KEY" \
-e "AWS_SECRET_ACCESS_KEY=MY_SECRET" \
-v mylocaldir/paths.dat:/paths.dat \
signiant/s3-sync \
/paths.dat
````