Bulk S3 Uploader
- Host: GitHub
- URL: https://github.com/dustin/s3up
- Owner: dustin
- License: bsd-3-clause
- Created: 2020-12-27T03:25:59.000Z (almost 4 years ago)
- Default Branch: main
- Last Pushed: 2024-10-11T07:27:19.000Z (3 months ago)
- Last Synced: 2024-11-11T13:03:23.939Z (about 1 month ago)
- Language: Haskell
- Size: 73.2 KB
- Stars: 1
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: ChangeLog.md
- License: LICENSE
# s3up
s3up is a commandline tool that provides resumable multipart uploads
of files to Amazon S3.

## Usage
First, you create a multipart upload. For example, to create the object
`s3/obj` in the S3 bucket `my.s3.bucket` with the contents of
`/some/file`, you first create the multipart upload:

```
s3up create --bucket my.s3.bucket /some/file s3/obj
```
Create any other uploads you wish to define the same way, and then run
all the outstanding uploads with the `upload` command:

```
s3up upload
```
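Each queued file is uploaded in fixed-size chunks that S3 tracks as parts (see the `-s` option below). As a rough illustration of the arithmetic, with made-up sizes rather than actual s3up output:

```shell
# How many S3 parts a queued file yields at a given chunk size
# (ceiling division). Illustrative only; s3up computes this internally.
file_bytes=$(( 20 * 1024 * 1024 * 1024 ))  # a hypothetical 20 GiB file
chunk_bytes=$(( 6 * 1024 * 1024 ))         # the 6 MB default chunk size
parts=$(( (file_bytes + chunk_bytes - 1) / chunk_bytes ))
echo "$parts parts"
```

Because each part is recorded as it completes, an interrupted upload only needs to resend the parts that never finished.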
## Commandline Reference
### create
The `create` command creates a multipart upload at S3 and records the
current state.

The `-s` option specifies the chunk size. The default of 6 MB should
be enough in general, but you can go larger if you have a very large
file you need to upload. A chunk size must be at least 5 MB to make
Amazon happy.

### upload
The `upload` command processes all outstanding uploads.
Files are processed in order of which have the least work to do (i.e.,
the ones that would finish soonest) and each individual file is
processed concurrently as limited by the `-u` option.

### list
The `list` command lists in-progress multi-part upload operations for
a single bucket from S3's perspective.

### abort
The `abort` command aborts a multi-part upload at S3 and from s3up's
local concept of in-progress uploads.
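The "least work first" ordering described under `upload` amounts to sorting the pending uploads by how many parts each has left. A small illustrative sketch with hypothetical state (not s3up's actual bookkeeping):

```shell
# Pending uploads paired with remaining part counts (made-up data),
# sorted so the upload closest to finishing runs first.
ordered=$(printf '%s\n' 'backup.tar 3400' 'notes.txt 1' 'video.mp4 120' \
  | sort -k2 -n)
echo "$ordered"
```

Within each file, the `-u` option then bounds how many parts upload concurrently.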