Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/chasenicholl/boto3m
A boto3 multiprocessing extension
- Host: GitHub
- URL: https://github.com/chasenicholl/boto3m
- Owner: chasenicholl
- Created: 2017-08-04T01:52:49.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2017-08-20T12:30:00.000Z (over 7 years ago)
- Last Synced: 2024-10-10T14:21:50.008Z (3 months ago)
- Topics: boto3, multiprocessing
- Language: Python
- Homepage:
- Size: 9.77 KB
- Stars: 1
- Watchers: 1
- Forks: 1
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
Awesome Lists containing this project
README
# Boto 3M(ultiprocessing) extension.
Boto3m(ultiprocessing) is a simple extension to the Boto3 AWS SDK that adds multiprocess downloading (uploading coming soon) of S3 objects by folder or by key.
### Requirements
Obviously `boto3` is required. Python >= 3.5 is recommended. (Come on people, let's move on from 2.7.)
```
$ pip install boto3
```

### Installation
```
$ git clone [email protected]:chasenicholl/boto3m.git
$ cd boto3m && python3 setup.py install
```

### Configuring
Configuration follows boto3's pattern. See http://boto3.readthedocs.io/en/latest/guide/configuration.html.
Additionally, when using boto3m, two optional environment variables may be set.
```
# Number of parallel worker processes boto3m can use.
# Defaults to the system's CPU count.
export BOTO3M_WORKERS=8

# The target bucket. This can also be passed as a method argument.
export BOTO3M_BUCKET=some_bucket_on_s3
```
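For reference, the same variables can also be set from Python before boto3m is used, via the standard library's `os.environ` (a minimal sketch; the variable names come from above, everything else is illustrative):

```python
import os

# Illustrative only: set the optional boto3m environment variables
# before constructing S3M, so the library can pick them up.
os.environ['BOTO3M_WORKERS'] = '8'                  # environment variables are strings
os.environ['BOTO3M_BUCKET'] = 'some_bucket_on_s3'   # default target bucket
```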
### Usage Example
```python
import logging

from boto3m.s3 import S3M

# Set log level to DEBUG to see output.
logging.getLogger('boto3m').setLevel(logging.DEBUG)

# 1. Download by S3 folder.
# This will automatically find all keys within the remote folder.
# Returns a list of the downloaded files' local locations.
files = S3M().download('some/folder/on/s3/',
                       bucket='my-cool-s3-bucket')

# 2. Download by S3 folder to a SPECIFIC local destination.
# This will automatically find all keys within the remote folder
# and download them to the given destination.
# Returns a list of the downloaded files' local locations.
files = S3M().download('some/folder/on/s3/',
                       dest='/tmp/',
                       bucket='my-cool-s3-bucket')

# 3. Download by S3 keys.
# This will download specific S3 keys.
# Returns a list of the downloaded files' local locations.
files = S3M().download(['file1.jpg', 'file2.jpg', 'file3.jpg'],
                       bucket='my-cool-s3-bucket')
```