Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/gumlet/cloudinary-to-s3
Utility to copy files from Cloudinary to an S3 bucket.
- Host: GitHub
- URL: https://github.com/gumlet/cloudinary-to-s3
- Owner: gumlet
- Created: 2020-07-05T08:08:17.000Z (over 4 years ago)
- Default Branch: master
- Last Pushed: 2021-04-29T13:23:56.000Z (over 3 years ago)
- Last Synced: 2024-03-27T03:55:00.200Z (8 months ago)
- Language: Python
- Size: 14.6 KB
- Stars: 0
- Watchers: 5
- Forks: 1
- Open Issues: 0
- Metadata Files:
- Readme: README.md
README
# Cloudinary-to-S3
Utility script to copy files from Cloudinary to an S3 bucket.
## Requirements
- python >= 3.6
- python-pip3

## Installation
To install the necessary packages for the script, run the following command:
`pip3 install -r requirements.txt`
## Usage
This script requires the following arguments:
- Cloudinary Cloud Name
- Cloudinary API key
- Cloudinary API Secret
- S3 Endpoint URL
- S3 Access key
- S3 secret key
- S3 Bucket name

For example:
```bash
python3 storage_migration.py \
  --cloudinary_cloud_name cloudinary_cloud_name \
--cloudinary_api_key cloudinary_api_key \
--cloudinary_api_secret cloudinary_api_secret \
--s3_endpoint_url s3_endpoint_url \
--s3_access_key_id s3_access_key \
--s3_secret_access_key s3_secret_key \
--s3_bucket_name s3_bucket_name
```

If you are re-running this script, use the following argument to speed up the migration:
`--resuming_migration True`
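The README does not document how resuming works internally; a plausible approach, sketched below with a hypothetical `filter_pending` helper (not the script's actual code), is to skip any file already present in the destination bucket:

```python
def filter_pending(all_files, migrated_keys, resuming=True):
    """Return only the files that still need to be copied.

    `all_files` is the list of source keys; `migrated_keys` is the set
    of keys already found in the S3 bucket. With `resuming` disabled,
    everything is copied again.
    """
    if not resuming:
        return list(all_files)
    return [f for f in all_files if f not in migrated_keys]

# With two of three files already migrated, only one remains to copy.
pending = filter_pending(["a.jpg", "b.jpg", "c.jpg"], {"a.jpg", "b.jpg"})
```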
To change the maximum number of workers (which run the migration in parallel), use the following argument (default: 25):
`--max_worker 50`
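The worker pool can be pictured as a thread pool mapped over the file list; a minimal sketch, where `copy_file` and the returned mapping are stand-ins for the real Cloudinary-download / S3-upload round trip:

```python
from concurrent.futures import ThreadPoolExecutor

def migrate(files, max_worker=25):
    """Copy each file concurrently using up to `max_worker` threads."""
    results = {}

    def copy_file(name):
        # Placeholder for: download from Cloudinary, upload to S3.
        results[name] = f"s3://bucket/{name}"

    with ThreadPoolExecutor(max_workers=max_worker) as pool:
        # map submits one copy task per file; exiting the `with`
        # block waits for all of them to finish.
        list(pool.map(copy_file, files))
    return results
```

A higher `--max_worker` helps because each copy is I/O-bound, but very large values can hit Cloudinary or S3 rate limits.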
For more info, run the following command:
```bash
python3 storage_migration.py --help
```
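Cloudinary's Admin API returns resources in pages linked by a `next_cursor`, so a migration script along these lines has to loop until the cursor is exhausted. The sketch below uses a stand-in `fetch_page` callable instead of the real API call, which is an assumption about how this script gathers its file list:

```python
def list_all(fetch_page):
    """Collect resources across all pages.

    `fetch_page(cursor)` returns a dict with a 'resources' list and an
    optional 'next_cursor', mirroring the shape of Cloudinary Admin API
    responses.
    """
    resources, cursor = [], None
    while True:
        page = fetch_page(cursor)
        resources.extend(page["resources"])
        cursor = page.get("next_cursor")
        if not cursor:
            return resources

# Stand-in for paginated API responses: two pages, then no cursor.
pages = {None: {"resources": ["a.jpg"], "next_cursor": "c1"},
         "c1": {"resources": ["b.jpg"]}}
all_items = list_all(lambda cursor: pages[cursor])
```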