Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/password123456/s3-file-back-up
A backup script for AWS S3 that uploads local target files to S3.
- Host: GitHub
- URL: https://github.com/password123456/s3-file-back-up
- Owner: password123456
- License: gpl-3.0
- Created: 2023-06-15T08:32:33.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2023-09-11T00:04:05.000Z (over 1 year ago)
- Last Synced: 2024-11-08T13:41:38.910Z (3 months ago)
- Topics: aws-s3-bucket, bucket-backup, s3, s3-backup, s3-bucket
- Language: Python
- Homepage:
- Size: 26.4 KB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Python backup files to S3
![made-with-python][made-with-python]
![Python Versions][pyversion-button]

[pyversion-button]: https://img.shields.io/pypi/pyversions/Markdown.svg
[made-with-python]: https://img.shields.io/badge/Made%20with-Python-1f425f.svg

# Features
- Uploads a specified local file to Amazon S3.
- Requires the AWS S3 bucket name (`s3_bucket_name`), the destination path in S3 (`s3_key`), and the local file path (`local_file_path`) to be configured before running the script (a condensed sketch appears at the end of this list).
- Uploads the configured local file to the specified path in the S3 bucket and prints a message on a successful upload.
- Create a file named "list.txt" in the following format. Each line holds a log category name and the directory to back up to S3 (a short parsing sketch follows the example):
```text
# vim list.txt
ssh_logs,/data/logs/ssh_logs
system_logs,/var/log
```
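As referenced above, here is a minimal parsing sketch for this format, assuming `list.txt` sits next to the script; `parse_list_file` is an illustrative name, not the repository's own:

```python
# Illustrative sketch (not the repository's exact code): parse list.txt
# into (category, directory) pairs, skipping blank and comment lines.
from pathlib import Path

def parse_list_file(path="list.txt"):
    targets = []
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # ignore blanks and comments such as "# vim list.txt"
        category, directory = line.split(",", 1)
        targets.append((category.strip(), directory.strip()))
    return targets

# e.g. [('ssh_logs', '/data/logs/ssh_logs'), ('system_logs', '/var/log')]
```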
- When a directory is specified for S3 backup, the script also searches its subdirectories, finds the files created yesterday, and adds them to the backup list (see the sketch at the end of this list).
- The script also checks whether each file is already backed up in S3; a file that has already been uploaded is not uploaded again.
- The script separates the files into success and failure categories and outputs the corresponding lists.
- The tqdm library is used to provide a progress bar for the file uploads to S3. This is especially useful for large files, as it allows monitoring the backup progress.
- Once all operations are complete, the script sends the job details to Slack (a webhook sketch follows the preview below).
- Feel free to modify and customize the code to fit your requirements, and let me know if any changes or additional features are needed.
- Giving the repository a "star" helps these code snippets reach others and keeps the improvements coming.
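To make the interplay of these steps concrete, here is a condensed sketch of how the yesterday-file search, the duplicate check, and the tqdm progress bar could be wired together with boto3. The bucket name `my-backup-bucket`, the key layout, and all helper names are illustrative assumptions, not the script's actual code:

```python
# Condensed sketch only; bucket name, key layout, and helper names are
# assumptions for illustration, not the repository's actual implementation.
import os
from datetime import datetime, timedelta

import boto3
from botocore.exceptions import ClientError
from tqdm import tqdm

s3 = boto3.client("s3")
S3_BUCKET_NAME = "my-backup-bucket"  # hypothetical bucket name

def find_yesterdays_files(directory):
    """Walk the directory tree and collect files modified yesterday
    (mtime serves as a practical proxy for creation time on Linux)."""
    yesterday = (datetime.now() - timedelta(days=1)).date()
    matches = []
    for root, _dirs, files in os.walk(directory):
        for name in files:
            path = os.path.join(root, name)
            if datetime.fromtimestamp(os.path.getmtime(path)).date() == yesterday:
                matches.append(path)
    return matches

def already_backed_up(s3_key):
    """Return True if an object with this key already exists in the bucket."""
    try:
        s3.head_object(Bucket=S3_BUCKET_NAME, Key=s3_key)
        return True
    except ClientError:
        return False  # HeadObject raises ClientError (404) for missing keys

def upload_with_progress(local_file_path, s3_key):
    """Upload one file, feeding boto3's byte-count callback into a tqdm bar."""
    size = os.path.getsize(local_file_path)
    with tqdm(total=size, unit="B", unit_scale=True,
              postfix=os.path.basename(local_file_path)) as bar:
        s3.upload_file(local_file_path, S3_BUCKET_NAME, s3_key,
                       Callback=bar.update)

for path in find_yesterdays_files("/var/log"):
    key = f"system_logs/{os.path.basename(path)}"  # assumed key layout
    if not already_backed_up(key):
        upload_with_progress(path, key)
```

Note that `head_object` raises a `ClientError` for a missing key, which is what makes the existence check cheap compared to listing or downloading objects.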
# Preview

```bash
[root@buddy2:/]# python main.py
=======================================================
python backup files uploader to s3
=======================================================
>> Run time : 2023-06-15 17:51:12.269210
>> Category : system_logs
>> File Count : 4 files
>> Target Files : /var/log/2023-06-14/ssh_auth.tar.gz
/var/log/2023-06-14/ssh_xferlog.tar.gz
/var/log/2023-06-14/messages.tar.gz
/var/log/2023-06-14/cron.tar.gz
putting the file into S3.....
100%|███████████████████████████████████| 326k/326k [00:12<00:00, 60.6MB/s, ssh_auth.tar.gz]
putting the file into S3.....
100%|███████████████████████████████████| 326k/326k [00:00<00:00, 2.96MB/s, ssh_xferlog.tar.gz]
putting the file into S3.....
100%|███████████████████████████████████| 100M/100M [00:00<00:00, 1.68kB/s, messages.tar.gz]
putting the file into S3.....
100%|███████████████████████████████████| 64.0/64.0 [00:00<00:00, 1.63kB/s, cron.tar.gz]

>> Backup Result
1,[ok],success,/var/log/2023-06-14/ssh_auth.tar.gz | 326465 bytes
2,[ok],success,/var/log/2023-06-14/ssh_xferlog.tar.gz | 326465 bytes
3,[ok],success,/var/log/2023-06-14/messages.tar.gz | 102400 bytes
4,[ok],success,/var/log/2023-06-14/cron.tar.gz | 64 bytes
------------------------------------->
>> Send to Slack: 200..
...
.....
```
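The README does not show how the Slack call itself is made; a common approach is a POST to an incoming webhook, sketched below with a placeholder URL:

```python
# Hypothetical Slack notification via an incoming webhook; the actual
# script's Slack integration may differ. The webhook URL is a placeholder.
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"

def send_to_slack(success, failure):
    """Post a one-line job summary to Slack and print the HTTP status."""
    text = (f"S3 backup finished: {len(success)} succeeded, "
            f"{len(failure)} failed.")
    resp = requests.post(SLACK_WEBHOOK_URL, json={"text": text}, timeout=10)
    print(f">> Send to Slack: {resp.status_code}")
```

A 200 response from the webhook matches the `>> Send to Slack: 200` line in the preview output.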