https://github.com/chialab/streamlined-backup
Lightweight binary written in Go to upload to S3 a buffer coming from a custom script, for remote backups.
- Host: GitHub
- URL: https://github.com/chialab/streamlined-backup
- Owner: chialab
- License: mit
- Created: 2021-10-07T19:48:29.000Z (over 4 years ago)
- Default Branch: main
- Last Pushed: 2022-03-31T16:18:15.000Z (almost 4 years ago)
- Last Synced: 2025-01-12T21:43:02.257Z (about 1 year ago)
- Topics: backup-script, hacktoberfest, portable-executable, s3
- Language: Go
- Homepage:
- Size: 180 KB
- Stars: 3
- Watchers: 3
- Forks: 0
- Open Issues: 4
Metadata Files:
- Readme: README.md
- License: LICENSE
README
Streamlined backup
==================
[Code coverage on Codecov](https://codecov.io/gh/chialab/streamlined-backup)
This tool is a simple, portable, single binary that runs backup tasks on a fixed
schedule and uploads the resulting artifacts to a remote destination.
The tool does not rely on temporary files: each upload task expects a command
that writes its output to stdout, which is read and uploaded in chunks to the
remote server as it is produced. This makes the tool ideal in cases where the
filesystem is read-only (such as in containers) or where disk space is under pressure.
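To illustrate the streaming approach, here is a minimal Go sketch (not the project's actual code) that pipes a backup command's stdout straight into an S3 upload using the AWS SDK for Go v2 upload manager, which reads and sends the stream in parts as it is produced. The bucket, key and command below are placeholder values.

```go
// Minimal sketch of streaming a command's stdout to S3 without temporary files.
// Placeholder values throughout; this is not the tool's own implementation.
package main

import (
	"context"
	"log"
	"os/exec"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/feature/s3/manager"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

func main() {
	ctx := context.Background()

	// Start the backup command and grab its stdout as a stream.
	cmd := exec.CommandContext(ctx, "/bin/sh", "-c", "mysqldump my_database | bzip2")
	stdout, err := cmd.StdoutPipe()
	if err != nil {
		log.Fatal(err)
	}
	if err := cmd.Start(); err != nil {
		log.Fatal(err)
	}

	cfg, err := config.LoadDefaultConfig(ctx, config.WithRegion("eu-west-1"))
	if err != nil {
		log.Fatal(err)
	}

	// The upload manager consumes the pipe in chunks and uploads each part
	// as soon as it is available, so the dump never touches the filesystem.
	uploader := manager.NewUploader(s3.NewFromConfig(cfg))
	if _, err := uploader.Upload(ctx, &s3.PutObjectInput{
		Bucket: aws.String("example-bucket"),
		Key:    aws.String("my_database/daily/dump-my_database.sql.bz2"),
		Body:   stdout,
	}); err != nil {
		log.Fatal(err)
	}

	if err := cmd.Wait(); err != nil {
		log.Fatal(err)
	}
}
```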
Finally, you can pass one or more Slack webhook URLs to the tool to be notified
when the backup is complete, or when it fails.
Example configuration
---------------------
Configuration can be written in either JSON or TOML format. The binary expects the
path to the configuration file to be passed via the `--config` command line argument.
The following example uses TOML:
```toml
[backup_mysql_database]
schedule = "30 4 * * *"
command = ["/bin/sh", "-c", "mysqldump --single-transaction --column-statistics=0 --set-gtid-purged=off my_database | bzip2"]
[backup_mysql_database.destination]
type = "s3"
[backup_mysql_database.destination.s3]
region = "eu-west-1"
profile = "example-profile"
bucket = "example-bucket"
prefix = "my_database/daily/"
suffix = "-my_database.sql.bz2"
[my_tar_archive]
schedule = "30 4 * * *"
command = ["tar", "-cvjf-", "/path/to/files"]
[my_tar_archive.destination]
type = "s3"
[my_tar_archive.destination.s3]
region = "eu-west-1"
bucket = "example-bucket"
prefix = "my_tar_archive/daily/"
suffix = "-my_tar_archive.tar.bz2"
[my_tar_archive.destination.s3.credentials]
access_key_id = "AKIAIOSFODNN7EXAMPLE"
secret_access_key = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
```
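For reference, the first task above could also be written in JSON. The following is a direct rendering of the same keys, assuming the binary accepts the equivalent JSON structure:

```json
{
  "backup_mysql_database": {
    "schedule": "30 4 * * *",
    "command": ["/bin/sh", "-c", "mysqldump --single-transaction --column-statistics=0 --set-gtid-purged=off my_database | bzip2"],
    "destination": {
      "type": "s3",
      "s3": {
        "region": "eu-west-1",
        "profile": "example-profile",
        "bucket": "example-bucket",
        "prefix": "my_database/daily/",
        "suffix": "-my_database.sql.bz2"
      }
    }
  }
}
```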