Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/savitar-hub/postgres-backup
Automate the creation of Postgres backups across multiple cloud bucket providers and regions
- Host: GitHub
- URL: https://github.com/savitar-hub/postgres-backup
- Owner: Savitar-Hub
- License: mit
- Created: 2023-01-23T21:02:36.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2023-01-26T20:52:24.000Z (almost 2 years ago)
- Last Synced: 2024-10-10T08:05:09.871Z (about 1 month ago)
- Topics: backup, google-cloud, postgres, postgresql, postgresql-database
- Language: Python
- Homepage:
- Size: 81.1 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 4
Metadata Files:
- Readme: README.md
- Contributing: .github/CONTRIBUTING.md
- License: LICENSE.txt
README
# Backup Postgres Database
[![Downloads](https://static.pepy.tech/personalized-badge/postgres-backup?period=month&units=none&left_color=grey&right_color=blue&left_text=Downloads)](https://pepy.tech/project/postgres-backup) ![Version](https://img.shields.io/badge/version-0.3.3-blue) ![Python-Version](https://img.shields.io/badge/python-3.9-blue) ![issues](https://img.shields.io/github/issues/Nil-Andreu/postgres-backup) ![PyPI - Status](https://img.shields.io/pypi/status/postgres-backup) ![License](https://img.shields.io/github/license/Nil-Andreu/postgres-backup)
## Basic Usage
This simple Python package lets you easily create backups of Postgres databases.
You can upload the backups to cloud storage buckets, for example by running the package from a cron job (a sketch is shown at the end of this section).

```python
from postgres_backup import Backup

# Instantiate the backup object with the Postgres database_uri
backup = Backup()

# Create the file for the backup
backup.create()
```

You should set the environment variable `DATABASE_URL` to the URI of the Postgres database. This URI has the following structure: `db:engine:[//[user[:password]@][host][:port]/][dbname]`.

You can also specify a list of the tables for which you want to create the backup:
```python
backup.create(table_names=['table1', 'table2', ...])
```
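As a minimal sketch of the cron-based flow mentioned above; the connection-string format and the script path below are placeholders, not something defined by this package:

```bash
# Example connection string (placeholder values)
export DATABASE_URL="postgresql://user:password@localhost:5432/mydb"

# Hypothetical crontab entry: run a backup script every night at 02:00
# 0 2 * * * DATABASE_URL="postgresql://user:password@localhost:5432/mydb" /usr/bin/python3 /opt/backups/backup_job.py
```

The script itself would simply call `Backup().create()` (and optionally `upload`) as shown above.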
## Why?

This package has proven to work well for small to mid-sized databases. It lets you store your database backups without relying on a single cloud provider or region.
## Bucket Storage
The package provides the ability to store those backups in cloud storage buckets.
### Google Cloud Storage
To use this functionality, you need to install the package's optional dependencies:
```bash
pip3 install "postgres-backup[gcs]"
```
This will also install the `google` package. Then, once the backup has been created, continue with:
```python
from postgres_backup.schemas import CloudProviders

# Upload it to Google Cloud Storage
backup.upload(
    provider=CloudProviders.gcs.value,
)
```

Here, `google_cloud_credentials` is a dictionary with the key-value pairs of the client API keys:
```python
google_cloud_credentials = {
    "type": "service_account",
    "project_id": "xxx-saas",
    "private_key_id": "xxxxxxxx",
    "private_key": "-----BEGIN PRIVATE KEY-----\nxxxxxxxxxx\n-----END PRIVATE KEY-----\n",
    "client_email": "[email protected]",
    "client_id": "xxx",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://oauth2.googleapis.com/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/xxx%xxx-saas.iam.gserviceaccount.com"
}
```

It is recommended to provide each key as an environment variable:
- GOOGLE_CLOUD_TYPE -> type
- GOOGLE_CLOUD_PROJECT_ID -> project_id
- GOOGLE_CLOUD_PRIVATE_KEY_ID -> private_key_id
- GOOGLE_CLOUD_PRIVATE_KEY -> private_key
- GOOGLE_CLOUD_CLIENT_EMAIL -> client_email
- GOOGLE_CLOUD_CLIENT_ID -> client_id
- GOOGLE_CLOUD_AUTH_URI -> auth_uri
- GOOGLE_CLOUD_TOKEN_URI -> token_uri
- GOOGLE_CLOUD_AUTH_PROVIDER_X509_CERT_URL -> auth_provider_x509_cert_url
- GOOGLE_CLOUD_CLIENT_X509_CERT_URL -> client_x509_cert_url

You also need `PROJECT_NAME` and `BUCKET_NAME` for the Google Cloud bucket, and finally `DATABASE_URL` for the Postgres database.
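As a sketch, the credentials dictionary above could be assembled from those environment variables. The private-key unescaping is an assumption about how the key is stored, and how the dictionary is ultimately passed to the package is not shown in this README.

```python
import os

# Build the service-account credentials from the environment variables
# listed above, following the same mapping.
google_cloud_credentials = {
    "type": os.environ["GOOGLE_CLOUD_TYPE"],
    "project_id": os.environ["GOOGLE_CLOUD_PROJECT_ID"],
    "private_key_id": os.environ["GOOGLE_CLOUD_PRIVATE_KEY_ID"],
    # Private keys stored in env vars often carry escaped newlines (assumption)
    "private_key": os.environ["GOOGLE_CLOUD_PRIVATE_KEY"].replace("\\n", "\n"),
    "client_email": os.environ["GOOGLE_CLOUD_CLIENT_EMAIL"],
    "client_id": os.environ["GOOGLE_CLOUD_CLIENT_ID"],
    "auth_uri": os.environ["GOOGLE_CLOUD_AUTH_URI"],
    "token_uri": os.environ["GOOGLE_CLOUD_TOKEN_URI"],
    "auth_provider_x509_cert_url": os.environ["GOOGLE_CLOUD_AUTH_PROVIDER_X509_CERT_URL"],
    "client_x509_cert_url": os.environ["GOOGLE_CLOUD_CLIENT_X509_CERT_URL"],
}
```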
If you do not already have a bucket created for storing the backups, you can pass additional parameters to create it:
```python
from postgres_backup.schemas import CloudStorageType, CloudProviders

backup.upload(
    provider=CloudProviders.gcs.value,
    bucket_name=bucket_name,
    create_bucket=True,
    storage_class=CloudStorageType.NEARLINE.value
)
```
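Putting the Google Cloud Storage pieces together, a minimal end-to-end sketch might look like this; the bucket name is a placeholder, and the environment variables from the previous section are assumed to be set:

```python
from postgres_backup import Backup
from postgres_backup.schemas import CloudProviders, CloudStorageType

# Create the backup file (reads the Postgres URI from DATABASE_URL)
backup = Backup()
backup.create()

# Upload it to Google Cloud Storage, creating the bucket if it does not exist
backup.upload(
    provider=CloudProviders.gcs.value,
    bucket_name="my-postgres-backups",  # placeholder bucket name
    create_bucket=True,
    storage_class=CloudStorageType.NEARLINE.value,
)
```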
### Amazon Web Services

To upload to AWS after the backup has been created, first install the optional dependencies:
```bash
pip3 install "postgres-backup[aws]"
```

After that, you can use the `upload` method of the `Backup` object as follows:
```python
from postgres_backup.schemas import CloudProviders

# Upload it to AWS storage
backup.upload(
    provider=CloudProviders.aws.value,
)
```

This requires the environment variables `AWS_SERVER_PUBLIC_KEY`, `AWS_SERVER_PRIVATE_KEY` and `REGION_NAME` to be set.
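A minimal sketch of that environment, with placeholder values to replace with your own credentials and region:

```bash
# Placeholder values; substitute your own AWS credentials and region
export AWS_SERVER_PUBLIC_KEY="AKIAXXXXXXXXXXXXXXXX"
export AWS_SERVER_PRIVATE_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export REGION_NAME="eu-west-1"
export DATABASE_URL="postgresql://user:password@localhost:5432/mydb"
```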