{"id":19964048,"url":"https://github.com/pilotpirxie/database-backup-worker","last_synced_at":"2026-02-02T08:31:48.616Z","repository":{"id":127271289,"uuid":"497300001","full_name":"pilotpirxie/database-backup-worker","owner":"pilotpirxie","description":"💾 Dump MySQL, PostgreSQL or ClickHouse database and upload to S3","archived":false,"fork":false,"pushed_at":"2024-10-31T14:45:13.000Z","size":142,"stargazers_count":3,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-07-02T00:02:50.910Z","etag":null,"topics":["aws-s3","backup","backup-tool","backup-utility","clickhouse","clickhouse-database","database","mysql","mysql-database","node","nodejs","postgres","postgresql","s3","s3-bucket","s3-storage","sql","worker"],"latest_commit_sha":null,"homepage":"","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/pilotpirxie.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2022-05-28T11:37:12.000Z","updated_at":"2024-10-31T14:45:17.000Z","dependencies_parsed_at":null,"dependency_job_id":"f8cfee88-a0af-404e-9969-aab81e357580","html_url":"https://github.com/pilotpirxie/database-backup-worker","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/pilotpirxie/database-backup-worker","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pilotpirxie%2Fdatabase-backup-worker","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pilotpirxie%2Fdatabase-backup-wor
ker/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pilotpirxie%2Fdatabase-backup-worker/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pilotpirxie%2Fdatabase-backup-worker/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/pilotpirxie","download_url":"https://codeload.github.com/pilotpirxie/database-backup-worker/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pilotpirxie%2Fdatabase-backup-worker/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":29007958,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-02-02T08:20:25.892Z","status":"ssl_error","status_checked_at":"2026-02-02T08:20:04.345Z","response_time":58,"last_error":"SSL_connect returned=1 errno=0 peeraddr=140.82.121.5:443 state=error: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["aws-s3","backup","backup-tool","backup-utility","clickhouse","clickhouse-database","database","mysql","mysql-database","node","nodejs","postgres","postgresql","s3","s3-bucket","s3-storage","sql","worker"],"created_at":"2024-11-13T02:18:58.626Z","updated_at":"2026-02-02T08:31:48.600Z","avatar_url":"https://github.com/pilotpirxie.png","language":"TypeScript","readme":"# database-backup-worker\n\nBackend worker to periodically dump full **MySQL**, **PostgreSQL** or **ClickHouse** databases 
and upload to S3. You can set up multiple databases on a single instance.\n\n### Getting started\n\n```shell\ngit clone https://github.com/pilotpirxie/database-backup-worker.git\ncd database-backup-worker\nyarn\nyarn build\nyarn start\n```\n\n### Setup\n\nThe config in the `.env` file should look like the following:\n\n```shell\n# Number of databases to back up\n# For each database, set environment variables prefixed with DB_\n# The index for each database config is 0-based\nDB_NUMBER=1\n\n# Whether to run a backup on start without\n# waiting for the cron time\nRUN_ON_START=false\n\n# Cron time pattern, see https://crontab.guru/\n# e.g. \"0 */6 * * *\" means \"At minute 0 past every 6th hour.\"\n# Leave the default if you are unsure.\nDB_CRON_PATTERN_\u003cINDEX\u003e=\"0 */6 * * *\"\n\n# Database type: \"mysql\", \"clickhouse\" or \"postgresql\"\nDB_TYPE_\u003cINDEX\u003e=\n\n# Database host\n# For MySQL, use just the host\n# For ClickHouse, prefix with the https:// or http:// protocol\nDB_HOST_\u003cINDEX\u003e=\n\n# Database port\nDB_PORT_\u003cINDEX\u003e=\n\n# Database name\nDB_NAME_\u003cINDEX\u003e=\n\n# Database user name\nDB_USER_\u003cINDEX\u003e=\n\n# Database user password\nDB_PASS_\u003cINDEX\u003e=\n\n# Tables to skip, comma-separated\n# e.g. 
"table1,table2\"\nDB_SKIP_TABLES_\u003cINDEX\u003e=\n\n# S3-compatible bucket name\nS3_BUCKET=\n\n# S3 access key\nS3_ACCESS_KEY=\n\n# S3 secret key\nS3_SECRET_KEY=\n\n# Endpoint of the service, without protocol\nS3_ENDPOINT=\n\n# Use a secure connection (recommended)\nS3_SSL=true\n\n# Use path-style addressing, e.g. https://s3.amazonaws.com/BUCKET/KEY\nS3_FORCE_PATH_STYLE=true\n```\n\n### License\n\n```\nMIT\n```\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpilotpirxie%2Fdatabase-backup-worker","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fpilotpirxie%2Fdatabase-backup-worker","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpilotpirxie%2Fdatabase-backup-worker/lists"}