{"id":13513933,"url":"https://github.com/orgrim/pg_back","last_synced_at":"2026-01-17T21:32:34.960Z","repository":{"id":14945183,"uuid":"17669793","full_name":"orgrim/pg_back","owner":"orgrim","description":"Simple backup tool for PostgreSQL","archived":false,"fork":false,"pushed_at":"2025-12-19T14:20:35.000Z","size":510,"stargazers_count":562,"open_issues_count":8,"forks_count":59,"subscribers_count":14,"default_branch":"master","last_synced_at":"2026-01-12T15:15:02.698Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"","language":"Go","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/orgrim.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2014-03-12T14:30:18.000Z","updated_at":"2025-12-30T19:02:16.000Z","dependencies_parsed_at":"2023-11-11T22:23:19.509Z","dependency_job_id":"81f2d6a0-8f80-4c0e-81c5-5c32ff2bd33b","html_url":"https://github.com/orgrim/pg_back","commit_stats":null,"previous_names":[],"tags_count":21,"template":false,"template_full_name":null,"purl":"pkg:github/orgrim/pg_back","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/orgrim%2Fpg_back","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/orgrim%2Fpg_back/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/orgrim%2Fpg_back/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/orgrim%2Fpg_back/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/orgrim","download_url":"https://codeload.github.com/
orgrim/pg_back/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/orgrim%2Fpg_back/sbom","scorecard":{"id":712154,"data":{"date":"2025-08-11","repo":{"name":"github.com/orgrim/pg_back","commit":"b26b6a73f1f73ceb2fb264b3d781e63521200fc0"},"scorecard":{"version":"v5.2.1-40-gf6ed084d","commit":"f6ed084d17c9236477efd66e5b258b9d4cc7b389"},"score":5.4,"checks":[{"name":"Code-Review","score":6,"reason":"Found 8/13 approved changesets -- score normalized to 6","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#code-review"}},{"name":"Maintained","score":6,"reason":"7 commit(s) and 1 issue activity found in the last 90 days -- score normalized to 6","details":null,"documentation":{"short":"Determines if the project is \"actively maintained\".","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#maintained"}},{"name":"Dangerous-Workflow","score":10,"reason":"no dangerous workflow patterns detected","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous patterns.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#dangerous-workflow"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source repository.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#binary-artifacts"}},{"name":"Token-Permissions","score":0,"reason":"detected GitHub workflow tokens with excessive permissions","details":["Info: jobLevel 'contents' permission set to 'read': .github/workflows/docker.yml:40","Warn: no 
topLevel permission defined: .github/workflows/docker.yml:1","Warn: no topLevel permission defined: .github/workflows/go.yml:1","Info: no jobLevel write permissions found"],"documentation":{"short":"Determines if the project's workflows follow the principle of least privilege.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#token-permissions"}},{"name":"Pinned-Dependencies","score":0,"reason":"dependency not pinned by hash detected -- score normalized to 0","details":["Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/docker.yml:26: update your workflow using https://app.stepsecurity.io/secureworkflow/orgrim/pg_back/docker.yml/master?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/docker.yml:29: update your workflow using https://app.stepsecurity.io/secureworkflow/orgrim/pg_back/docker.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/docker.yml:45: update your workflow using https://app.stepsecurity.io/secureworkflow/orgrim/pg_back/docker.yml/master?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/docker.yml:53: update your workflow using https://app.stepsecurity.io/secureworkflow/orgrim/pg_back/docker.yml/master?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/docker.yml:56: update your workflow using https://app.stepsecurity.io/secureworkflow/orgrim/pg_back/docker.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/docker.yml:59: update your workflow using https://app.stepsecurity.io/secureworkflow/orgrim/pg_back/docker.yml/master?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/docker.yml:67: update your workflow using https://app.stepsecurity.io/secureworkflow/orgrim/pg_back/docker.yml/master?enable=pin","Warn: third-party GitHubAction not pinned by hash: 
.github/workflows/docker.yml:75: update your workflow using https://app.stepsecurity.io/secureworkflow/orgrim/pg_back/docker.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/go.yml:18: update your workflow using https://app.stepsecurity.io/secureworkflow/orgrim/pg_back/go.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/go.yml:21: update your workflow using https://app.stepsecurity.io/secureworkflow/orgrim/pg_back/go.yml/master?enable=pin","Warn: containerImage not pinned by hash: Dockerfile:4","Warn: containerImage not pinned by hash: Dockerfile:21","Info:   0 out of   5 GitHub-owned GitHubAction dependencies pinned","Info:   0 out of   5 third-party GitHubAction dependencies pinned","Info:   0 out of   2 containerImage dependencies pinned"],"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#pinned-dependencies"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#cii-best-practices"}},{"name":"Security-Policy","score":0,"reason":"security policy file not detected","details":["Warn: no security policy file detected","Warn: no security file to analyze","Warn: no security file to analyze","Warn: no security file to analyze"],"documentation":{"short":"Determines if the project has published a security policy.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#security-policy"}},{"name":"License","score":9,"reason":"license file detected","details":["Info: project has a license file: 
LICENSE:0","Warn: project license file does not contain an FSF or OSI license."],"documentation":{"short":"Determines if the project has defined a license.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#license"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations found"],"documentation":{"short":"Determines if the project uses fuzzing.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#fuzzing"}},{"name":"Signed-Releases","score":8,"reason":"5 out of the last 5 releases have a total of 5 signed artifacts.","details":["Info: signed release artifact: checksums.txt.asc: https://github.com/orgrim/pg_back/releases/tag/v2.6.0","Info: signed release artifact: checksums.txt.asc: https://github.com/orgrim/pg_back/releases/tag/v2.5.0","Info: signed release artifact: checksums.txt.asc: https://github.com/orgrim/pg_back/releases/tag/v2.4.0","Info: signed release artifact: checksums.txt.asc: https://github.com/orgrim/pg_back/releases/tag/v2.3.1","Info: signed release artifact: checksums.txt.asc: https://github.com/orgrim/pg_back/releases/tag/v2.3.0","Warn: release artifact v2.6.0 does not have provenance: https://api.github.com/repos/orgrim/pg_back/releases/235448609","Warn: release artifact v2.5.0 does not have provenance: https://api.github.com/repos/orgrim/pg_back/releases/173981975","Warn: release artifact v2.4.0 does not have provenance: https://api.github.com/repos/orgrim/pg_back/releases/161665851","Warn: release artifact v2.3.1 does not have provenance: https://api.github.com/repos/orgrim/pg_back/releases/160442452","Warn: release artifact v2.3.0 does not have provenance: https://api.github.com/repos/orgrim/pg_back/releases/154370443"],"documentation":{"short":"Determines if the project cryptographically signs release 
artifacts.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#signed-releases"}},{"name":"Branch-Protection","score":-1,"reason":"internal error: error during branchesHandler.setup: internal error: githubv4.Query: Resource not accessible by integration","details":null,"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection settings.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#branch-protection"}},{"name":"Packaging","score":10,"reason":"packaging workflow detected","details":["Info: Project packages its releases by way of GitHub Actions.: .github/workflows/docker.yml:37"],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and uninstall.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#packaging"}},{"name":"SAST","score":0,"reason":"SAST tool is not run on all commits -- score normalized to 0","details":["Warn: 0 commits out of 28 are checked with a SAST tool"],"documentation":{"short":"Determines if the project uses static code analysis.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#sast"}},{"name":"Vulnerabilities","score":8,"reason":"2 existing vulnerabilities detected","details":["Warn: Project is vulnerable to: GO-2022-0635","Warn: Project is vulnerable to: GO-2022-0646"],"documentation":{"short":"Determines if the project has open, known unfixed 
vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#vulnerabilities"}}]},"last_synced_at":"2025-08-22T08:27:21.635Z","repository_id":14945183,"created_at":"2025-08-22T08:27:21.640Z","updated_at":"2025-08-22T08:27:21.640Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":28518627,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-01-17T18:55:29.170Z","status":"ssl_error","status_checked_at":"2026-01-17T18:55:03.375Z","response_time":85,"last_error":"SSL_read: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-01T05:00:40.555Z","updated_at":"2026-01-17T21:32:34.910Z","avatar_url":"https://github.com/orgrim.png","language":"Go","readme":"# pg_back dumps databases from PostgreSQL\n\n## Description\n\npg_back is a dump tool for PostgreSQL. The goal is to dump all or some\ndatabases with globals at once in the format you want, because a simple call to\npg_dumpall only dumps databases in the plain SQL format.\n\nBehind the scenes, pg_back uses `pg_dumpall` to dump role and tablespace\ndefinitions, and `pg_dump` to dump all or each selected database to a separate\nfile in the custom format. It also extracts database-level ACLs and\nconfiguration that are not dumped by pg_dump versions older than 11. 
Finally, it dumps all configuration\noptions of the PostgreSQL instance.\n\n## Features\n\n* Dump all or a list of databases\n* Dump all but a list of excluded databases\n* Include database templates\n* Choose the format of the dump for each database\n* Limit dumped schemas and tables\n* Dump databases concurrently\n* Compute a SHA checksum of each dump\n* Pre-backup and post-backup hooks\n* Purge based on age and number of dumps to keep\n* Dump from a hot standby by pausing replication replay\n* Encrypt and decrypt dumps and other files\n* Upload and download dumps to S3, GCS, Azure, B2 or a remote host with SFTP\n\n## Install\n\nA compiled binary is available from the [Github repository](https://github.com/orgrim/pg_back/releases).\n\nThe binary only needs `pg_dumpall` and `pg_dump`.\n\n## Install from source\n\n```\ngo install github.com/orgrim/pg_back@latest\n```\n\nUse `make` to build and install from source (you need go 1.20 or above).\n\nAs an alternative, the following *docker* command downloads, compiles and puts `pg_back`\nin the current directory:\n\n```\ndocker run -u $(id -u) --rm -v \"$PWD\":/go/bin golang:1.20 -v \"$PWD/.cache\":/.cache \\\n    go install github.com/orgrim/pg_back@latest\n```\n\n## Minimum versions\n\nThe minimum version of `pg_dump` et `pg_dumpall` required to dump is 8.4. The\noldest tested server version of PostgreSQL is 8.2.\n\n## Usage\n\n### Basic usage\n\nUse the `--help` or `-?` to print the list of available options. 
To dump all\ndatabases, you only need to give the proper connection options to the PostgreSQL\ninstance and the path to a writable directory to store the dump files.\n\nIf default and command line options are not enough, a configuration file\nmay be provided with `-c \u003cconfigfilename\u003e` (see [pg_back.conf](pg_back.conf)).\n(Note: see below to convert configuration files from version 1.)\n\nIf the default output directory `/var/backups/postgresql` does not exist or has\nimproper ownership for your user, use `-b` to give the path where the files\nshould be stored. The path may contain the `{dbname}` keyword, which is\nreplaced by the name of the database being dumped; this permits dumping each\ndatabase in its own directory.\n\nTo connect to PostgreSQL, use the `-h`, `-p`, `-U` and `-d` options. If you\nneed lesser-known connection options such as `sslcert` and `sslkey`, you can\ngive a `keyword=value` libpq connection string like `pg_dump` and `pg_dumpall`\naccept with their `-d` option. When using connection strings, backslashes must\nbe escaped (doubled), as well as literal single quotes (used as string\ndelimiters).\n\nThe other command line options let you tweak what is dumped, purged, and how\nit is done. These options can be put in a configuration file. The command line\noptions override configuration options.\n\n### Per-database configuration\n\nPer-database configuration can only be done with a configuration file. The\nconfiguration file uses the `ini` format: global options are in an unnamed\nsection at the top of the file, and database-specific options are in a section\nnamed after the database. Per-database options override global options of the\nconfiguration file.\n\nIn database sections of the configuration file, a list of schemas or tables can\nbe excluded from or selected in the dump. When using these options, the rules\nof the `-t`, `-T`, `-n` and `-N` options of `pg_dump`, including their pattern\nrules, apply. 
See the\n[documentation of `pg_dump`][pg_dump].\n\nWhen no database names are given on the command line, all databases except\ntemplates are dumped. To include templates, use `--with-templates` (`-T`); if\ntemplates are included from the configuration file, `--without-templates`\nforces their exclusion.\n\nDatabases can be excluded with `--exclude-dbs` (`-D`), which is a\ncomma-separated list of database names. If a database is listed on the command\nline and part of the exclusion list, exclusion wins.\n\nMultiple databases can be dumped at the same time by using a number of\nconcurrent `pg_dump` jobs greater than 1 with the `--jobs` (`-j`) option. It is\ndifferent from `--parallel-backup-jobs` (`-J`), which controls the number of\nsessions used by `pg_dump` with the directory format.\n\n### Checksums\n\nA checksum of all output files is computed in a separate file when\n`--checksum-algo` (`-S`) is different from `none`. The possible algorithms are:\n`sha1`, `sha224`, `sha256`, `sha384` and `sha512`. The checksum file is in the\nformat required by _shaXsum_ (`sha1sum`, `sha256sum`, etc.) tools for checking\nwith their `-c` option.\n\n### Purge\n\nOlder dumps can be removed based on their age with `--purge-older-than` (`-P`),\nin days if no unit is given. Allowed units are the ones understood by the\n`time.ParseDuration` Go function: \"s\" (seconds), \"m\" (minutes), \"h\" (hours) and\nso on.\n\nA number of dump files to keep when purging can also be specified with\n`--purge-min-keep` (`-K`), with the special value `all` to keep everything, thus\navoiding file removal completely. When both `--purge-older-than` and\n`--purge-min-keep` are used, the minimum number of dumps to keep is enforced\nbefore old dumps are removed. This avoids removing all dumps when the time\ninterval is too small.\n\n### Hooks\n\nA command can be run before taking dumps with `--pre-backup-hook`, and after\nwith `--post-backup-hook`. 
The commands are executed directly, not by a shell,\nrespecting single- and double-quoted values. Even if some operation fails, the\npost-backup hook is executed when present.\n\n### Encryption\n\nAll the files produced by a run of pg_back can be encrypted using age\n(\u003chttps://age-encryption.org/\u003e, an easy-to-use tool that does authenticated\nencryption of files). Encryption can be done with a passphrase or a key pair.\n\nTo encrypt files with a passphrase, use the `--encrypt` option along with the\n`--cipher-pass` option or `PGBK_CIPHER_PASS` environment variable to specify\nthe passphrase. When `encrypt` is set to true in the configuration file, the\n`--no-encrypt` option allows disabling encryption on the command line. By\ndefault, unencrypted source files are removed when they are successfully\nencrypted. Use the `--encrypt-keep-src` option to keep them or\n`--no-encrypt-keep-src` to force-remove them and override the configuration\nfile. If required, checksums of encrypted files are computed.\n\nWhen using keys, use `--cipher-public-key` to encrypt and\n`--cipher-private-key` to decrypt. The values are passed as strings in Bech32\nencoding. The easiest way to create them is to use the `age` tool.\n\nEncrypted files can be decrypted with the correct passphrase or the private key\nand the `--decrypt` option. When `--decrypt` is present on the command line,\ndumps are not performed; instead, files are decrypted. Files can also be\ndecrypted with the `age` tool, independently. Decryption of multiple files can\nbe parallelized with the `-j` option. Arguments on the command line (database\nnames when dumping) are used as shell globs to choose which files to decrypt.\n\n**Please note** that files are written to disk unencrypted in the backup\ndirectory before encryption, and deleted after the encryption operation is\ncomplete. 
This\nmeans that the host running `pg_back` must be secure enough to ensure privacy\nof the backup directory and connections to PostgreSQL.\n\n### Upload to remote locations\n\nAll files produced by a run can be uploaded to a remote location by setting the\n`--upload` option to a value different from `none`. The possible values are\n`s3`, `sftp`, `gcs`, `azure`, `b2` or `none`.\n\nWhen set to `s3`, files are uploaded to AWS S3. The `--s3-*` family of options\ncan be used to tweak the access to the bucket. The `--s3-profile` option only\nreads credentials and basic configuration; S3-specific options are not used.\n\nWhen set to `sftp`, files are uploaded to a remote host using SFTP. The\n`--sftp-*` family of options can be used to set up access to the host. The\n`PGBK_SSH_PASS` environment variable sets the password or decrypts the private\nkey (identity file); it is used only when `--sftp-password` is not set (either\nin the configuration file or on the command line). When an identity file is\nprovided, the password is used to decrypt it and the password authentication\nmethod is not tried with the server. The only SSH authentication methods used\nare password and publickey. If an SSH agent is available, it is always used.\n\nWhen set to `gcs`, files are uploaded to Google Cloud Storage. The `--gcs-*`\nfamily of options can be used to set up access to the bucket. When\n`--gcs-keyfile` is empty, the `GOOGLE_APPLICATION_CREDENTIALS` environment\nvariable is used.\n\nWhen set to `azure`, files are uploaded to Azure Blob Storage. The `--azure-*`\nfamily of options can be used to set up access to the container. The name of\nthe container is mandatory. If the account name is left empty, an anonymous\nconnection is used and the endpoint is used directly: this allows the use of a\nfull URL to the container with a SAS token. When an account is provided, the\nURL is built by prepending the container name to the endpoint, and the scheme\nis always `https`. The default endpoint is `blob.core.windows.net`. 
The\n`AZURE_STORAGE_ACCOUNT` and `AZURE_STORAGE_KEY` environment variables are used\nwhen `--azure-account` and `--azure-key` are not set (on the command line or\nwith the corresponding options in the configuration file).\n\nWARNING: Azure support is not guaranteed because there are no free solutions\nfor testing it.\n\nWhen set to `b2`, files are uploaded to Backblaze B2. The `--b2-*` family of\noptions can be used to tweak the access to the bucket.\n`--b2-concurrent-connections` can be used to upload files through parallel HTTP\nconnections.\n\nThe `--upload-prefix` option can be used to place the files in a remote\ndirectory, as most cloud storage services treat prefixes as directories. The\nfilename and the prefix are separated by a `/` in the remote location.\n\nThe `--purge-remote` option can be set to `yes` to apply the same purge policy\nto the remote location as to the local directory.\n\nWhen files are encrypted and their unencrypted source is kept, only encrypted\nfiles are uploaded.\n\n### Downloading from remote locations\n\nPreviously uploaded files can be downloaded using the `--download` option with\na value different from `none`, similarly to `--upload`. The options to set up\nremote access are the same as for `--upload`.\n\nIt is possible to only list remote files with `--list-remote` with a value\ndifferent from `none`, similarly to `--upload` and `--download`.\n\nWhen listing or downloading files, dumps are not performed. Arguments on the\ncommand line (database names when dumping) are used as shell globs to\nselect/filter files.\n\nIf `--download` is used at the same time as `--decrypt`, files are downloaded\nfirst, then files matching globs are decrypted.\n\n## Restoring files\n\nThe following files are created:\n\n* `pg_globals_{date}.sql`: definition of roles and tablespaces, dumped with\n  `pg_dumpall -g`. This file is restored with `psql`.\n* `pg_settings_{date}.out`: the list of server parameters found in the\n  configuration files (9.5+) or in the `pg_settings` view. 
They must be put\n  back by hand.\n* `ident_file_{date}.out`: the full contents of the `pg_ident.conf` file,\n  usually located in the data directory.\n* `hba_file_{date}.out`: the full contents of the `pg_hba.conf` file, usually\n  located in the data directory.\n* `{dbname}_{date}.createdb.sql`: an SQL file containing the definition of the\n  database and parameters set at the database or \"role in database\" level. It\n  is mostly useful when using a version of `pg_dump` older than 11. It is\n  restored with `psql`.\n* `{dbname}_{date}.{d,sql,dump,tar}`: the dump of the database, with a suffix\n  depending on its format. If the format is plain, the dump is suffixed with\n  `sql` and must be restored with `psql`. Otherwise, it must be restored with\n  `pg_restore`.\n\nWhen checksums are computed, for each file described above, a text file of the\nsame name with a suffix naming the checksum algorithm is produced.\n\nWhen files are encrypted, they are suffixed with `age` and must be decrypted\nfirst; see the Encryption section above. When checksums are computed and\nencryption is required, checksum files are encrypted and encrypted files are\nchecksummed.\n\nTo sum up, when restoring:\n\n1. Create the roles and tablespaces by executing `pg_globals_{date}.sql` with `psql`.\n2. Create the database with `{dbname}_{date}.createdb.sql` if necessary.\n3. Restore the database(s) with `pg_restore` (use `-C` to create the database) or `psql`.\n\n## Managing the configuration file\n\nThe previous v1 configuration files are not compatible with pg_back v2.\n\nGive the path of the v1 configuration file to the `--convert-legacy-config`\ncommand line option, and pg_back will try its best to convert it to the v2\nformat. 
Redirect the output to the new configuration file:\n\n```\npg_back --convert-legacy-config pg_back1.conf \u003e pg_back2.conf\n```\n\nThe default configuration file can be printed with the `--print-default-config`\ncommand line option.\n\nIn some environments (especially Debian), you may have to add `host = /var/run/postgresql`\nto override the default `/tmp` host.\n\n## Testing\n\nUse the Makefile or regular `go test`.\n\nTo run SQL tests requiring a PostgreSQL instance:\n\n1. run `initdb` in some directory\n2. start `postgres`\n3. load `testdata/fixture.sql` with `psql`\n4. use `go test` or `make test` with the `PGBK_TEST_CONNINFO` environment\n   variable set to a libpq connection string pointing to the instance. For\n   example:\n\n```\nPGBK_TEST_CONNINFO=\"host=/tmp port=14651\" make test\n```\n\n## Contributing\n\nPlease use the issue and pull request features of GitHub.\n\n## License\n\nPostgreSQL - see the [LICENSE][license] file.\n\n[license]: https://github.com/orgrim/pg_back/blob/master/LICENSE\n\n[pg_dump]: https://www.postgresql.org/docs/current/app-pgdump.html\n","funding_links":[],"categories":["Go"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Forgrim%2Fpg_back","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Forgrim%2Fpg_back","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Forgrim%2Fpg_back/lists"}