{"id":13393519,"url":"https://github.com/amacneil/dbmate","last_synced_at":"2026-02-02T11:53:46.085Z","repository":{"id":37271237,"uuid":"46880661","full_name":"amacneil/dbmate","owner":"amacneil","description":"🚀 A lightweight, framework-agnostic database migration tool.","archived":false,"fork":false,"pushed_at":"2026-01-12T19:54:54.000Z","size":3015,"stargazers_count":6622,"open_issues_count":41,"forks_count":336,"subscribers_count":29,"default_branch":"main","last_synced_at":"2026-01-13T18:26:26.991Z","etag":null,"topics":["clickhouse","cpp","database","database-migrations","database-schema","docker","go","golang","migration","migrations","mysql","nodejs","postgres","postgresql","python","rust","sqlite"],"latest_commit_sha":null,"homepage":"","language":"Go","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/amacneil.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2015-11-25T18:58:30.000Z","updated_at":"2026-01-13T16:17:15.000Z","dependencies_parsed_at":"2023-09-30T01:33:00.143Z","dependency_job_id":"0b90be9e-bcb6-41aa-a154-890a111ddfdc","html_url":"https://github.com/amacneil/dbmate","commit_stats":{"total_commits":364,"total_committers":57,"mean_commits":6.385964912280702,"dds":"0.32692307692307687","last_synced_commit":"1acb12f064feddfd37f97ff1a6c3f672e4a8765f"},"previous_names":["adrianmacneil/dbmate"],"tags_count":62,"template":false,"template_full_name":null,"purl":"pkg:github/amacneil/dbmate","repository_url":"https://repos.e
cosyste.ms/api/v1/hosts/GitHub/repositories/amacneil%2Fdbmate","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/amacneil%2Fdbmate/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/amacneil%2Fdbmate/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/amacneil%2Fdbmate/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/amacneil","download_url":"https://codeload.github.com/amacneil/dbmate/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/amacneil%2Fdbmate/sbom","scorecard":{"id":167893,"data":{"date":"2025-08-11","repo":{"name":"github.com/amacneil/dbmate","commit":"fb55683a63f1990a610c798619966576320c5844"},"scorecard":{"version":"v5.2.1-40-gf6ed084d","commit":"f6ed084d17c9236477efd66e5b258b9d4cc7b389"},"score":5.1,"checks":[{"name":"Dangerous-Workflow","score":10,"reason":"no dangerous workflow patterns detected","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous patterns.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#dangerous-workflow"}},{"name":"Maintained","score":10,"reason":"9 commit(s) and 7 issue activity found in the last 90 days -- score normalized to 10","details":null,"documentation":{"short":"Determines if the project is \"actively maintained\".","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#maintained"}},{"name":"Code-Review","score":5,"reason":"Found 10/18 approved changesets -- score normalized to 5","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#code-review"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found 
in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source repository.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#binary-artifacts"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#cii-best-practices"}},{"name":"Token-Permissions","score":0,"reason":"detected GitHub workflow tokens with excessive permissions","details":["Warn: no topLevel permission defined: .github/workflows/ci.yml:1","Warn: no topLevel permission defined: .github/workflows/release.yml:1","Info: no jobLevel write permissions found"],"documentation":{"short":"Determines if the project's workflows follow the principle of least privilege.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#token-permissions"}},{"name":"Pinned-Dependencies","score":0,"reason":"dependency not pinned by hash detected -- score normalized to 0","details":["Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/ci.yml:60: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/ci.yml:62: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/ci.yml:85: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/ci.yml:91: update your workflow using 
https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/ci.yml:103: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/ci.yml:106: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/ci.yml:109: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/ci.yml:141: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/ci.yml:148: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/ci.yml:157: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/ci.yml:170: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/ci.yml:186: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/ci.yml:188: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/ci.yml:195: update your workflow using 
https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/ci.yml:224: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/ci.yml/main?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/release.yml:12: update your workflow using https://app.stepsecurity.io/secureworkflow/amacneil/dbmate/release.yml/main?enable=pin","Warn: containerImage not pinned by hash: Dockerfile:2","Warn: containerImage not pinned by hash: Dockerfile:30","Warn: downloadThenRun not pinned by hash: Dockerfile:20-21","Info:   0 out of   7 GitHub-owned GitHubAction dependencies pinned","Info:   0 out of   9 third-party GitHubAction dependencies pinned","Info:   0 out of   2 containerImage dependencies pinned","Info:   0 out of   1 downloadThenRun dependencies pinned","Info:   1 out of   1 npmCommand dependencies pinned"],"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#pinned-dependencies"}},{"name":"Security-Policy","score":0,"reason":"security policy file not detected","details":["Warn: no security policy file detected","Warn: no security file to analyze","Warn: no security file to analyze","Warn: no security file to analyze"],"documentation":{"short":"Determines if the project has published a security policy.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#security-policy"}},{"name":"License","score":10,"reason":"license file detected","details":["Info: project has a license file: LICENSE:0","Info: FSF or OSI recognized license: MIT License: LICENSE:0"],"documentation":{"short":"Determines if the project has defined a 
license.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#license"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations found"],"documentation":{"short":"Determines if the project uses fuzzing.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#fuzzing"}},{"name":"Branch-Protection","score":-1,"reason":"internal error: error during branchesHandler.setup: internal error: githubv4.Query: Resource not accessible by integration","details":null,"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection settings.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#branch-protection"}},{"name":"Signed-Releases","score":0,"reason":"Project has not signed or included provenance with any releases.","details":["Warn: release artifact v2.28.0 not signed: https://api.github.com/repos/amacneil/dbmate/releases/234631464","Warn: release artifact v2.27.0 not signed: https://api.github.com/repos/amacneil/dbmate/releases/215537031","Warn: release artifact v2.26.0 not signed: https://api.github.com/repos/amacneil/dbmate/releases/201264999","Warn: release artifact v2.25.0 not signed: https://api.github.com/repos/amacneil/dbmate/releases/196010339","Warn: release artifact v2.24.2 not signed: https://api.github.com/repos/amacneil/dbmate/releases/191721996","Warn: release artifact v2.28.0 does not have provenance: https://api.github.com/repos/amacneil/dbmate/releases/234631464","Warn: release artifact v2.27.0 does not have provenance: https://api.github.com/repos/amacneil/dbmate/releases/215537031","Warn: release artifact v2.26.0 does not have provenance: https://api.github.com/repos/amacneil/dbmate/releases/201264999","Warn: release artifact v2.25.0 does not have provenance: 
https://api.github.com/repos/amacneil/dbmate/releases/196010339","Warn: release artifact v2.24.2 does not have provenance: https://api.github.com/repos/amacneil/dbmate/releases/191721996"],"documentation":{"short":"Determines if the project cryptographically signs release artifacts.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#signed-releases"}},{"name":"Packaging","score":10,"reason":"packaging workflow detected","details":["Info: Project packages its releases by way of GitHub Actions.: .github/workflows/ci.yml:98"],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and uninstall.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#packaging"}},{"name":"SAST","score":0,"reason":"SAST tool is not run on all commits -- score normalized to 0","details":["Warn: 0 commits out of 30 are checked with a SAST tool"],"documentation":{"short":"Determines if the project uses static code analysis.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#sast"}},{"name":"Vulnerabilities","score":10,"reason":"0 existing vulnerabilities detected","details":null,"documentation":{"short":"Determines if the project has open, known unfixed 
vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#vulnerabilities"}}]},"last_synced_at":"2025-08-16T15:26:25.372Z","repository_id":37271237,"created_at":"2025-08-16T15:26:25.372Z","updated_at":"2025-08-16T15:26:25.372Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":28408709,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-01-14T01:52:23.358Z","status":"online","status_checked_at":"2026-01-14T02:00:06.678Z","response_time":107,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["clickhouse","cpp","database","database-migrations","database-schema","docker","go","golang","migration","migrations","mysql","nodejs","postgres","postgresql","python","rust","sqlite"],"created_at":"2024-07-30T17:00:54.939Z","updated_at":"2026-01-16T03:27:27.440Z","avatar_url":"https://github.com/amacneil.png","language":"Go","readme":"# Dbmate\n\n[![Release](https://img.shields.io/github/release/amacneil/dbmate.svg)](https://github.com/amacneil/dbmate/releases)\n[![Go Report](https://goreportcard.com/badge/github.com/amacneil/dbmate)](https://goreportcard.com/report/github.com/amacneil/dbmate)\n[![Reference](https://img.shields.io/badge/go.dev-reference-blue?logo=go\u0026logoColor=white)](https://pkg.go.dev/github.com/amacneil/dbmate/v2/pkg/dbmate)\n\nDbmate is a database migration tool that will keep your database schema in sync across multiple developers 
and your production servers.\n\nIt is a standalone command line tool that can be used with Go, Node.js, Python, Ruby, PHP, Rust, C++, or any other language or framework you are using to write database-backed applications. This is especially helpful if you are writing multiple services in different languages, and want to maintain some sanity with consistent development tools.\n\nFor a comparison between dbmate and other popular database schema migration tools, please see [Alternatives](#alternatives).\n\n## Table of Contents\n\n- [Features](#features)\n- [Installation](#installation)\n- [Commands](#commands)\n  - [Command Line Options](#command-line-options)\n- [Usage](#usage)\n  - [Connecting to the Database](#connecting-to-the-database)\n    - [PostgreSQL](#postgresql)\n    - [MySQL](#mysql)\n    - [SQLite](#sqlite)\n    - [ClickHouse](#clickhouse)\n    - [BigQuery](#bigquery)\n    - [Spanner](#spanner)\n  - [Creating Migrations](#creating-migrations)\n  - [Running Migrations](#running-migrations)\n  - [Rolling Back Migrations](#rolling-back-migrations)\n  - [Migration Options](#migration-options)\n  - [Waiting For The Database](#waiting-for-the-database)\n  - [Exporting Schema File](#exporting-schema-file)\n- [Library](#library)\n  - [Use dbmate as a library](#use-dbmate-as-a-library)\n  - [Embedding migrations](#embedding-migrations)\n- [Concepts](#concepts)\n  - [Migration files](#migration-files)\n  - [Schema file](#schema-file)\n  - [Schema migrations table](#schema-migrations-table)\n- [Alternatives](#alternatives)\n- [Contributing](#contributing)\n\n## Features\n\n- Supports MySQL, PostgreSQL, SQLite, and ClickHouse\n- Uses plain SQL for writing schema migrations\n- Migrations are timestamp-versioned, to avoid version number conflicts with multiple developers\n- Migrations are run atomically inside a transaction\n- Supports creating and dropping databases (handy in development/test)\n- Supports saving a `schema.sql` file to easily diff schema changes in 
git\n- Database connection URL is defined using an environment variable (`DATABASE_URL` by default), or specified on the command line\n- Built-in support for reading environment variables from your `.env` file\n- Easy to distribute, single self-contained binary\n- Doesn't try to upsell you on a SaaS service\n\n## Installation\n\n**NPM**\n\nInstall using [NPM](https://www.npmjs.com/):\n\n```sh\nnpm install --save-dev dbmate\nnpx dbmate --help\n```\n\n**macOS**\n\nInstall using [Homebrew](https://brew.sh/):\n\n```sh\nbrew install dbmate\ndbmate --help\n```\n\n**Linux**\n\nInstall the binary directly:\n\n```sh\nsudo curl -fsSL -o /usr/local/bin/dbmate https://github.com/amacneil/dbmate/releases/latest/download/dbmate-linux-amd64\nsudo chmod +x /usr/local/bin/dbmate\n/usr/local/bin/dbmate --help\n```\n\n**Windows**\n\nInstall using [Scoop](https://scoop.sh):\n\n```pwsh\nscoop install dbmate\ndbmate --help\n```\n\n**Docker**\n\nDocker images are published to GitHub Container Registry ([`ghcr.io/amacneil/dbmate`](https://ghcr.io/amacneil/dbmate)).\n\nRemember to set `--network=host` (or see [this comment](https://github.com/amacneil/dbmate/issues/128#issuecomment-615924611) for more tips on using dbmate with Docker networking):\n\n```sh\ndocker run --rm -it --network=host ghcr.io/amacneil/dbmate --help\n```\n\nIf you wish to create or apply migrations, you will need to use Docker's [bind mount](https://docs.docker.com/storage/bind-mounts/) feature to make your local working directory (`pwd`) available inside the dbmate container:\n\n```sh\ndocker run --rm -it --network=host -v \"$(pwd)/db:/db\" ghcr.io/amacneil/dbmate new create_users_table\n```\n\n## Commands\n\n```sh\ndbmate --help    # print usage help\ndbmate new       # generate a new migration file\ndbmate up        # create the database (if it does not already exist) and run any pending migrations\ndbmate create    # create the database\ndbmate drop      # drop the database\ndbmate migrate   # run any pending 
migrations\ndbmate rollback  # roll back the most recent migration\ndbmate down      # alias for rollback\ndbmate status    # show the status of all migrations (supports --exit-code and --quiet)\ndbmate dump      # write the database schema.sql file\ndbmate load      # load schema.sql file to the database\ndbmate wait      # wait for the database server to become available\n```\n\n### Command Line Options\n\nThe following options are available with all commands. You must use command line arguments in the order `dbmate [global options] command [command options]`. Most options can also be configured via environment variables (and loaded from your `.env` file, which is helpful to share configuration between team members).\n\n- `--url, -u \"protocol://host:port/dbname\"` - specify the database url directly. _(env: `DATABASE_URL`)_\n- `--env, -e \"DATABASE_URL\"` - specify an environment variable to read the database connection URL from.\n- `--env-file \".env\"` - specify an alternate environment variables file(s) to load.\n- `--migrations-dir, -d \"./db/migrations\"` - where to keep the migration files. _(env: `DBMATE_MIGRATIONS_DIR`)_\n- `--migrations-table \"schema_migrations\"` - database table to record migrations in. _(env: `DBMATE_MIGRATIONS_TABLE`)_\n- `--schema-file, -s \"./db/schema.sql\"` - a path to keep the schema.sql file. _(env: `DBMATE_SCHEMA_FILE`)_\n- `--no-dump-schema` - don't auto-update the schema.sql file on migrate/rollback _(env: `DBMATE_NO_DUMP_SCHEMA`)_\n- `--strict` - fail if migrations would be applied out of order _(env: `DBMATE_STRICT`)_\n- `--wait` - wait for the db to become available before executing the subsequent command _(env: `DBMATE_WAIT`)_\n- `--wait-timeout 60s` - timeout for --wait flag _(env: `DBMATE_WAIT_TIMEOUT`)_\n\n## Usage\n\n### Connecting to the Database\n\nDbmate locates your database using the `DATABASE_URL` environment variable by default. 
If you are writing a [twelve-factor app](http://12factor.net/), you should be storing all connection strings in environment variables.\n\nTo make this easy in development, dbmate looks for a `.env` file in the current directory, and treats any variables listed there as if they were specified in the current environment (existing environment variables take precedence, however).\n\nIf you do not already have a `.env` file, create one and add your database connection URL:\n\n```sh\n$ cat .env\nDATABASE_URL=\"postgres://postgres@127.0.0.1:5432/myapp_development?sslmode=disable\"\n```\n\n`DATABASE_URL` should be specified in the following format:\n\n```\nprotocol://username:password@host:port/database_name?options\n```\n\n- `protocol` must be one of `mysql`, `postgres`, `postgresql`, `sqlite`, `sqlite3`, `clickhouse`\n- `username` and `password` must be URL encoded (you will get an error if you use special characters)\n- `host` can be either a hostname or an IP address\n- `options` are driver-specific (refer to the underlying Go SQL drivers if you wish to use these)\n\nDbmate can also load the connection URL from a different environment variable. For example, before running your test suite, you may wish to drop and recreate the test database. 
One easy way to do this is to store your test database connection URL in the `TEST_DATABASE_URL` environment variable:\n\n```sh\n$ cat .env\nDATABASE_URL=\"postgres://postgres@127.0.0.1:5432/myapp_dev?sslmode=disable\"\nTEST_DATABASE_URL=\"postgres://postgres@127.0.0.1:5432/myapp_test?sslmode=disable\"\n```\n\nYou can then specify this environment variable in your test script (Makefile or similar):\n\n```sh\n$ dbmate -e TEST_DATABASE_URL drop\nDropping: myapp_test\n$ dbmate -e TEST_DATABASE_URL --no-dump-schema up\nCreating: myapp_test\nApplying: 20151127184807_create_users_table.sql\nApplied: 20151127184807_create_users_table.sql in 123µs\n```\n\nAlternatively, you can specify the url directly on the command line:\n\n```sh\n$ dbmate -u \"postgres://postgres@127.0.0.1:5432/myapp_test?sslmode=disable\" up\n```\n\nThe only advantage of using `dbmate -e TEST_DATABASE_URL` over `dbmate -u $TEST_DATABASE_URL` is that the former takes advantage of dbmate's automatic `.env` file loading.\n\n#### PostgreSQL\n\nWhen connecting to Postgres, you may need to add the `sslmode=disable` option to your connection string, as dbmate by default requires a TLS connection (some other frameworks/languages allow unencrypted connections by default).\n\n```sh\nDATABASE_URL=\"postgres://username:password@127.0.0.1:5432/database_name?sslmode=disable\"\n```\n\nA `socket` or `host` parameter can be specified to connect through a unix socket (note: specify the directory only):\n\n```sh\nDATABASE_URL=\"postgres://username:password@/database_name?socket=/var/run/postgresql\"\n```\n\nA `search_path` parameter can be used to specify the [current schema](https://www.postgresql.org/docs/13/ddl-schemas.html#DDL-SCHEMAS-PATH) while applying migrations, as well as for dbmate's `schema_migrations` table.\nIf the schema does not exist, it will be created automatically. 
If multiple comma-separated schemas are passed, the first will be used for the `schema_migrations` table.\n\n```sh\nDATABASE_URL=\"postgres://username:password@127.0.0.1:5432/database_name?search_path=myschema\"\n```\n\n```sh\nDATABASE_URL=\"postgres://username:password@127.0.0.1:5432/database_name?search_path=myschema,public\"\n```\n\n#### MySQL\n\n```sh\nDATABASE_URL=\"mysql://username:password@127.0.0.1:3306/database_name\"\n```\n\nA `socket` parameter can be specified to connect through a unix socket:\n\n```sh\nDATABASE_URL=\"mysql://username:password@/database_name?socket=/var/run/mysqld/mysqld.sock\"\n```\n\n#### SQLite\n\nSQLite databases are stored on the filesystem, so you do not need to specify a host. By default, files are relative to the current directory. For example, the following will create a database at `./db/database.sqlite3`:\n\n```sh\nDATABASE_URL=\"sqlite:db/database.sqlite3\"\n```\n\nTo specify an absolute path, add a forward slash to the path. The following will create a database at `/tmp/database.sqlite3`:\n\n```sh\nDATABASE_URL=\"sqlite:/tmp/database.sqlite3\"\n```\n\nNote that to apply some common performance [settings](https://sqlite.org/pragma.html), such as `journal_mode`, transactions need to be disabled for that migration file, e.g.\n\n```sql\n-- migrate:up transaction:false\nPRAGMA journal_mode = WAL;\n```\n\nOtherwise the migration will fail with \"Error: cannot change into wal mode from within a transaction\".\n\n#### ClickHouse\n\n```sh\nDATABASE_URL=\"clickhouse://username:password@127.0.0.1:9000/database_name\"\n```\n\nTo work with a ClickHouse cluster, there are four connection query parameters that can be supplied:\n\n- `on_cluster` - Indication to use cluster statements and a replicated migration table. 
(default: `false`) If this parameter is not supplied, other cluster-related query parameters are ignored.\n\n```sh\nDATABASE_URL=\"clickhouse://username:password@127.0.0.1:9000/database_name?on_cluster\"\n\nDATABASE_URL=\"clickhouse://username:password@127.0.0.1:9000/database_name?on_cluster=true\"\n```\n\n- `cluster_macro` (Optional) - Macro value to be used for ON CLUSTER statements and for the replicated migration table engine ZooKeeper path. (default: `{cluster}`)\n\n```sh\nDATABASE_URL=\"clickhouse://username:password@127.0.0.1:9000/database_name?on_cluster\u0026cluster_macro={my_cluster}\"\n```\n\n- `replica_macro` (Optional) - Macro value to be used for the replica name in the replicated migration table engine. (default: `{replica}`)\n\n```sh\nDATABASE_URL=\"clickhouse://username:password@127.0.0.1:9000/database_name?on_cluster\u0026replica_macro={my_replica}\"\n```\n\n- `zoo_path` (Optional) - The path to the migration table in ClickHouse/ZooKeeper. (default: `/clickhouse/tables/\u003ccluster_macro\u003e/{table}`)\n\n```sh\nDATABASE_URL=\"clickhouse://username:password@127.0.0.1:9000/database_name?on_cluster\u0026zoo_path=/zk/path/tables\"\n```\n\n[See other supported connection options](https://github.com/ClickHouse/clickhouse-go#dsn).\n\n#### BigQuery\n\nUse the following format for `DATABASE_URL` when connecting to actual BigQuery in GCP:\n\n```\nbigquery://projectid/location/dataset\n```\n\n`projectid` (mandatory) - Project ID\n\n`dataset` (mandatory) - Dataset name within the project\n\n`location` (optional) - Where the dataset is created\n\n_NOTE: Follow [this doc](https://cloud.google.com/docs/authentication/provide-credentials-adc) on how to set the `GOOGLE_APPLICATION_CREDENTIALS` environment variable for proper authentication_\n\nUse the following format when connecting to a custom endpoint, e.g. 
[BigQuery Emulator](https://github.com/goccy/bigquery-emulator)\n\n```\nbigquery://host:port/projectid/location/dataset?disable_auth=true\n```\n\n`disable_auth` (optional) - Pass `true` to skip authentication; use only for testing and connecting to an emulator.\n\n#### Spanner\n\nSpanner support is currently limited to databases using the [PostgreSQL Dialect](https://cloud.google.com/spanner/docs/postgresql-interface), which must be chosen during database creation. For future Spanner with GoogleSQL support, see [this discussion](https://github.com/amacneil/dbmate/discussions/369).\n\nSpanner with the Postgres interface requires that the [PGAdapter](https://cloud.google.com/spanner/docs/pgadapter) is running. Use the following format for `DATABASE_URL`, with the host and port set to where the PGAdapter is running:\n\n```shell\nDATABASE_URL=\"spanner-postgres://127.0.0.1:5432/database_name?sslmode=disable\"\n```\n\nNote that specifying a username and password is not necessary, as authentication is handled by the PGAdapter (they will be ignored by the PGAdapter if specified).\n\nOther options of the [postgres driver](#postgresql) are supported.\n\nSpanner also doesn't allow DDL to be executed inside explicit transactions. You must therefore specify `transaction:false` on migrations that include DDL:\n\n```sql\n-- migrate:up transaction:false\nCREATE TABLE ...\n\n-- migrate:down transaction:false\nDROP TABLE ...\n```\n\nSchema dumps are not currently supported, as `pg_dump` uses functions that are not provided by Spanner.\n\n### Creating Migrations\n\nTo create a new migration, run `dbmate new create_users_table`. You can name the migration anything you like. 
This will create a file `db/migrations/20151127184807_create_users_table.sql` in the current directory:\n\n```sql\n-- migrate:up\n\n-- migrate:down\n```\n\nTo write a migration, simply add your SQL to the `migrate:up` section:\n\n```sql\n-- migrate:up\ncreate table users (\n  id integer,\n  name varchar(255),\n  email varchar(255) not null\n);\n\n-- migrate:down\n```\n\nFor related changes, it is possible to include multiple migrations in a single file using additional `migrate:up` and `migrate:down` sections. A migration file either succeeds or fails as a whole.\n\n```sql\n-- migrate:up\nCREATE TABLE users (id SERIAL PRIMARY KEY);\n\n-- migrate:down\nDROP TABLE users;\n\n-- migrate:up\nALTER TABLE users ADD COLUMN email VARCHAR;\n\n-- migrate:down\nALTER TABLE users DROP COLUMN email;\n```\n\n\u003e Note: Migration files are named in the format `[version]_[description].sql`. Only the version (defined as all leading numeric characters in the file name) is recorded in the database, so you can safely rename a migration file without having any effect on its current application state.\n\n### Running Migrations\n\nRun `dbmate up` to run any pending migrations.\n\n```sh\n$ dbmate up\nCreating: myapp_development\nApplying: 20151127184807_create_users_table.sql\nApplied: 20151127184807_create_users_table.sql in 123µs\nWriting: ./db/schema.sql\n```\n\n\u003e Note: `dbmate up` will create the database if it does not already exist (assuming the current user has permission to create databases). If you want to run migrations without creating the database, run `dbmate migrate`.\n\nPending migrations are always applied in numerical order. However, dbmate does not prevent migrations from being applied out of order if they are committed independently (for example: if a developer has been working on a branch for a long time, and commits a migration which has a lower version number than other already-applied migrations, dbmate will simply apply the pending migration). 
See [#159](https://github.com/amacneil/dbmate/issues/159) for a more detailed explanation.\n\n### Rolling Back Migrations\n\nBy default, dbmate doesn't know how to roll back a migration. In development, it's often useful to be able to revert your database to a previous state. To accomplish this, implement the `migrate:down` section:\n\n```sql\n-- migrate:up\ncreate table users (\n  id integer,\n  name varchar(255),\n  email varchar(255) not null\n);\n\n-- migrate:down\ndrop table users;\n```\n\nRun `dbmate rollback` to roll back the most recent migration:\n\n```sh\n$ dbmate rollback\nRolling back: 20151127184807_create_users_table.sql\nRolled back: 20151127184807_create_users_table.sql in 123µs\nWriting: ./db/schema.sql\n```\n\n### Migration Options\n\ndbmate supports options passed to a migration block in the form of `key:value` pairs. List of supported options:\n\n- `transaction`\n\n**transaction**\n\n`transaction` is useful if you do not want to run SQL inside a transaction:\n\n```sql\n-- migrate:up transaction:false\nALTER TYPE colors ADD VALUE 'orange' AFTER 'red';\n```\n\n`transaction` will default to `true` if your database supports it.\n\n### Waiting For The Database\n\nIf you use a Docker development environment for your project, you may encounter issues with the database not being immediately ready when running migrations or unit tests. This can be due to the database server having only just started.\n\nIn general, your application should be resilient to not having a working database connection on startup. However, for the purpose of running migrations or unit tests, this is not practical. The `wait` command avoids this situation by allowing you to pause a script or other application until the database is available. 
Dbmate will attempt a connection to the database server every second, up to a maximum of 60 seconds.

If the database is available, `wait` will return no output:

```sh
$ dbmate wait
```

If the database is unavailable, `wait` will block until the database becomes available:

```sh
$ dbmate wait
Waiting for database....
```

You can also use the `--wait` flag with other commands if you sometimes see failures caused by the database not yet being ready:

```sh
$ dbmate --wait up
Waiting for database....
Creating: myapp_development
```

You can customize the timeout using `--wait-timeout` (default 60s). If the database is still not available, the command will return an error:

```sh
$ dbmate --wait-timeout=5s wait
Waiting for database.....
Error: unable to connect to database: dial tcp 127.0.0.1:5432: connect: connection refused
```

Please note that the `wait` command does not verify that your specified database exists, only that the server is available and ready (so it will return success if the database server is available but your database has not yet been created).

### Exporting Schema File

When you run the `up`, `migrate`, or `rollback` commands, dbmate will automatically create a `./db/schema.sql` file containing a complete representation of your database schema. Dbmate keeps this file up to date for you, so you should not manually edit it.

It is recommended to check this file into source control, so that you can easily review changes to the schema in commits or pull requests. It's also possible to use this file when you want to quickly load a database schema without running each migration sequentially (for example in your test harness). However, if you do not wish to save this file, you can add it to your `.gitignore` or pass the `--no-dump-schema` command line option.

To dump the `schema.sql` file without performing any other actions, run `dbmate dump`.
Unlike other dbmate actions, this command relies on the respective `pg_dump`, `mysqldump`, or `sqlite3` command being available in your PATH. If these tools are not available, dbmate will silently skip the schema dump step during `up`, `migrate`, or `rollback` actions. You can diagnose the issue by running `dbmate dump` and looking at the output:

```sh
$ dbmate dump
exec: "pg_dump": executable file not found in $PATH
```

On Ubuntu or Debian systems, you can fix this by installing `postgresql-client`, `mysql-client`, or `sqlite3` respectively. Ensure that the package version you install is greater than or equal to the version running on your database server.

> Note: The `schema.sql` file will contain a complete schema for your database, even if some tables or columns were created outside of dbmate migrations.

## Library

### Use dbmate as a library

Dbmate is designed to be used as a CLI with any language or framework, but it can also be used as a library in a Go application.

Here is a simple example.
Remember to import the driver you need!

```go
package main

import (
	"net/url"

	"github.com/amacneil/dbmate/v2/pkg/dbmate"
	_ "github.com/amacneil/dbmate/v2/pkg/driver/sqlite"
)

func main() {
	u, _ := url.Parse("sqlite:foo.sqlite3")
	db := dbmate.New(u)

	err := db.CreateAndMigrate()
	if err != nil {
		panic(err)
	}
}
```

See the [reference documentation](https://pkg.go.dev/github.com/amacneil/dbmate/v2/pkg/dbmate) for more options.

### Embedding migrations

Migrations can be embedded into your application binary using Go's [embed](https://pkg.go.dev/embed) functionality.

Use `db.FS` to specify the filesystem used for reading migrations:

```go
package main

import (
	"embed"
	"fmt"
	"net/url"

	"github.com/amacneil/dbmate/v2/pkg/dbmate"
	_ "github.com/amacneil/dbmate/v2/pkg/driver/sqlite"
)

//go:embed db/migrations/*.sql
var fs embed.FS

func main() {
	u, _ := url.Parse("sqlite:foo.sqlite3")
	db := dbmate.New(u)
	db.FS = fs

	fmt.Println("Migrations:")
	migrations, err := db.FindMigrations()
	if err != nil {
		panic(err)
	}
	for _, m := range migrations {
		fmt.Println(m.Version, m.FilePath)
	}

	fmt.Println("\nApplying...")
	err = db.CreateAndMigrate()
	if err != nil {
		panic(err)
	}
}
```

## Concepts

### Migration files

Migration files are very simple, and are stored in `./db/migrations` by default. You can create a new migration file named `[date]_create_users.sql` by running `dbmate new create_users`.
Here is an example:

```sql
-- migrate:up
create table users (
  id integer,
  name varchar(255)
);

-- migrate:down
drop table if exists users;
```

Both up and down migrations are stored in the same file, for ease of editing.
Both up and down directives are required, even if you choose not to implement the down migration.

When you apply a migration, dbmate stores only the version number, not the contents, so you should always roll back a migration before modifying its contents. For the same reason, you can safely rename a migration file without affecting its applied status, as long as you keep the version number intact.

### Schema file

The schema file is written to `./db/schema.sql` by default. It is a complete dump of your database schema, including any applied migrations and any other modifications you have made.

This file should be checked into source control, so that you can easily review the diff of a migration. You can also use the schema file to quickly restore your database without needing to run all migrations.

### Schema migrations table

Dbmate stores a record of each applied migration in a table named `schema_migrations`. This table will be created for you automatically if it does not already exist.

The table is very simple:

```sql
CREATE TABLE IF NOT EXISTS schema_migrations (
  version VARCHAR(255) PRIMARY KEY
)
```

You can customize the name of this table using the `--migrations-table` flag or the `DBMATE_MIGRATIONS_TABLE` environment variable.

## Alternatives

Why another database schema migration tool? Dbmate was inspired by many other tools, primarily [Active Record Migrations](http://guides.rubyonrails.org/active_record_migrations.html), with the goals of being trivial to configure and language & framework independent.
Here is a comparison between dbmate and other popular migration tools.

|                                                              | [dbmate](https://github.com/amacneil/dbmate) | [goose](https://github.com/pressly/goose) | [sql-migrate](https://github.com/rubenv/sql-migrate) | [golang-migrate](https://github.com/golang-migrate/migrate) | [activerecord](http://guides.rubyonrails.org/active_record_migrations.html) | [sequelize](http://docs.sequelizejs.com/manual/tutorial/migrations.html) | [flyway](https://flywaydb.org/) | [sqitch](https://sqitch.org/) |
| ------------------------------------------------------------ | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| **Features**                                                 | | | | | | | | |
| Plain SQL migration files                                    | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | | | :white_check_mark: | :white_check_mark: |
| Support for creating and dropping databases                  | :white_check_mark: | | | | :white_check_mark: | | | |
| Support for saving schema dump files                         | :white_check_mark: | | | | :white_check_mark: | | | |
| Timestamp-versioned migration files                          | :white_check_mark: | :white_check_mark: | | :white_check_mark: | :white_check_mark: | :white_check_mark: | | |
| Custom schema migrations table                               | :white_check_mark: | | :white_check_mark: | | | :white_check_mark: | :white_check_mark: | |
| Ability to wait for database to become ready                 | :white_check_mark: | | | | | | | |
| Database connection string loaded from environment variables | :white_check_mark: | | | | | | :white_check_mark: | |
| Automatically load .env file                                 | :white_check_mark: | | | | | | | |
| No separate configuration file                               | :white_check_mark: | :white_check_mark: | | :white_check_mark: | :white_check_mark: | :white_check_mark: | | |
| Language/framework independent                               | :white_check_mark: | :white_check_mark: | | :white_check_mark: | | | :white_check_mark: | :white_check_mark: |
| **Drivers**                                                  | | | | | | | | |
| PostgreSQL                                                   | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| MySQL                                                        | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| SQLite                                                       | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| ClickHouse                                                   | :white_check_mark: | | | :white_check_mark: | :white_check_mark: | :white_check_mark: | | |

_If you notice any inaccuracies in this table, please [propose a change](https://github.com/amacneil/dbmate/edit/main/README.md)._

## Contributing

Dbmate is written in Go, and pull requests are welcome.

Tests are run against a real database using docker compose. To build a docker image and run the tests:

```sh
$ make docker-all
```

To start a development shell:

```sh
$ make docker-sh
```