![webfactory Logo](https://www.webfactory.de/bundles/webfactorytwiglayout/img/logo.png) slimdump
========

[![Build Status](https://github.com/webfactory/slimdump/workflows/Run%20Tests/badge.svg)](https://github.com/webfactory/slimdump/actions)
[![Coverage Status](https://coveralls.io/repos/webfactory/slimdump/badge.svg?branch=master&service=github)](https://coveralls.io/github/webfactory/slimdump?branch=master)
![](https://github.com/webfactory/slimdump/workflows/AllDependenciesDeclared/badge.svg)

`slimdump` is a little tool that helps you create configurable dumps of large MySQL databases. It works off one or several configuration files. For every table you specify, it can dump only the schema (`CREATE TABLE ...` statement), full table data, data without blobs, and more.

## Why?

We created `slimdump` because we often need to dump parts of MySQL databases in a convenient and reproducible way.
Also, when you need to analyze problems with data from your production databases, you might want to pull only the relevant parts of the data and hide personal data (user names, for example).

`mysqldump` is a great tool, probably much more proven when it comes to edge cases, and it has a lot of switches. But there is no easy way to create a simple configuration file that describes a particular type of dump (e.g. a subset of your tables) and share it with your co-workers. Let alone dumping tables while omitting BLOB-type columns.

## Installation

When PHP is your everyday programming language, you probably have [Composer](https://getcomposer.org) installed. You can then easily install `slimdump` as a [global package](https://getcomposer.org/doc/03-cli.md#global). Just run `composer global require webfactory/slimdump`. In order to use it like any other Unix command, make sure `$COMPOSER_HOME/vendor/bin` is in your `$PATH`.

Of course, you can also add `slimdump` as a local (per-project) Composer dependency.

We're also working on providing a `.phar` package of `slimdump` for those not using PHP regularly. With that solution, all you need is to have the PHP interpreter installed and to download a single archive file to use `slimdump`. You can help us and open a pull request for that :-)!

## Usage
`slimdump` needs the DSN for the database to dump and one or more config files:

`slimdump {DSN} {config-file} [...more config files...]`

`slimdump` writes to STDOUT.
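For example, a complete call might look like this (the host, credentials, database and file names here are made up):

`slimdump mysql://dumper:secret@localhost:3306/app_db dump-config.xml`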
If you want your dump written to a file, just redirect the output:

`slimdump {DSN} {config-file} > dump.sql`

If you want to use an environment variable for the DSN, replace the first parameter with `-`:

`MYSQL_DSN={DSN} slimdump - {config file(s)}`

The DSN has to be in the following format:

`mysql://[user[:password]@]host[:port]/dbname[?charset=utf8mb4]`

For further explanation, have a look at the [Doctrine documentation](https://www.doctrine-project.org/projects/doctrine-dbal/en/current/reference/configuration.html#connecting-using-a-url).

### Optional parameters and command line switches

#### no-progress

This turns off printing progress information on `stderr`. Useful in scripting contexts.

Example:
`slimdump --no-progress {DSN} {config-file}`

#### buffer-size

You can also specify the buffer size, which can be useful in shared environments where your `max_allowed_packet` is low.
Do this by using the optional command-line option `buffer-size`. Add a suffix (KB, MB or GB) to the value for better readability.

Example:
`slimdump --buffer-size=16MB {DSN} {config-file}`

#### single-line-insert-statements

If you have tables with a large number of rows to dump and you are not planning to keep your dumps under version
control, you might consider writing each `INSERT INTO` statement on a single line instead of one line per row. You can
do this by using the command-line parameter `single-line-insert-statements`. This can speed up the import significantly.

Example:
`slimdump --single-line-insert-statements {DSN} {config-file}`

#### output-csv

This option turns on CSV (comma-separated values) output mode. It must be given the path to a directory where the `.csv` files will be created. The files are named after the tables, e.g. `my_table.csv`.

CSV files contain only data. They are not created for views, triggers, or tables dumped with the `schema` dump mode.
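Example (the exact option syntax is assumed here to follow the other switches, with the target directory passed as the option value):

`slimdump --output-csv={csv-directory} {DSN} {config-file}`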
Also, no files will be created for empty tables.

Since this output format needs to write to different files for different tables, redirecting `stdout` output (as can be done for the default SQL mode) is not possible.

**Experimental Feature** CSV support is a new, [experimental feature](https://github.com/webfactory/slimdump/pull/92). The output formatting may change at any time.

## Configuration
Configuration is stored in XML format somewhere in your filesystem. As a benefit, you can add the configuration file to your repository to share a quick start to your database dumps with your coworkers.

Example:
```xml
<?xml version="1.0" ?>
<slimdump>
  <!-- Create a full dump (schema + data) of "some_table" -->
  <table name="some_table" dump="full" />

  <!-- Dump the "media" table, omit BLOB fields. -->
  <table name="media" dump="noblob" />

  <!-- Dump the "user" table, hide names and email addresses. -->
  <table name="user" dump="full">
      <column name="username" dump="masked" />
      <column name="email" dump="masked" />
      <column name="password" dump="replace" replacement="test" />
  </table>

  <!-- Dump the "document" table but do not pass the "AUTO_INCREMENT" parameter to the SQL query.
       Instead, start to increment from the beginning. -->
  <table name="document" dump="full" keep-auto-increment="false" />

  <!--
    Trigger handling:

    By default, CREATE TRIGGER statements will be dumped for all tables, but the "DEFINER=..."
    clause will be removed to make it easier to re-import the database, e.g.
    in development environments.

    You can change this by setting 'dump-triggers' to one of:
        - 'false' or 'none': Do not dump triggers at all
        - 'true' or 'no-definer': Dump the trigger creation statement but remove the DEFINER=... clause
        - 'keep-definer': Keep the DEFINER=... clause
  -->
  <table name="events" dump="schema" dump-triggers="false" />

  <!--
    View handling:

    A configured <table> may also be a database view. A CREATE VIEW statement will be issued
    in that case, but the "DEFINER=..." clause will be removed to make it easier to re-import
    the database, e.g. in development environments.

    You can change this by setting 'view-definers' to one of:
        - 'no-definer': Dump the view creation statement but remove the DEFINER=... clause
        - 'keep-definer': Keep the DEFINER=... clause
    'no-definer' is the default if the 'view-definers' attribute is omitted.
  -->
  <table name="aggregated_data_view" dump="schema" view-definers="no-definer" />
</slimdump>
```

### Conditions

You may want to select only some rows. In that case you can define a condition on a table.

```xml
<?xml version="1.0" ?>
<slimdump>
  <!-- Dump all users whose usernames begin with foo -->
  <table name="user" dump="full" condition="`username` LIKE 'foo%'" />
</slimdump>
```

In this example, only users with a username starting with 'foo' are exported.

A simple way to export roughly a percentage of the users is this:

```xml
<?xml version="1.0" ?>
<slimdump>
  <!-- Dump every tenth user -->
  <table name="user" dump="full" condition="id % 10 = 0" />
</slimdump>
```

This will export only the users with an id divisible by ten without a remainder, i.e.
about 1/10th of the user rows (given the ids are evenly distributed).

If you want to keep referential integrity, you might have to configure a more complex condition like this:

```xml
<?xml version="1.0" ?>
<slimdump>
  <!-- Dump all users who have authored blog posts or comments -->
  <table name="user" dump="full" condition="id IN (SELECT author_id FROM blog_posts UNION SELECT author_id FROM comments)" />
</slimdump>
```

In this case, we export only users that are referenced in other tables, i.e. that are authors of blog posts or comments.

### Dump modes

The following modes are supported for the `dump` attribute:

* `none` - The table is not dumped at all. Makes sense if you use broad wildcards (see below) and then want to exclude a specific table.
* `schema` - Only the table schema will be dumped.
* `noblob` - Will dump a `NULL` value for BLOB fields.
* `full` - The whole table will be dumped.
* `masked` - Replaces all characters with "x". Mostly makes sense when applied at the column level, for example for email addresses or user names.
* `replace` - When applied to a `<column>` element, replaces the values in this column with either a static value or a nice dummy value generated by [Faker](https://github.com/fzaninotto/Faker/). Useful e.g. to replace passwords with a static one, or to replace personal data like first and last names with realistic-sounding dummy data.

### Wildcards
Of course, you can use wildcards in table names (* for multiple characters, ?
for a single character).

Example:
```xml
<?xml version="1.0" ?>
<slimdump>
  <!-- Default: dump all tables -->
  <table name="*" dump="full" />

  <!-- Dump all tables beginning with "a_" as schema only -->
  <table name="a_*" dump="schema" />

  <!-- Dump "big_blob_table" without blobs -->
  <table name="big_blob_table" dump="noblob" />

  <!-- Do not dump any tables ending with "_test" -->
  <table name="*_test" dump="none" />
</slimdump>
```
This is a valid configuration. If more than one instruction matches a specific table name, the most specific one will be used. E.g. if you have definitions for `blog_*` and `blog_author`, the latter will be used for your author table, regardless of their order in the config file.

### Replacements

You probably don't want to use any personal data from your database. Therefore, `slimdump` allows you to replace data at the
column level - a great instrument not only for General Data Protection Regulation (GDPR) compliance.

The simplest replacement is a static one:

```xml
<?xml version="1.0" ?>
<slimdump>
    <table name="users" dump="full">
        <column name="password" dump="replace" replacement="test" />
    </table>
</slimdump>
```

This replaces the password values of all users with "test" (in clear text - but surely you have [some sort of hashing in place](https://secure.php.net/manual/en/faq.passwords.php), don't you?).

To achieve realistic-sounding dummy data, `slimdump` also supports [basic Faker formatters](https://github.com/fzaninotto/Faker/#formatters).
You can use every Faker formatter that needs no arguments, as well as modifiers such as `unique` (just separate the modifier
with an object operator (`->`), as you would do in PHP).
This is especially useful if your table has a unique constraint
on a column containing personal information, like the email address.

```xml
<?xml version="1.0" ?>
<slimdump>
    <table name="users" dump="full">
        <column name="username" dump="replace" replacement="FAKER_word" />
        <column name="password" dump="replace" replacement="test" />
        <column name="firstname" dump="replace" replacement="FAKER_firstName" />
        <column name="lastname" dump="replace" replacement="FAKER_lastName" />
        <column name="email" dump="replace" replacement="FAKER_unique->safeEmail" />
    </table>
</slimdump>
```

## Other databases
Currently, only MySQL is supported. Feel free to port it to the database of your choice.

## Development

### Building the Phar

* Make sure [Phive](https://phar.io/) is installed
* Run `phive install` to install tools, including [Box](https://github.com/humbug/box)
* Run `composer install --no-dev` to make sure the `vendor/` folder is up to date
* Run `tools/box compile` to build `slimdump.phar`

### Tests

You can execute the PHPUnit tests by calling `vendor/bin/phpunit`.

## Credits, Copyright and License

This tool was written by webfactory GmbH, Bonn, Germany. We're a software development agency with a focus on PHP (mostly [Symfony](http://github.com/symfony/symfony)). We're big fans of automation, DevOps, CI and CD, and of open source in general.

If you're a developer looking for new challenges, we'd like to hear from you! Otherwise, if this tool is useful for you, add a ⭐️.

- <https://www.webfactory.de>
- <https://twitter.com/webfactory>

Copyright 2014-2022 webfactory GmbH, Bonn.
Code released under [the MIT license](LICENSE).