![sqlboiler
logo](https://i.imgur.com/lMXUTPE.png)

[![License](https://img.shields.io/badge/license-BSD-blue.svg)](https://github.com/aarondl/sqlboiler/blob/master/LICENSE)
[![GoDoc](https://img.shields.io/badge/godoc-reference-5272B4)](https://pkg.go.dev/mod/github.com/aarondl/sqlboiler/v4)
[![Slack](https://img.shields.io/badge/slack-%23general-lightgrey.svg)](https://sqlboiler.from-the.cloud)
![ActionsCI](https://github.com/aarondl/sqlboiler/workflows/test/badge.svg)
[![Go Report Card](https://goreportcard.com/badge/aarondl/sqlboiler)](http://goreportcard.com/report/aarondl/sqlboiler)

# Maintenance Mode

This package is currently in maintenance mode, which means:

1. It generally does not accept new features.
2. It does accept bug fixes and version compatibility changes provided by the community.
3. Maintainers usually do not resolve reported issues.
4. Community members are encouraged to help each other with reported issues.

## Alternatives

If you are looking for an actively maintained alternative, consider the following:

### 1. Bob - <https://github.com/stephenafamo/bob>

Bob is very similar to SQLBoiler. It was directly inspired by SQLBoiler and was created by a maintainer of SQLBoiler.

A comparison can be found here: <https://bob.stephenafamo.com/vs/sqlboiler/>.

### 2. sqlc - <https://github.com/sqlc-dev/sqlc>

`sqlc` is a command line tool that generates type-safe code from SQL.
It is not an ORM, but for many use cases it can be a good alternative to SQLBoiler.

# About SQLBoiler

SQLBoiler is a tool to generate a Go ORM tailored to your database schema.

It is a "database-first" ORM as opposed to "code-first" (like gorm/gorp).
That means you must first create your database schema.
Please use something
like [sql-migrate](https://github.com/rubenv/sql-migrate)
or some other migration tool to manage this part of the database's life-cycle.

# Note on versions

v1, v2, and v3 are no longer maintained.

v3 is the last GOPATH-compatible version.

v4 has no real breaking changes relative to v3 other than the move to Go modules,
and is the only maintained version. Note that it does not work with GOPATH
projects.

## Why another ORM

While attempting to migrate a legacy Rails database, we realized how much ActiveRecord benefited us in terms of development velocity.
Coming over to the Go `database/sql` package after using ActiveRecord feels extremely repetitive, super long-winded and downright boring.
Being Go veterans we knew the state of ORMs was shaky, and after a quick review we found our fears confirmed. Most packages out
there are code-first, reflect-based and have a very weak story around relationships between models. So with that we set out with these goals:

- Work with existing databases: Don't be the tool to define the schema; that's better left to other tools.
- ActiveRecord-like productivity: Eliminate all SQL boilerplate, have relationships as a first-class concept.
- Go-like feel: Work with normal structs, call functions, no hyper-magical struct tags, small interfaces.
- Go-like performance: [Benchmark](#benchmarks) and optimize the hot paths, perform like hand-rolled `sql.DB` code.

We believe that with SQLBoiler and our database-first code-generation approach we've been able to successfully meet all of these goals. On top
of that, SQLBoiler also confers the following benefits:

- The models package is type safe. This means no chance of random panics due to passing in the wrong type. No need for `interface{}`.
- Our types closely correlate to your database column types. This is expanded by our extended null package, which supports nearly all Go data types.
- A system that is easy to debug.
  Your ORM is tailored to your schema, so the code paths should be easy to trace since it's not all buried in reflect.
- Auto-completion provides workflow efficiency gains.

# Table of Contents

- [SQLBoiler](#sqlboiler)
  - [Why another ORM](#why-another-orm)
  - [About SQL Boiler](#about-sql-boiler)
    - [Features](#features)
    - [Missing Features](#missing-features)
    - [Supported Databases](#supported-databases)
    - [A Small Taste](#a-small-taste)
  - [Requirements & Pro Tips](#requirements--pro-tips)
    - [Requirements](#requirements)
    - [Pro Tips](#pro-tips)
  - [Getting started](#getting-started)
    - [Videos](#videos)
    - [Download](#download)
    - [Configuration](#configuration)
    - [Initial Generation](#initial-generation)
    - [Regeneration](#regeneration)
    - [Controlling Version](#controlling-version)
    - [Controlling Generation](#controlling-generation)
      - [Aliases](#aliases)
      - [Types](#types)
      - [Imports](#imports)
      - [Templates](#templates)
    - [Extending Generated Models](#extending-generated-models)
  - [Diagnosing Problems](#diagnosing-problems)
  - [Features & Examples](#features--examples)
    - [Automatic CreatedAt/UpdatedAt](#automatic-createdatupdatedat)
      - [Skipping Automatic Timestamps](#skipping-automatic-timestamps)
      - [Overriding Automatic Timestamps](#overriding-automatic-timestamps)
    - [Query Building](#query-building)
    - [Query Mod System](#query-mod-system)
    - [Function Variations](#function-variations)
    - [Finishers](#finishers)
    - [Raw Query](#raw-query)
    - [Binding](#binding)
    - [Relationships](#relationships)
    - [Hooks](#hooks)
      - [Skipping Hooks](#skipping-hooks)
    - [Transactions](#transactions)
    - [Debug Logging](#debug-logging)
    - [Select](#select)
    - [Find](#find)
    - [Insert](#insert)
    - [Update](#update)
    - [Delete](#delete)
    - [Upsert](#upsert)
    - [Reload](#reload)
    - [Exists](#exists)
    - [Enums](#enums)
    - [Constants](#constants)
  - [FAQ](#faq)
    - [Won't compiling models for a huge database be very slow?](#wont-compiling-models-for-a-huge-database-be-very-slow)
    - [Missing imports for generated package](#missing-imports-for-generated-package)
    - [How should I handle multiple schemas](#how-should-i-handle-multiple-schemas)
    - [How do I use the types.BytesArray for Postgres bytea arrays?](#how-do-i-use-typesbytesarray-for-postgres-bytea-arrays)
    - [Why aren't my time.Time or null.Time fields working in MySQL?](#why-arent-my-timetime-or-nulltime-fields-working-in-mysql)
    - [Where is the homepage?](#where-is-the-homepage)
    - [Why are the auto-generated tests failing?](#why-are-the-auto-generated-tests-failing)
- [Benchmarks](#benchmarks)
- [Third-Party Extensions](#third-party-extensions)

## About SQL Boiler

### Features

- Full model generation
- Extremely fast code generation
- High performance through generation & intelligent caching
- Uses boil.Executor (simple interface, sql.DB, sqlx.DB etc.
  compatible)
- Uses context.Context
- Easy workflow (models can always be regenerated, full auto-complete)
- Strongly typed querying (usually no converting or binding to pointers)
- Hooks (Before/After Create/Select/Update/Delete/Upsert)
- Automatic CreatedAt/UpdatedAt
- Automatic DeletedAt
- Table and column whitelist/blacklist
- Relationships/Associations
- Eager loading (recursive)
- Custom struct tags
- Transactions
- Raw SQL fallback
- Compatibility tests (run against your own DB schema)
- Debug logging
- Basic multiple schema support (no cross-schema support)
- 1d arrays, json, hstore & more
- Enum types
- Out-of-band driver support
- Support for database views
- Supports generated/computed columns

### Missing features

- Multi-column foreign key support
- Materialized view support
  - Only PostgreSQL is supported

### Supported Databases

| Database          | Driver Location                                                                                |
| ----------------- | ---------------------------------------------------------------------------------------------- |
| PostgreSQL        | [https://github.com/aarondl/sqlboiler/v4/drivers/sqlboiler-psql](drivers/sqlboiler-psql)       |
| MySQL             | [https://github.com/aarondl/sqlboiler/v4/drivers/sqlboiler-mysql](drivers/sqlboiler-mysql)     |
| MSSQLServer 2012+ | [https://github.com/aarondl/sqlboiler/v4/drivers/sqlboiler-mssql](drivers/sqlboiler-mssql)     |
| SQLite3           | [https://github.com/aarondl/sqlboiler/v4/drivers/sqlboiler-sqlite3](drivers/sqlboiler-sqlite3) |
| CockroachDB       | https://github.com/glerchundi/sqlboiler-crdb                                                   |

**Note:** SQLBoiler supports out-of-band drivers, so you can make your own.

We are seeking contributors for other database engines.

### A Small Taste

For a comprehensive list of available operations and examples please see [Features
& Examples](#features--examples).

```go
import (
  // Import this so we don't have to use qm.Limit etc.
  . "github.com/aarondl/sqlboiler/v4/queries/qm"
)

// Open handle to database like normal
db, err := sql.Open("postgres", "dbname=fun user=abc")
if err != nil {
  return err
}

// If you don't want to pass in db to all generated methods
// you can use boil.SetDB to set it globally, and then use
// the G variant methods like so (--add-global-variants to enable)
boil.SetDB(db)
users, err := models.Users().AllG(ctx)

// Query all users
users, err := models.Users().All(ctx, db)

// Panic-able if you like to code that way (--add-panic-variants to enable)
users := models.Users().AllP(ctx, db)

// More complex query
users, err := models.Users(Where("age > ?", 30), Limit(5), Offset(6)).All(ctx, db)

// Ultra complex query
users, err := models.Users(
  Select("id", "name"),
  InnerJoin("credit_cards c on c.user_id = users.id"),
  Where("age > ?", 30),
  AndIn("c.kind in ?", "visa", "mastercard"),
  Or("email like ?", `%aol.com%`),
  GroupBy("id", "name"),
  Having("count(c.id) > ?", 2),
  Limit(5),
  Offset(6),
).All(ctx, db)

// Use any "boil.Executor" implementation (*sql.DB, *sql.Tx, data-dog mock db)
// for any query.
tx, err := db.BeginTx(ctx, nil)
if err != nil {
  return err
}
users, err := models.Users().All(ctx, tx)

// Relationships
user, err := models.Users().One(ctx, db)
if err != nil {
  return err
}
movies, err := user.FavoriteMovies().All(ctx, db)

// Eager loading
users, err := models.Users(Load("FavoriteMovies")).All(ctx, db)
if err != nil {
  return err
}
fmt.Println(len(users[0].R.FavoriteMovies))
```

## Requirements & Pro Tips

### Requirements

- Go 1.13; older Go versions are not supported.
- Join tables should use a _composite primary key_.
  - For join tables to be used transparently for relationships your join table must
    have
    a _composite primary key_ that encompasses both foreign table foreign keys and
    no other columns in the table. For example, on a join table named
    `user_videos` you should have: `primary key(user_id, video_id)`, with both
    `user_id` and `video_id` being foreign key columns to the users and videos
    tables respectively, and no other columns on the table.
- MySQL 5.6.30 minimum; the ssl-mode option is not supported for earlier versions.
- For MySQL, if using the `github.com/go-sql-driver/mysql` driver, please activate
  [time.Time parsing](https://github.com/go-sql-driver/mysql#timetime-support) when making your
  MySQL database connection. SQLBoiler uses `time.Time` and `null.Time` to represent time in
  its models, and without this enabled any models with `DATE`/`DATETIME` columns will not work.

### Pro Tips

- SQLBoiler generates type-safe identifiers for table names, table column names,
  a table's relationship names and type-safe where clauses. You should use these
  instead of strings due to the ability to catch more errors at compile time
  when your database schema changes. See [Constants](#constants) for details.
- It's highly recommended to use transactions where sqlboiler will be doing
  multiple database calls (relationship setops with insertions for example) for
  both performance and data integrity.
- Foreign key column names should end with `_id`.
  - Foreign key column names in the format `x_id` will generate clearer method names.
    It is advisable to use this naming convention whenever it makes sense for your database schema.
- If you never plan on using the hooks functionality you can disable generation of this
  feature using the `--no-hooks` flag.
  This will save you some binary size.

## Getting started

#### Videos

If you like learning via a video medium, sqlboiler has a number of screencasts
available.

_NOTE:_ These videos predate modules (v4); the installation/import paths will be
different, though everything else should remain similar.

[SQLBoiler: Getting Started](https://www.youtube.com/watch?v=y5utRS9axfg)

[SQLBoiler: What's New in v3](https://www.youtube.com/watch?v=-B-OPsYRZJA)

[SQLBoiler: Advanced Queries and Relationships](https://www.youtube.com/watch?v=iiJuM9NR8No)

[Old (v2): SQLBoiler Screencast #1: How to get started](https://www.youtube.com/watch?v=fKmRemtmi0Y)

#### Download

First you have to install the code generator binaries. There's the main binary
and then a separate driver binary (select the right one for your database).

Be very careful when installing; there's confusion in the Go ecosystem, and
knowing the right commands to run for your Go version can be tricky.
Ensure you don't forget any /v suffixes or you'll end up on an old version.

```shell
# Go 1.16 and above:
go install github.com/aarondl/sqlboiler/v4@latest
go install github.com/aarondl/sqlboiler/v4/drivers/sqlboiler-psql@latest

# Go 1.15 and below:
# Install sqlboiler v4 and the postgresql driver (mysql, mssql, sqlite3 also available)
# NOTE: DO NOT run this inside another Go module (like your project) as it will
# pollute your go.mod with a bunch of stuff you don't want and your binary
# will not get installed.
GO111MODULE=on go get -u -t github.com/aarondl/sqlboiler/v4
GO111MODULE=on go get github.com/aarondl/sqlboiler/v4/drivers/sqlboiler-psql
```

To install `sqlboiler` as a dependency in your project use the commands below
inside of your go module's directory tree.
This will install the dependencies
into your `go.mod` file at the correct version.

```shell
# Do not forget the trailing /v4 and /v8 in the following commands
go get github.com/aarondl/sqlboiler/v4
# Assuming you're going to use the null package for its additional null types
go get github.com/aarondl/null/v8
```

#### Configuration

Create a configuration file. Because the project uses
[viper](https://github.com/spf13/viper), TOML, JSON and YAML are all usable,
but only TOML is officially supported. Environment variables can also be used.

The configuration file should be named `sqlboiler.toml` and is searched for in
the following directories in this order:

- `./`
- `$XDG_CONFIG_HOME/sqlboiler/`
- `$HOME/.config/sqlboiler/`

We will assume TOML for the rest of the documentation.

##### Database Driver Configuration

The configuration for a specific driver (in these examples we'll use `psql`)
must all be prefixed by the driver name. You must use a configuration file or
environment variables for configuring the database driver; there are no
command-line options for providing driver-specific configuration.

In the configuration file for postgresql, for example, you would do:

```toml
[psql]
dbname = "your_database_name"
```

When you use an environment variable it must also be prefixed by the driver
name:

```sh
PSQL_DBNAME="your_database_name"
```

The values that exist for the drivers:

| Name        | Required | Postgres Default | MySQL Default | MSSQL Default |
| ----------- | -------- | ---------------- | ------------- | ------------- |
| schema      | no       | "public"         | none          | "dbo"         |
| dbname      | yes      | none             | none          | none          |
| host        | yes      | none             | none          | none          |
| port        | no       | 5432             | 3306          | 1433          |
| user        | yes      | none             | none          | none          |
| pass        | no       | none             | none          | none          |
| sslmode     | no       | "require"        | "true"        | "true"        |
| unix-socket | no       | N/A              | ""            | N/A           |
| whitelist   | no       | []               | []            | []            |
| blacklist   | no       | []               | []            | []            |

Example of whitelist/blacklist:

```toml
[psql]
# Removes the migrations table, the name column from the addresses table, and
# secret_col of any table from being generated. Foreign keys that reference tables
# or columns that are no longer generated because of whitelists or blacklists may
# cause problems.
blacklist = ["migrations", "addresses.name", "*.secret_col"]
```

##### Generic config options

You can also pass in these top level configuration values if you would prefer
not to pass them through the command line or environment variables:

| Name                      | Defaults |
| ------------------------- | -------- |
| pkgname                   | "models" |
| output                    | "models" |
| tag                       | []       |
| debug                     | false    |
| add-global-variants       | false    |
| add-panic-variants        | false    |
| add-enum-types            | false    |
| enum-null-prefix          | "Null"   |
| no-context                | false    |
| no-hooks                  | false    |
| no-tests                  | false    |
| no-auto-timestamps        | false    |
| no-rows-affected          | false    |
| no-driver-templates       | false    |
| no-relation-getters       | false    |
| tag-ignore                | []       |
| strict-verify-mod-version | false    |

##### Full Example

```toml
output   = "my_models"
wipe     = true
no-tests = true
add-enum-types = true

[psql]
  dbname = "dbname"
  host   = "localhost"
  port   = 5432
  user   = "dbusername"
  pass   = "dbpassword"
  schema = "myschema"
  blacklist = ["migrations", "other"]

[mysql]
  dbname  = "dbname"
  host    = "localhost"
  port    = 3306
  user    = "dbusername"
  pass    = "dbpassword"
  sslmode = "false"
  tinyint_as_int = true

[mssql]
  dbname  = "dbname"
  host    = "localhost"
  port    = 1433
  user    = "dbusername"
  pass    = "dbpassword"
  sslmode = "disable"
  schema  = "notdbo"
```

#### Initial Generation

After creating a configuration file that points at the database we want to
generate models for, we can invoke the sqlboiler command line utility.

```text
SQL Boiler generates a Go ORM from template files, tailored to your database schema.
Complete documentation is available at http://github.com/aarondl/sqlboiler

Usage:
  sqlboiler [flags] <driver>

Examples:
sqlboiler psql

Flags:
      --add-global-variants        Enable generation for global variants
      --add-panic-variants         Enable generation for panic variants
      --add-soft-deletes           Enable soft deletion by updating deleted_at timestamp
      --add-enum-types             Enable generation of types for enums
      --enum-null-prefix           Name prefix of nullable enum types (default "Null")
  -c, --config string              Filename of config file to override default lookup
  -d, --debug                      Debug mode prints stack traces on error
  -h, --help                       help for sqlboiler
      --no-auto-timestamps         Disable automatic timestamps for created_at/updated_at
      --no-back-referencing        Disable back referencing in the loaded relationship structs
      --no-context                 Disable context.Context usage in the generated code
      --no-driver-templates        Disable parsing of templates defined by the database driver
      --no-hooks                   Disable hooks feature for your models
      --no-rows-affected           Disable rows affected in the generated API
      --no-tests                   Disable generated go test files
      --no-relation-getters        Disable generating getters for relationship tables
  -o, --output string              The name of the folder to output to (default "models")
  -p, --pkgname string             The name you wish to assign to your generated package (default "models")
      --struct-tag-casing string   Decides the casing for go structure tag names. camel, title, alias or snake (default "snake")
  -t, --tag strings                Struct tags to be included on your models in addition to json, yaml, toml
      --tag-ignore strings         List of column names that should have tag values set to '-' (ignored during parsing)
      --templates strings          A templates directory, overrides the embedded template folders in sqlboiler
      --replace strings            An array of template files and the template files that replace them
      --version                    Print the version
      --strict-verify-mod-version  Prevent code generation if the project's sqlboiler version does not match the executable's
      --wipe                       Delete the output folder (rm -rf) before generation to ensure sanity
```

Follow the steps below to do some basic model generation. Once you've generated
your models, you can run the compatibility tests, which will exercise the entirety
of the generated code. This way you can ensure that your database is compatible
with SQLBoiler. If you find there are some failing tests, please check the
[Diagnosing Problems](#diagnosing-problems) section.

```sh
# Generate our models and exclude the migrations table
# When passing 'psql' here, it looks for a binary called
# 'sqlboiler-psql' in your CWD and PATH. You can also pass
# an absolute path to a driver if you desire.
sqlboiler psql

# Run the generated tests
go test ./models
```

_Note: There is no `mysqldump` or `pg_dump` equivalent for Microsoft SQL Server, so generated tests must be supplemented by a `tables_schema.sql` file with `CREATE TABLE ...` queries._

You can use `go generate` for SQLBoiler if you want to make it easy to
run the command for your application:

```go
//go:generate sqlboiler --flags-go-here psql
```

It's important not to modify anything in the output folder, which brings us to
the next topic: regeneration.

#### Regeneration

When regenerating the models it's recommended that you completely delete the
generated directory in a build script or use the `--wipe` flag in SQLBoiler.
The reason for this is that sqlboiler doesn't try to diff your files in any
smart way; it simply writes the files it's going to write whether they're there
or not, and doesn't delete any files that were added by you or by previous runs
of SQLBoiler. In the best case this can cause compilation errors, in the worst
case this may leave extraneous and unusable code that was generated against
tables that are no longer in the database.

The bottom line is that this tool should always produce the same result from
the same source.
And the intention is to always regenerate from a pure state.
The only reason the `--wipe` flag isn't defaulted to on is because we don't
like programs that `rm -rf` things on the filesystem without being asked to.

#### Controlling Version

When sqlboiler is used on a regular basis, problems sometimes arise because the
version a developer is using does not match the version specified in the
project.

Sqlboiler will warn if the versions of the project and the executable do not
match. Sqlboiler can also fail, preventing code generation, when the
`--strict-verify-mod-version` flag (or the corresponding option in toml) is
enabled.

#### Controlling Generation

The templates get executed in a specific way each time. There's a variety of
configuration options on the command line/config file that can control what
features are turned on or off.

In addition to the command line flags there are a few features that are only
available via the config file and can use some explanation.

##### Aliases

In sqlboiler, names are automatically generated for you. If you name your
database entities properly you will likely have descriptive names generated in
the end. However, in the case where the names in your database are bad AND
unchangeable, or sqlboiler's inference doesn't understand the names you do have
(even though they are good and correct), you can use aliases to change the names
of your tables, columns and relationships in the generated Go code.

_Note: It is not required to provide all parts of all names.
Anything left out
will be inferred as it was in the past._

```toml
# Although team_names works fine without configuration, we use it here for illustrative purposes
[aliases.tables.team_names]
up_plural     = "TeamNames"
up_singular   = "TeamName"
down_plural   = "teamNames"
down_singular = "teamName"

  # Columns can also be aliased.
  [aliases.tables.team_names.columns]
  team_name = "OurTeamName"
```

When creating aliases for relationships, it's important to know how sqlboiler
names relationships. For a given table the foreign key name is used as a unique
identifier to refer to a given relationship. If you are going to be aliasing
relationships it's **highly recommended** that you name your foreign keys
explicitly in your database, or the auto-generated names could one day
change/break your aliases.

Each relationship has a **local** and a **foreign** function name. The function name will
be inserted into your generated code as a function to retrieve relationship data as
well as to refer to the relationship in a few other places. **local** means "the function name
that refers to the table with the foreign key on it" and conversely **foreign**
means "the function name that refers to the table the foreign key points to".

For example - let's have a `videos -> users` many-to-one relationship that looks
like this:

```text
The tables and their columns:

| videos  | users |
|---------|-------|
| user_id | id    |

Our foreign key:
videos_user_id_fkey: videos.user_id -> users.id
```

In this example `local` (how we refer to the table with the foreign key) is
going to be inferred as `Videos`.
We're going to override that below to be
`AuthoredVideos`.

Conversely `foreign` (how we refer to the table the foreign key points to) is
going to be inferred as `User`, which we'd like to rename to `Author` to suit
our domain language a bit better.

With the configuration snippet below we can use the following relationship
helper functions off of the respective models: `video.Author` and
`user.AuthoredVideos`, which make a bit more sense than the inferred names when
we see them in the code for our domain. Note the use of the foreign key name to
refer to the relationship in the configuration key.

```toml
[aliases.tables.videos.relationships.videos_user_id_fkey]
# The local side would originally be inferred as Videos. The inferred names
# are often good enough; avoid this feature where possible.
local   = "AuthoredVideos"
# The foreign side would be inferred as User; we rename it to Author to suit
# our domain language.
foreign = "Author"
```

In a many-to-many relationship it's a bit more complicated. Let's look at an
example relationship between `videos <-> tags` with a join table in the middle.
Imagine if the join table didn't exist, and instead both of the id columns in
the join table were slapped on to the tables themselves. You'd have
`videos.tag_id` and `tags.video_id`. Using a similar method to the above (local
is the name with which we refer to the side that has the foreign key)
we can rename the relationships.
To change `Videos.Tags` to `Videos.Rags`
we can use the example below.

Keep in mind that naming ONE side of the many-to-many relationship is sufficient,
as the other side will be automatically mirrored, though you can specify both if
you so choose.

```toml
[aliases.tables.video_tags.relationships.fk_video_id]
local   = "Rags"
foreign = "Videos"
```

The above definition will specify `Rags` as the name of the property with which
a given `Video` entity will be able to access all of its tags. If we look the
other way around - a single `Tag` entity will refer to all videos that have that
specific tag with the `Videos` property.

There is an alternative syntax available for those who are challenged by the key
syntax of toml or by viper lowercasing all of your keys. Instead of
using a regular table in toml, use an array of tables, and add a name field to
each object. The only shape that changes past that is columns, which now have to
have a new field called `alias`.

```toml
[[aliases.tables]]
name          = "team_names"
up_plural     = "TeamNames"
up_singular   = "TeamName"
down_plural   = "teamNames"
down_singular = "teamName"

  [[aliases.tables.columns]]
  name  = "team_name"
  alias = "OurTeamName"

  [[aliases.tables.relationships]]
  name    = "fk_video_id"
  local   = "Rags"
  foreign = "Videos"
```

##### Custom Struct Tag Case

Sometimes you might want to customize the case style for different purposes, for
example using camel case for JSON tags and snake case for YAML tags. You may
create a section named `[struct-tag-cases]` to define a custom case for each
format:

```toml
[struct-tag-cases]
toml = "snake"
yaml = "camel"
json = "camel"
boil = "alias"
```

By default snake case is used, so you can set up only the formats you want to change:

```toml
[struct-tag-cases]
json = "camel"
```

##### Foreign Keys

You can add foreign keys not defined
in the database to your models using the following configuration:\n\n```toml\n[foreign_keys.jet_pilots_fkey]\ntable = \"jets\"\ncolumn = \"pilot_id\"\nforeign_table = \"pilots\"\nforeign_column = \"id\"\n\n[foreign_keys.pilot_language_pilots_fkey]\ntable = \"pilot_languages\"\ncolumn = \"pilot_id\"\nforeign_table = \"pilots\"\nforeign_column = \"id\"\n\n[foreign_keys.pilot_language_languages_fkey]\ntable = \"pilot_languages\"\ncolumn = \"language_id\"\nforeign_table = \"languages\"\nforeign_column = \"id\"\n```\n\n##### Inflections\n\nWith inflections, you can control the rules sqlboiler uses to generate singular/plural variants. This is useful if a certain word or suffix is used multiple times and you do not want to create aliases for every instance.\n\n```toml\n[inflections.plural]\n# Rules to convert a suffix to its plural form\nium = \"ia\"\n\n[inflections.plural_exact]\n# Rules to convert an exact word to its plural form\nstadium = \"stadia\"\n\n[inflections.singular]\n# Rules to convert a suffix to its singular form\nia = \"ium\"\n\n[inflections.singular_exact]\n# Rules to convert an exact word to its singular form\nstadia = \"stadium\"\n\n[inflections.irregular]\n# The singular -\u003e plural mapping of an exact word that doesn't follow conventional rules\nradius = \"radii\"\n```\n\n##### Types\n\nThere exists the ability to override types that the driver has inferred.\nThe way to accomplish this is through the config file.\n\n```toml\n[[types]]\n  # The match is a drivers.Column struct, and matches on almost all fields.\n  # Notable exception for the unique bool. 
Matches are done\n  # with \"logical and\" meaning it must match all specified matchers.\n  # Boolean values are only checked if all the string specifiers match first,\n  # and they must always match.\n  #\n  # Not shown here: db_type is the database type and a very useful matcher\n  # We can also whitelist tables for this replace by adding to the types.match:\n  # tables = ['users', 'videos']\n  #\n  # Note there is precedence for types.match, more specific things should appear\n  # further down in the config as once a matching rule is found it is executed\n  # immediately.\n  [types.match]\n    type = \"null.String\"\n    nullable = true\n\n  # The replace is what we replace the strings with. You cannot modify any\n  # boolean values in here. But we could change the Go type (the most useful thing)\n  # or the DBType or FullDBType etc. if for some reason we needed to.\n  [types.replace]\n    type = \"mynull.String\"\n\n  # These imports specified here overwrite the definition of the type's \"based_on_type\"\n  # list. The type entry that is replaced is the replaced type's \"type\" field.\n  # In the above example it would add an entry for mynull.String, if we did not\n  # change the type in our replacement, it would overwrite the null.String entry.\n  [types.imports]\n    third_party = ['\"github.com/me/mynull\"']\n```\n\n##### Imports\n\nImports are overridable by the user. This can be used in conjunction with\nreplacing the templates for extreme cases. Typically this should be avoided.\n\nNote that specifying any section of the imports completely overwrites that\nsection. 
It's also true that the driver can still specify imports and those\nwill be merged into what is provided here.\n\n```toml\n[imports.all]\n  standard = ['\"context\"']\n  third_party = ['\"github.com/my/package\"']\n\n# Changes imports for the boil_queries file\n[imports.singleton.\"boil_queries\"]\n  standard = ['\"context\"']\n  third_party = ['\"github.com/my/package\"']\n\n# Same syntax as all\n[imports.test]\n\n# Same syntax as singleton\n[imports.test_singleton]\n\n# Changes imports when a model contains the type string\n[imports.based_on_type.string]\n  standard = ['\"context\"']\n  third_party = ['\"github.com/my/package\"']\n```\n\nWhen defining maps it's possible to use an alternative syntax since\nviper automatically lowercases all configuration keys (same as aliases).\n\n```toml\n[[imports.singleton]]\n  name = \"boil_queries\"\n  third_party = ['\"github.com/my/package\"']\n\n[[imports.based_on_type]]\n  name = \"null.Int64\"\n  third_party = ['\"github.com/my/int64\"']\n```\n\n##### Templates\n\nIn advanced scenarios it may be desirable to generate additional files that are not go code.\nYou can accomplish this by using the `--templates` flag to specify **all** the directories you\nwish to generate code for. With this flag you specify root directories, that is top-level container\ndirectories.\n\nIf a root directory has a `_test` suffix in its name, it is considered a folder\nfull of templates for testing only; it will be omitted when `--no-tests` is specified, and\nits templates will be generated into files with a `_test` suffix.\n\nEach root directory is recursively walked. Each template found will be merged into table_name.ext\nwhere ext is defined by the shared extension of the templates. The directory structure is preserved\nwith the exception of singletons.\n\nFor files that should not be generated for each model, you can use a `singleton` directory inside\nthe directory where the singleton file should be generated. 
This will make sure that the file is\nonly generated once.\n\nHere's an example:\n\n```text\ntemplates/\n├── 00_struct.go.tpl               # Merged into output_dir/table_name.go\n├── 00_struct.js.tpl               # Merged into output_dir/table_name.js\n├── singleton\n│   └── boil_queries.go.tpl        # Rendered as output_dir/boil_queries.go\n└── js\n    ├── jsmodel.js.tpl             # Merged into output_dir/js/table_name.js\n    └── singleton\n        └── jssingle.js.tpl        # Merged into output_dir/js/jssingle.js\n```\n\nThe resulting output files would be:\n\n```\noutput_dir/\n├── boil_queries.go\n├── table_name.go\n├── table_name.js\n└── js\n    ├── table_name.js\n    └── jssingle.js\n```\n\n**Note**: Because the `--templates` flag overrides the embedded templates of `sqlboiler`, if you still\nwish to generate the default templates it's recommended that you include the path to sqlboiler's templates\nas well.\n\n```toml\ntemplates = [\n  \"/path/to/sqlboiler/templates\",\n  \"/path/to/sqlboiler/templates_test\",\n  \"/path/to/your_project/more_templates\"\n]\n```\n\n#### Extending generated models\n\nThere will probably come a time when you want to extend the generated models\nwith some kinds of helper functions. A general guideline is to put your\nextension functions into a separate package so that your functions aren't\naccidentally deleted when regenerating. Past that, there are 3 main ways to\nextend the models; the first is the most desirable:\n\n**Method 1: Simple Functions**\n\n```go\n// Package modext is for SQLBoiler helper methods\npackage modext\n\n// UserFirstTimeSetup is an extension of the user model.\nfunc UserFirstTimeSetup(ctx context.Context, db *sql.DB, u *models.User) error { ... 
}\n```\n\nCode organization is accomplished by using multiple files, and everything\nis passed as a parameter so these kinds of methods are very easy to test.\n\nCalling code is also very straightforward:\n\n```go\nuser, err := Users().One(ctx, db)\n// elided error check\n\nerr = modext.UserFirstTimeSetup(ctx, db, user)\n// elided error check\n```\n\n**Method 2: Empty struct methods**\n\nThe above is the best way to code extensions for SQLBoiler, however there may\nbe times when the number of methods grows too large and code completion is\nnot as helpful anymore. In these cases you may consider structuring the code\nlike this:\n\n```go\n// Package modext is for SQLBoiler helper methods\npackage modext\n\ntype users struct {}\n\nvar Users = users{}\n\n// FirstTimeSetup is an extension of the user model.\nfunc (users) FirstTimeSetup(ctx context.Context, db *sql.DB, u *models.User) error { ... }\n```\n\nCalling code then looks a little bit different:\n\n```go\nuser, err := Users().One(ctx, db)\n// elided error check\n\nerr = modext.Users.FirstTimeSetup(ctx, db, user)\n// elided error check\n```\n\nThis is almost identical to the method above, but gives slightly more\norganization. It is, however, not as desirable as the first method since it\ncarries a small runtime cost and doesn't offer much benefit over it.\n\n**Method 3: Embedding**\n\nThis pattern is not for the faint of heart; its downsides more than outweigh\nits benefits. It's possible to embed the SQLBoiler\nstructs inside your own to enhance them. However it's subject to easy breakages\nand a dependency on these additional objects. 
It can also introduce\ninconsistencies as some objects may have no extended functionality and therefore\nhave no reason to be embedded, so you either need a struct for each\ngenerated struct even if it's empty, or you accept inconsistencies: some places\nuse the enhanced model, and some do not.\n\n```go\nuser, err := Users().One(ctx, db)\n// elided error check\n\nenhUser := modext.User{user}\nerr = enhUser.FirstTimeSetup(ctx, db)\n// elided error check\n```\n\nI don't recommend this pattern, but included it so that people know it's an\noption and also know the problems with it.\n\n## Diagnosing Problems\n\nThe most common causes of problems and panics are:\n\n- Forgetting to exclude tables you do not want included in your generation, like migration tables.\n- Tables without a primary key. All tables require one.\n- Forgetting to put foreign key constraints on your columns that reference other tables.\n- The compatibility tests require privileges to create a database for testing purposes, ensure the user\n  supplied in your `sqlboiler.toml` config has adequate privileges.\n- A nil or closed database handle. Ensure your passed in `boil.Executor` is not nil.\n  - If you decide to use the `G` variant of functions instead, make sure you've initialized your\n    global database handle using `boil.SetDB()`.\n- Naming collisions, if the code fails to compile because there are naming collisions, look at the\n  [aliasing](#aliases) feature.\n- Race conditions in tests or when using global variable models and using\n  relationship set helpers in multiple goroutines. Note that Set/Add/Remove\n  relationship helpers modify their input parameters to maintain parity between\n  the `.R` struct relationships and the database foreign keys but this can\n  produce subtle race conditions. 
Test for this using the `-race` flag on the\n  go tool.\n- A field not being inserted (usually a default true boolean), `boil.Infer` looks at the zero\n  value of your Go type (it doesn't care what the default value in the database is) to determine\n  if it should insert your field or not. In the case of a default true boolean value, when you\n  want to set it to false, you set that in the struct, but that's the zero value for the bool\n  field in Go so sqlboiler assumes you do not want to insert that field and you want the default\n  value from the database. Use a whitelist/greylist to add that field to the list of fields\n  to insert.\n- decimal library showing errors like: `pq: encode: unknown type types.NullDecimal`\n  is a result of a too-new and broken version of the github.com/ericlagergren/decimal\n  package, use the following version in your go.mod:\n  github.com/ericlagergren/decimal v0.0.0-20181231230500-73749d4874d5\n\nFor errors with other causes, it may be simple to debug yourself by looking at the generated code.\nSetting `boil.DebugMode` to `true` can help with this. 
You can change the output using `boil.DebugWriter` (defaults to `os.Stdout`).\n\nIf you're still stuck and/or you think you've found a bug, feel free to leave an issue and we'll do our best to help you.\n\n## Features \u0026 Examples\n\nMost examples in this section will be demonstrated using the following Postgres schema, structs and variables:\n\n```sql\nCREATE TABLE pilots (\n  id integer NOT NULL,\n  name text NOT NULL\n);\n\nALTER TABLE pilots ADD CONSTRAINT pilot_pkey PRIMARY KEY (id);\n\nCREATE TABLE jets (\n  id integer NOT NULL,\n  pilot_id integer NOT NULL,\n  age integer NOT NULL,\n  name text NOT NULL,\n  color text NOT NULL\n);\n\nALTER TABLE jets ADD CONSTRAINT jet_pkey PRIMARY KEY (id);\nALTER TABLE jets ADD CONSTRAINT jet_pilots_fkey FOREIGN KEY (pilot_id) REFERENCES pilots(id);\n\nCREATE TABLE languages (\n  id integer NOT NULL,\n  language text NOT NULL\n);\n\nALTER TABLE languages ADD CONSTRAINT language_pkey PRIMARY KEY (id);\n\n-- Join table\nCREATE TABLE pilot_languages (\n  pilot_id integer NOT NULL,\n  language_id integer NOT NULL\n);\n\n-- Composite primary key\nALTER TABLE pilot_languages ADD CONSTRAINT pilot_language_pkey PRIMARY KEY (pilot_id, language_id);\nALTER TABLE pilot_languages ADD CONSTRAINT pilot_language_pilots_fkey FOREIGN KEY (pilot_id) REFERENCES pilots(id);\nALTER TABLE pilot_languages ADD CONSTRAINT pilot_language_languages_fkey FOREIGN KEY (language_id) REFERENCES languages(id);\n```\n\nThe generated model structs for this schema look like the following. 
Note that we've included the relationship\nstructs as well so you can see how it all pieces together:\n\n```go\ntype Pilot struct {\n  ID   int    `boil:\"id\" json:\"id\" toml:\"id\" yaml:\"id\"`\n  Name string `boil:\"name\" json:\"name\" toml:\"name\" yaml:\"name\"`\n\n  R *pilotR `boil:\"-\" json:\"-\" toml:\"-\" yaml:\"-\"`\n  L pilotL  `boil:\"-\" json:\"-\" toml:\"-\" yaml:\"-\"`\n}\n\ntype pilotR struct {\n  Languages LanguageSlice\n  Jets      JetSlice\n}\n\ntype Jet struct {\n  ID      int    `boil:\"id\" json:\"id\" toml:\"id\" yaml:\"id\"`\n  PilotID int    `boil:\"pilot_id\" json:\"pilot_id\" toml:\"pilot_id\" yaml:\"pilot_id\"`\n  Age     int    `boil:\"age\" json:\"age\" toml:\"age\" yaml:\"age\"`\n  Name    string `boil:\"name\" json:\"name\" toml:\"name\" yaml:\"name\"`\n  Color   string `boil:\"color\" json:\"color\" toml:\"color\" yaml:\"color\"`\n\n  R *jetR `boil:\"-\" json:\"-\" toml:\"-\" yaml:\"-\"`\n  L jetL  `boil:\"-\" json:\"-\" toml:\"-\" yaml:\"-\"`\n}\n\ntype jetR struct {\n  Pilot *Pilot\n}\n\ntype Language struct {\n  ID       int    `boil:\"id\" json:\"id\" toml:\"id\" yaml:\"id\"`\n  Language string `boil:\"language\" json:\"language\" toml:\"language\" yaml:\"language\"`\n\n  R *languageR `boil:\"-\" json:\"-\" toml:\"-\" yaml:\"-\"`\n  L languageL  `boil:\"-\" json:\"-\" toml:\"-\" yaml:\"-\"`\n}\n\ntype languageR struct {\n  Pilots PilotSlice\n}\n```\n\n```go\n// Open handle to database like normal\ndb, err := sql.Open(\"postgres\", \"dbname=fun user=abc\")\nif err != nil {\n  return err\n}\n```\n\n### Automatic CreatedAt/UpdatedAt\n\nIf your generated SQLBoiler models package can find columns with the\nnames `created_at` or `updated_at` it will automatically set them\nto `time.Now()` in your database, and update your object appropriately.\nTo disable this feature use `--no-auto-timestamps`.\n\nNote: You can set the timezone for this feature by calling `boil.SetLocation()`\n\n#### Customizing the timestamp columns\n\nSet the 
`auto-columns` map in your configuration file\n\n```toml\n[auto-columns]\n    created = \"createdAt\"\n    updated = \"updatedAt\"\n```\n\n#### Skipping Automatic Timestamps\n\nIf for a given query you do not want timestamp columns to be re-computed prior\nto an insert or update then you can use `boil.SkipTimestamps` on the context you\npass in to the query to prevent them from being updated.\n\nKeep in mind this has no effect on whether or not the column is included in the\ninsert/update, it simply stops them from being set to `time.Now()` in the struct\nbefore being sent to the database (if they were going to be sent).\n\n#### Overriding Automatic Timestamps\n\n- **Insert**\n  - Timestamps for both `updated_at` and `created_at` that are zero values will be set automatically.\n  - To set the timestamp to null, set `Valid` to false and `Time` to a non-zero value.\n    This is somewhat of a workaround until we can devise a better solution in a later version.\n- **Update**\n  - The `updated_at` column will always be set to `time.Now()`. If you need to override\n    this value you will need to fall back to another method in the meantime: using `queries.Raw()`,\n    overriding `updated_at` in all of your objects using a hook, or creating your own wrapper.\n- **Upsert**\n  - `created_at` will be set automatically if it is a zero value, otherwise your supplied value\n    will be used. To set `created_at` to `null`, set `Valid` to false and `Time` to a non-zero value.\n  - The `updated_at` column will always be set to `time.Now()`.\n\n### Automatic DeletedAt (Soft Delete)\n\nSoft deletes are a way of hiding deleted records from everyday queries\nwithout actually removing the data. This is important in certain\nscenarios where data retention is required. 
It is typically done by adding a\n`deleted` bool or a `deleted_at` timestamp to each table in the database\nthat can be soft deleted and subsequent queries on that table should always\nmake sure that `deleted != true` or `deleted_at is null` to prevent showing\n\"deleted\" data.\n\nSQLBoiler uses the `deleted_at` variant to provide this functionality. If your\ntable has a nullable timestamp field named `deleted_at` it will be a candidate\nfor soft-deletion.\n\n_NOTE_: As of writing soft-delete is opt-in via `--add-soft-deletes` and is\nliable to change in future versions.\n\n_NOTE_: There is a query mod to bypass soft delete for a specific query by using\n`qm.WithDeleted`, note that there is no way to do this for Exists/Find helpers\nyet.\n\n_NOTE_: The `Delete` helpers will _not_ set `updated_at` currently. The current\nphilosophy is that the deletion itself is simply metadata and, since the row\nreturns in no queries (other than raw ones), the `updated_at` value is no longer\nrelevant.\nThis could change in future versions if people disagree with this but it is\nthe current behavior.\n\n### Query Building\n\nWe generate \"Starter\" methods for you. These methods are named as the plural versions of your model,\nfor example: `models.Jets()`. Starter methods are used to build queries using our\n[Query Mod System](#query-mod-system). 
They take a slice of [Query Mods](#query-mod-system)\nas parameters, and end with a call to a [Finisher](#finishers) method.\n\nHere are a few examples:\n\n```go\n// SELECT COUNT(*) FROM pilots;\ncount, err := models.Pilots().Count(ctx, db)\n\n// SELECT * FROM \"pilots\" LIMIT 5;\npilots, err := models.Pilots(qm.Limit(5)).All(ctx, db)\n\n// DELETE FROM \"pilots\" WHERE \"id\"=$1;\nerr := models.Pilots(qm.Where(\"id=?\", 1)).DeleteAll(ctx, db)\n// type safe version of above\nerr := models.Pilots(models.PilotWhere.ID.EQ(1)).DeleteAll(ctx, db)\n```\n\nIn the event that you would like to build a query and specify the table yourself, you\ncan do so using `models.NewQuery()`:\n\n```go\n// Select all rows from the pilots table by using the From query mod.\nerr := models.NewQuery(qm.From(\"pilots\")).All(ctx, db)\n```\n\nAs you can see, [Query Mods](#query-mod-system) allow you to modify your\nqueries, and [Finishers](#finishers) allow you to execute the final action.\n\nWe also generate query building helper methods for your relationships as well. Take a look at our\n[Relationships Query Building](#relationships) section for some additional query building information.\n\n### Query Mod System\n\nThe query mod system allows you to modify queries created with\n[Starter](#query-building) methods when performing query building.\nSee examples below.\n\n**NOTE:** SQLBoiler generates type-safe identifiers based on your database\ntables, columns and relationships. Using these is a bit more verbose, but is\nespecially safe since when the names change in the database the generated\ncode will be different causing compilation failures instead of runtime\nerrors. It is highly recommended you use these instead of regular strings.\nSee [Constants](#constants) for more details.\n\n**NOTE:** You will notice that there is printf used below mixed with SQL\nstatements. 
This is normally NOT OK if the user is able to supply any of\nthe sql string, but here we always use a `?` placeholder and pass arguments\nso that the only thing that's being printf'd are constants which makes it\nsafe, but be careful!\n\n```go\n// Dot import so we can access query mods directly instead of prefixing with \"qm.\"\nimport . \"github.com/aarondl/sqlboiler/v4/queries/qm\"\n\n// Use a raw query against a generated struct (Pilot in this example)\n// If this query mod exists in your call, it will override the others.\n// \"?\" placeholders are not supported here, use \"$1, $2\" etc.\nSQL(\"select * from pilots where id=$1\", 10)\nmodels.Pilots(SQL(\"select * from pilots where id=$1\", 10)).All()\n\nSelect(\"id\", \"name\") // Select specific columns.\nSelect(models.PilotColumns.ID, models.PilotColumns.Name)\nFrom(\"pilots as p\") // Specify the FROM table manually, can be useful for doing complex queries.\nFrom(models.TableNames.Pilots + \" as p\")\n\n// WHERE clause building\nWhere(\"name=?\", \"John\")\nmodels.PilotWhere.Name.EQ(\"John\")\nAnd(\"age=?\", 24)\n// No equivalent type safe query yet\nOr(\"height=?\", 183)\n// No equivalent type safe query yet\n\nWhere(\"(name=? and age=?) 
or (age=?)\", \"John\", 5, 6)\n// Expr allows manual grouping of statements\nWhere(\n  Expr(\n    models.PilotWhere.Name.EQ(\"John\"),\n    Or2(models.PilotWhere.Age.EQ(5)),\n  ),\n  Or2(models.PilotWhere.Age.EQ(6)),\n)\n\n// WHERE IN clause building\nWhereIn(\"(name, age) in ?\", \"John\", 24, \"Tim\", 33) // Generates: WHERE (\"name\",\"age\") IN (($1,$2),($3,$4))\nWhereIn(fmt.Sprintf(\"(%s, %s) in ?\", models.PilotColumns.Name, models.PilotColumns.Age), \"John\", 24, \"Tim\", 33)\nAndIn(\"weight in ?\", 84)\nAndIn(models.PilotColumns.Weight + \" in ?\", 84)\nOrIn(\"height in ?\", 183, 177, 204)\nOrIn(models.PilotColumns.Height + \" in ?\", 183, 177, 204)\n\nInnerJoin(\"pilots p on jets.pilot_id=?\", 10)\nInnerJoin(models.TableNames.Pilots + \" p on \" + models.TableNames.Jets + \".\" + models.JetColumns.PilotID + \"=?\", 10)\n\nGroupBy(\"name\")\nGroupBy(\"name like ? DESC, name\", \"John\")\nGroupBy(models.PilotColumns.Name)\nOrderBy(\"age, height\")\nOrderBy(models.PilotColumns.Age, models.PilotColumns.Height)\n\nHaving(\"count(jets) \u003e 2\")\nHaving(fmt.Sprintf(\"count(%s) \u003e 2\", models.TableNames.Jets))\n\nLimit(15)\nOffset(5)\n\n// Explicit locking\nFor(\"update nowait\")\n\n// Common Table Expressions\nWith(\"cte_0 AS (SELECT * FROM table_0 WHERE thing=$1 AND stuff=$2)\")\n\n// Eager Loading -- Load takes the relationship name, ie the struct field name of the\n// Relationship struct field you want to load. Optionally also takes query mods to filter on that query.\nLoad(\"Languages\", Where(...)) // If it's a ToOne relationship it's in singular form, ToMany is plural.\nLoad(models.PilotRels.Languages, Where(...))\n```\n\nNote: We don't force you to break queries apart like this if you don't want to, the following\nis also valid and supported by query mods that take a clause:\n\n```go\nWhere(\"(name=? OR age=?) 
AND height=?\", \"John\", 24, 183)\n```\n\n### Function Variations\n\nFunctions can have variations generated for them by using the flags\n`--add-global-variants` and `--add-panic-variants`. Once you've used these\nflags or set the appropriate values in your configuration file extra method\noverloads will be generated. We've used the `Delete` method to demonstrate:\n\n```go\n// Set the global db handle for G method variants.\nboil.SetDB(db)\n\npilot, _ := models.FindPilot(ctx, db, 1)\n\nerr := pilot.Delete(ctx, db) // Regular variant, takes a db handle (boil.Executor interface).\npilot.DeleteP(ctx, db)       // Panic variant, takes a db handle and panics on error.\nerr := pilot.DeleteG(ctx)    // Global variant, uses the globally set db handle (boil.SetDB()).\npilot.DeleteGP(ctx)          // Global\u0026Panic variant, combines the global db handle and panic on error.\n\ndb.Begin()                   // Normal sql package way of creating a transaction\nboil.BeginTx(ctx, nil)       // Uses the global database handle set by boil.SetDB() (doesn't require flag)\n```\n\nNote that it's slightly different for query building.\n\n### Finishers\n\nHere is a list of all of the finishers that can be used in combination with\n[Query Building](#query-building).\n\nFinishers all have `P` (panic) [method variations](#function-variations). 
To specify\nyour db handle use the `G` or regular variation of the [Starter](#query-building) method.\n\n```go\n// These are called like the following:\nmodels.Pilots().All(ctx, db)\n\nOne() // Retrieve one row as object (same as LIMIT(1))\nAll() // Retrieve all rows as objects (same as SELECT * FROM)\nCount() // Number of rows (same as COUNT(*))\nUpdateAll(models.M{\"name\": \"John\", \"age\": 23}) // Update all rows matching the built query.\nDeleteAll() // Delete all rows matching the built query.\nExists() // Returns a bool indicating whether the row(s) for the built query exists.\nBind(\u0026myObj) // Bind the results of a query to your own struct object.\nExec() // Execute an SQL query that does not require any rows returned.\nQueryRow() // Execute an SQL query expected to return only a single row.\nQuery() // Execute an SQL query expected to return multiple rows.\n```\n\n### Raw Query\n\nWe provide `queries.Raw()` for executing raw queries. Generally you will want to use `Bind()` with\nthis, like the following:\n\n```go\nerr := queries.Raw(\"select * from pilots where id=$1\", 5).Bind(ctx, db, \u0026obj)\n```\n\nYou can use your own structs or a generated struct as a parameter to Bind. 
Bind supports both\na single object for single row queries and a slice of objects for multiple row queries.\n\n`queries.Raw()` also has a method that can execute a query without binding to an object, if required.\n\nYou also have `models.NewQuery()` at your disposal if you would still like to use [Query Building](#query-building)\nin combination with your own custom, non-generated model.\n\n### Binding\n\nFor a comprehensive ruleset for `Bind()` you can refer to our [pkg.go.dev](https://pkg.go.dev/github.com/aarondl/sqlboiler/v4/queries#Bind).\n\nThe `Bind()` [Finisher](#finishers) allows the results of a query built with\nthe [Raw SQL](#raw-query) method or the [Query Builder](#query-building) methods to be bound\nto your generated struct objects, or your own custom struct objects.\n\nThis can be useful for complex queries, queries that only require a small subset of data\nand have no need for the rest of the object variables, or custom join struct objects like\nthe following:\n\n```go\n// Custom struct using two generated structs\ntype PilotAndJet struct {\n  models.Pilot `boil:\",bind\"`\n  models.Jet   `boil:\",bind\"`\n}\n\nvar paj PilotAndJet\n// Use a raw query\nerr := queries.Raw(`\n  select pilots.id as \"pilots.id\", pilots.name as \"pilots.name\",\n  jets.id as \"jets.id\", jets.pilot_id as \"jets.pilot_id\",\n  jets.age as \"jets.age\", jets.name as \"jets.name\", jets.color as \"jets.color\"\n  from pilots inner join jets on jets.pilot_id=?`, 23,\n).Bind(ctx, db, \u0026paj)\n\n// Use query building\nerr := models.NewQuery(\n  Select(\"pilots.id\", \"pilots.name\", \"jets.id\", \"jets.pilot_id\", \"jets.age\", \"jets.name\", \"jets.color\"),\n  From(\"pilots\"),\n  InnerJoin(\"jets on jets.pilot_id = pilots.id\"),\n).Bind(ctx, db, \u0026paj)\n```\n\n```go\n// Custom struct for selecting a subset of data\ntype JetInfo struct {\n  AgeSum int `boil:\"age_sum\"`\n  Count int `boil:\"juicy_count\"`\n}\n\nvar info JetInfo\n\n// Use query building\nerr := 
models.NewQuery(Select(\"sum(age) as age_sum\", \"count(*) as juicy_count\"), From(\"jets\")).Bind(ctx, db, \u0026info)\n\n// Use a raw query\nerr := queries.Raw(`select sum(age) as \"age_sum\", count(*) as \"juicy_count\" from jets`).Bind(ctx, db, \u0026info)\n```\n\nWe support the following struct tag modes for `Bind()` control:\n\n```go\ntype CoolObject struct {\n  // Don't specify a name, Bind will TitleCase the column\n  // name, and try to match against this.\n  Frog int\n\n  // Specify an alternative name for the column, it will\n  // be titlecased for matching, and can be whatever you like.\n  Cat int  `boil:\"kitten\"`\n\n  // Ignore this struct field, do not attempt to bind it.\n  Pig int  `boil:\"-\"`\n\n  // Instead of binding to this as a regular struct field\n  // (like other sql-able structs eg. time.Time)\n  // Recursively search inside the Dog struct for field names from the query.\n  Dog      `boil:\",bind\"`\n\n  // Same as the above, except specify a different table name\n  Mouse    `boil:\"rodent,bind\"`\n\n  // Ignore this struct field, do not attempt to bind it.\n  Bird     `boil:\"-\"`\n}\n```\n\n### Relationships\n\nHelper methods will be generated for every to-one and to-many relationship structure\nyou have defined in your database by using foreign keys.\n\nWe attach these helpers directly to your model struct, for example:\n\n```go\njet, _ := models.FindJet(ctx, db, 1)\n\n// \"to one\" relationship helper method.\n// This will retrieve the pilot for the jet.\npilot, err := jet.Pilot().One(ctx, db)\n\n// \"to many\" relationship helper method.\n// This will retrieve all languages for the pilot.\nlanguages, err := pilot.Languages().All(ctx, db)\n```\n\nIf your relationship involves a join table SQLBoiler will figure it out for you transparently.\n\nIt is important to note that you should use `Eager Loading` if you plan\non loading large collections of rows, to avoid N+1 performance problems.\n\nFor example, take the following:\n\n```go\n// 
Avoid this loop query pattern, it is slow.\njets, _ := models.Jets().All(ctx, db)\npilots := make([]models.Pilot, len(jets))\nfor i := 0; i \u003c len(jets); i++ {\n  pilots[i] = *jets[i].Pilot().OneP(ctx, db)\n}\n\n// Instead, use Eager Loading!\njets, _ := models.Jets(Load(\"Pilot\")).All(ctx, db)\n// Type safe relationship names exist too:\njets, _ := models.Jets(Load(models.JetRels.Pilot)).All(ctx, db)\n\n// Then access the loaded structs using the special Relation field\nfor _, j := range jets {\n  _ = j.R.Pilot\n}\n```\n\nEager loading can be combined with other query mods, and it can also eager load recursively.\n\n```go\n// Example of a nested load.\n// Each jet will have its pilot loaded, and each pilot will have its languages loaded.\njets, _ := models.Jets(Load(\"Pilot.Languages\")).All(ctx, db)\n// Note that each level of a nested Load call will be loaded. No need to call Load() multiple times.\n\n// Type safe queries exist for this too!\njets, _ := models.Jets(Load(Rels(models.JetRels.Pilot, models.PilotRels.Languages))).All(ctx, db)\n\n// A larger example. 
In the below scenario, Pets will only be queried one time, despite\n// showing up twice because they're the same query (the user's pets)\nusers, _ := models.Users(\n  Load(\"Pets.Vets\"),\n  // the query mods passed in below only affect the query for Toys\n  // to use query mods against Pets itself, you must declare it separately\n  Load(\"Pets.Toys\", Where(\"toys.deleted = ?\", isDeleted)),\n  Load(\"Property\"),\n  Where(\"age \u003e ?\", 23),\n).All(ctx, db)\n```\n\nWe provide the following methods for managing relationships on objects:\n\n**To One**\n\n- `SetX()`: Set the foreign key to point to something else: jet.SetPilot(...)\n- `RemoveX()`: Null out the foreign key, effectively removing the relationship between these two objects: jet.RemovePilot(...)\n\n**To Many**\n\n- `AddX()`: Add more relationships to the existing set of related Xs: pilot.AddLanguages(...)\n- `SetX()`: Remove all existing relationships, and replace them with the provided set: pilot.SetLanguages(...)\n- `RemoveX()`: Remove all provided relationships: pilot.RemoveLanguages(...)\n\n**Important**: Remember to use transactions around these set helpers for performance\nand data integrity. SQLBoiler does not do this automatically due to its transparent API which allows\nyou to batch any number of calls in a transaction without spawning subtransactions you don't know\nabout or that are not supported by your database.\n\n**To One** code examples:\n\n```go\n  jet, _ := models.FindJet(ctx, db, 1)\n  pilot, _ := models.FindPilot(ctx, db, 1)\n\n  // Set the pilot to an existing jet\n  err := jet.SetPilot(ctx, db, false, pilot)\n\n  pilot = \u0026models.Pilot{\n    Name: \"Erlich\",\n  }\n\n  // Insert the pilot into the database and assign it to a jet\n  err := jet.SetPilot(ctx, db, true, pilot)\n\n  // Remove a relationship. 
This method only exists for foreign keys that can be NULL.\n  err := jet.RemovePilot(ctx, db, \u0026pilot)\n```\n\n**To Many** code examples:\n\n```go\n  pilot, _ := models.FindPilot(ctx, db, 1)\n  languages, _ := models.Languages().All(ctx, db)\n\n  // Set a group of language relationships\n  err := pilot.SetLanguages(ctx, db, false, languages...)\n\n  languages := []*models.Language{\n    {Language: \"Strayan\"},\n    {Language: \"Yupik\"},\n    {Language: \"Pawnee\"},\n  }\n\n  // Insert a new group of languages and assign them to a pilot\n  err := pilot.SetLanguages(ctx, db, true, languages...)\n\n  // Add another language relationship to the existing set of relationships\n  err := pilot.AddLanguages(ctx, db, false, \u0026someOtherLanguage)\n\n  anotherLanguage := models.Language{Language: \"Archi\"}\n\n  // Insert and then add another language relationship\n  err := pilot.AddLanguages(ctx, db, true, \u0026anotherLanguage)\n\n  // Remove a group of relationships\n  err := pilot.RemoveLanguages(ctx, db, languages...)\n```\n\n### Hooks\n\nBefore and After hooks are available for most operations. If you don't need them you can\nshrink the size of the generated code by disabling them with the `--no-hooks` flag.\n\nEvery generated package that includes hooks has the following `HookPoints` defined:\n\n```go\nconst (\n  BeforeInsertHook HookPoint = iota + 1\n  BeforeUpdateHook\n  BeforeDeleteHook\n  BeforeUpsertHook\n  AfterInsertHook\n  AfterSelectHook\n  AfterUpdateHook\n  AfterDeleteHook\n  AfterUpsertHook\n)\n```\n\nTo register a hook for your model you will need to create the hook function and attach\nit with the `AddModelHook` method. 
Here is an example of a before insert hook:\n\n```go\n// Define my hook function\nfunc myHook(ctx context.Context, exec boil.ContextExecutor, p *Pilot) error {\n  // Do stuff\n  return nil\n}\n\n// Register my before insert hook for pilots\nmodels.AddPilotHook(boil.BeforeInsertHook, myHook)\n```\n\nYour `ModelHook` will always be defined as `func(context.Context, boil.ContextExecutor, *Model) error` if context is not turned off.\n\n#### Skipping Hooks\n\nYou can skip hooks by using `boil.SkipHooks` on the context you pass in\nto a given query.\n\n### Transactions\n\nThe `boil.Executor` and `boil.ContextExecutor` interfaces power all of SQLBoiler. This means\nanything that conforms to the three `Exec/Query/QueryRow` methods (and their context-aware variants)\ncan be used to execute queries. `sql.DB` and `sql.Tx`, as well as other\nlibraries (`sqlx`), conform to this interface, and therefore any of them may be\nused as an executor for any query in the system. This makes using transactions very simple:\n\n```go\ntx, err := db.BeginTx(ctx, nil)\nif err != nil {\n  return err\n}\n\npilots, _ := models.Pilots().All(ctx, tx)\npilots.DeleteAll(ctx, tx)\n\n// Rollback or commit\ntx.Commit()\ntx.Rollback()\n```\n\nIt's also worth noting that there's a way to take advantage of `boil.SetDB()`\nby using the\n[boil.BeginTx()](https://pkg.go.dev/github.com/aarondl/sqlboiler/v4/boil#BeginTx)\nfunction. This opens a transaction using the globally stored database.\n\n### Debug Logging\n\nDebug logging will print your generated SQL statement and the arguments it is using.\nDebug logging can be toggled on globally by setting the following global variable to `true`:\n\n```go\nboil.DebugMode = true\n\n// Optionally set the writer as well. Defaults to os.Stdout\nfh, _ := os.Create(\"debug.txt\")\nboil.DebugWriter = fh\n```\n\nNote: Debug output is messy at the moment. 
This is something we would like addressed.\n\n### Select\n\nSelect is done through [Query Building](#query-building) and [Find](#find). Here's a short example:\n\n```go\n// Select one pilot\npilot, err := models.Pilots(qm.Where(\"name=?\", \"Tim\")).One(ctx, db)\n// Type safe variant\npilot, err := models.Pilots(models.PilotWhere.Name.EQ(\"Tim\")).One(ctx, db)\n\n// Select specific columns of many jets\njets, err := models.Jets(qm.Select(\"age\", \"name\")).All(ctx, db)\n// Type safe variant\njets, err := models.Jets(qm.Select(models.JetColumns.Age, models.JetColumns.Name)).All(ctx, db)\n```\n\n### Find\n\nFind is used to find a single row by primary key:\n\n```go\n// Retrieve pilot with all columns filled\npilot, err := models.FindPilot(ctx, db, 1)\n\n// Retrieve a subset of column values\njet, err := models.FindJet(ctx, db, 1, \"name\", \"color\")\n```\n\n### Insert\n\nThe main thing to be aware of with `Insert` is how the `columns` argument\noperates. You can supply one of the following column lists:\n`boil.Infer`, `boil.Whitelist`, `boil.Blacklist`, or `boil.Greylist`.\n\nThese lists control what fields are inserted into the database, and what values\nare returned to your struct from the database (default, auto-incrementing, and\ntrigger-based columns are candidates for this). Your struct will have those\nvalues after the insert is complete.\n\nWhen you use inference, `sqlboiler` looks at your Go struct field values; if\na field value is the Go zero value and that field has a default value in the\ndatabase, it will not insert that field and will instead get the value from the\ndatabase. Keep in mind `sqlboiler` cannot read or understand your default\nvalues set in the database, so the Go zero value is what's important here (this\ncan be especially troubling for default `true` bool fields). 
Use a whitelist or\ngreylist in cases where you want to insert a Go zero value.\n\n| Column List | Behavior                                                         |\n| ----------- | ---------------------------------------------------------------- |\n| Infer       | Infer the column list using \"smart\" rules                        |\n| Whitelist   | Insert only the columns specified in this list                   |\n| Blacklist   | Infer the column list, but ensure these columns are not inserted |\n| Greylist    | Infer the column list, but ensure these columns are inserted     |\n\n**NOTE:** CreatedAt/UpdatedAt are not included in `Whitelist` automatically.\n\nSee the documentation for\n[boil.Columns.InsertColumnSet](https://pkg.go.dev/github.com/aarondl/sqlboiler/v4/boil/#Columns.InsertColumnSet)\nfor more details.\n\n```go\nvar p1 models.Pilot\np1.Name = \"Larry\"\nerr := p1.Insert(ctx, db, boil.Infer()) // Insert the first pilot with name \"Larry\"\n// p1 now has an ID field set to 1\n\nvar p2 models.Pilot\np2.Name = \"Boris\"\nerr := p2.Insert(ctx, db, boil.Infer()) // Insert the second pilot with name \"Boris\"\n// p2 now has an ID field set to 2\n\nvar p3 models.Pilot\np3.ID = 25\np3.Name = \"Rupert\"\nerr := p3.Insert(ctx, db, boil.Infer()) // Insert the third pilot with a specific ID\n// The id for this row was inserted as 25 in the database.\n\nvar p4 models.Pilot\np4.ID = 0\np4.Name = \"Nigel\"\nerr := p4.Insert(ctx, db, boil.Whitelist(\"id\", \"name\")) // Insert the fourth pilot with a zero value ID\n// The id for this row was inserted as 0 in the database.\n// Note: We had to use the whitelist for this, otherwise\n// SQLBoiler would presume you wanted to auto-increment\n```\n\n### Update\n\n`Update` can be performed on a single object, a slice of objects or as a [Finisher](#finishers)\nfor a collection of rows.\n\n`Update` on a single object optionally takes a `whitelist`. 
The purpose of the\nwhitelist is to specify which columns in your object should be updated in the database.\n\nLike `Insert`, this method also takes a `Columns` type, but the behavior is\nslightly different. Although the descriptions below look similar, the full\ndocumentation reveals the differences. Note that all inference is based on\nthe Go type's zero value and not the database default value; read the `Insert`\ndocumentation above for more details.\n\n| Column List | Behavior                                                                     |\n| ----------- | ---------------------------------------------------------------------------- |\n| Infer       | Infer the column list using \"smart\" rules                                    |\n| Whitelist   | Update only the columns specified in this list                               |\n| Blacklist   | Infer the column list for updating, but ensure these columns are not updated |\n| Greylist    | Infer the column list, but ensure these columns are updated                  |\n\n**NOTE:** CreatedAt/UpdatedAt are not included in `Whitelist` automatically.\n\nSee the documentation for\n[boil.Columns.UpdateColumnSet](https://pkg.go.dev/github.com/aarondl/sqlboiler/v4/boil/#Columns.UpdateColumnSet)\nfor more details.\n\n```go\n// Find a pilot and update his name\npilot, _ := models.FindPilot(ctx, db, 1)\npilot.Name = \"Neo\"\nrowsAff, err := pilot.Update(ctx, db, boil.Infer())\n\n// Update a slice of pilots to have the name \"Smith\"\npilots, _ := models.Pilots().All(ctx, db)\nrowsAff, err := pilots.UpdateAll(ctx, db, models.M{\"name\": \"Smith\"})\n\n// Update all pilots in the database to have the name \"Smith\"\nrowsAff, err := models.Pilots().UpdateAll(ctx, db, models.M{\"name\": \"Smith\"})\n```\n\n### Delete\n\nDelete a single object, a slice of objects or specific objects through [Query Building](#query-building).\n\n```go\npilot, _ := models.FindPilot(ctx, db, 1)\n// Delete the pilot from the database\nrowsAff, err := 
pilot.Delete(ctx, db)\n\n// Delete all pilots from the database\nrowsAff, err := models.Pilots().DeleteAll(ctx, db)\n\n// Delete a slice of pilots from the database\npilots, _ := models.Pilots().All(ctx, db)\nrowsAff, err := pilots.DeleteAll(ctx, db)\n```\n\n### Upsert\n\n[Upsert](https://www.postgresql.org/docs/9.5/static/sql-insert.html) allows you to perform an insert\nthat optionally performs an update when a conflict is found against existing row values.\n\nThe `updateColumns` and `insertColumns` arguments operate in the same fashion as they do for [Update](#update)\nand [Insert](#insert).\n\nIf an insert is performed, your object will be updated with any missing default values from the database,\nsuch as auto-incrementing column values.\n\n```go\nvar p1 models.Pilot\np1.ID = 5\np1.Name = \"Gaben\"\n\n// INSERT INTO pilots (\"id\", \"name\") VALUES($1, $2)\n// ON CONFLICT DO NOTHING\nerr := p1.Upsert(ctx, db, false, nil, boil.Infer(), boil.Infer())\n\n// INSERT INTO pilots (\"id\", \"name\") VALUES ($1, $2)\n// ON CONFLICT (\"id\") DO UPDATE SET \"name\" = EXCLUDED.\"name\"\nerr := p1.Upsert(ctx, db, true, []string{\"id\"}, boil.Whitelist(\"name\"), boil.Infer())\n\n// Set p1.ID to a zero value. 
We will have to use the whitelist now.\np1.ID = 0\np1.Name = \"Hogan\"\n\n// INSERT INTO pilots (\"id\", \"name\") VALUES ($1, $2)\n// ON CONFLICT (\"id\") DO UPDATE SET \"name\" = EXCLUDED.\"name\"\nerr := p1.Upsert(ctx, db, true, []string{\"id\"}, boil.Whitelist(\"name\"), boil.Whitelist(\"id\", \"name\"))\n\n// Custom conflict_target expression:\n// INSERT INTO pilots (\"id\", \"name\") VALUES (9, 'Antwerp Design')\n// ON CONFLICT ON CONSTRAINT pilots_pkey DO NOTHING;\nconflictTarget := models.UpsertConflictTarget\nerr := p1.Upsert(ctx, db, false, nil, boil.Whitelist(\"id\", \"name\"), boil.None(), conflictTarget(\"ON CONSTRAINT pilots_pkey\"))\n\n// Custom UPDATE SET expression:\n// INSERT INTO pilots (\"id\", \"name\") VALUES (9, 'Antwerp Design')\n// ON CONFLICT (\"id\") DO UPDATE SET (id, name) = (sub-SELECT)\nupdateSet := models.UpsertUpdateSet\nerr := p1.Upsert(ctx, db, true, []string{\"id\"}, boil.Whitelist(\"id\", \"name\"), boil.None(), updateSet(\"(id, name) = (sub-SELECT)\"))\n```\n\n- **Postgres**\n  - The `updateOnConflict` argument allows you to specify whether you would like Postgres\n    to perform a `DO NOTHING` on conflict, as opposed to a `DO UPDATE`. 
For MySQL and MSSQL, this param will not be generated.\n  - The `conflictColumns` argument allows you to specify the `ON CONFLICT` columns for Postgres.\n    For MySQL and MSSQL, this param will not be generated.\n- **MySQL and MSSQL**\n  - Passing `boil.None()` for `updateColumns` allows you to perform a `DO NOTHING` on conflict, similar to Postgres.\n\nNote: Passing a different set of column values to the update component is not currently supported.\n\nNote: Upsert is no longer guaranteed to be provided by SQLBoiler; it is up to each driver to\nsupport it individually, since upsert falls outside the SQL standard.\n\n### Reload\n\nIn the event that your objects get out of sync with the database for whatever reason,\nyou can use `Reload` and `ReloadAll` to reload the objects using the primary key values\nattached to the objects.\n\n```go\npilot, _ := models.FindPilot(ctx, db, 1)\n\n// \u003e Object becomes out of sync for some reason, perhaps async processing\n\n// Refresh the object with the latest data from the db\nerr := pilot.Reload(ctx, db)\n\n// Reload all objects in a slice\npilots, _ := models.Pilots().All(ctx, db)\nerr := pilots.ReloadAll(ctx, db)\n```\n\nNote: `Reload` and `ReloadAll` are not recursive; if you need your relationships reloaded,\nyou will need to call the `Reload` methods on those yourself.\n\n### Exists\n\n```go\njet, err := models.FindJet(ctx, db, 1)\n\n// Check if the pilot assigned to this jet exists.\nexists, err := jet.Pilot().Exists(ctx, db)\n\n// Check if the pilot with ID 5 exists\nexists, err := models.Pilots(Where(\"id=?\", 5)).Exists(ctx, db)\n```\n\n### Enums\n\nIf your MySQL or Postgres tables use enums, we will generate constants that hold their values\nthat you can use in your queries. 
For example:\n\n```sql\nCREATE TYPE workday AS ENUM('monday', 'tuesday', 'wednesday', 'thursday', 'friday');\n\nCREATE TABLE event_one (\n  id     serial PRIMARY KEY NOT NULL,\n  name   VARCHAR(255),\n  day    workday NOT NULL\n);\n```\n\nAn enum type defined like the above, being used by a table, will generate the following enums:\n\n```go\nconst (\n  WorkdayMonday    = \"monday\"\n  WorkdayTuesday   = \"tuesday\"\n  WorkdayWednesday = \"wednesday\"\n  WorkdayThursday  = \"thursday\"\n  WorkdayFriday    = \"friday\"\n)\n```\n\nFor Postgres we use `enum type name + title cased value` to generate the const variable name.\nFor MySQL we use `table name + column name + title cased value` to generate the const variable name.\n\nNote: If your enum holds a value we cannot parse correctly (due to non-alphabetic characters, for\nexample), it may not be generated. In this event, you will receive errors in your generated tests because\nthe value randomizer in the test suite does not know how to generate valid enum values. You will\nstill be able to use your generated library, and it will still work as expected, but the only way\nto get the tests to pass in this case is to either use a parsable enum value or use a regular column\ninstead of an enum.\n\n### Constants\n\nThe models package will also contain some structs that contain all table,\ncolumn, and relationship names harvested from the database at generation time. 
Type\nsafe where query mods are also generated.\n\nThere are type safe identifiers at:\n\n- models.TableNames.TableName\n- models.ModelColumns.ColumnName\n- models.ModelWhere.ColumnName.Operator\n- models.ModelRels.ForeignTableName\n\nFor table names they're generated under `models.TableNames`:\n\n```go\n// Generated code from models package\nvar TableNames = struct {\n  Messages  string\n  Purchases string\n}{\n  Messages:  \"messages\",\n  Purchases: \"purchases\",\n}\n\n// Usage example:\nfmt.Println(models.TableNames.Messages)\n```\n\nFor column names they're generated under `models.{Model}Columns`:\n\n```go\n// Generated code from models package\nvar MessageColumns = struct {\n  ID         string\n  PurchaseID string\n}{\n  ID:         \"id\",\n  PurchaseID: \"purchase_id\",\n}\n\n// Usage example:\nfmt.Println(models.MessageColumns.ID)\n```\n\nFor where clauses they're generated under `models.{Model}Where.{Column}.{Operator}`:\n\n```go\nvar MessageWhere = struct {\n  ID         whereHelperint\n  PurchaseID whereHelperstring\n}{\n  ID:         whereHelperint{field: `id`},\n  PurchaseID: whereHelperstring{field: `purchase_id`},\n}\n\n// Usage example:\nmodels.Messages(models.MessageWhere.PurchaseID.EQ(\"hello\"))\n```\n\nFor eager loading relationships they're generated under `models.{Model}Rels`:\n\n```go\n// Generated code from models package\nvar MessageRels = struct {\n  Purchase string\n}{\n  Purchase: \"Purchase\",\n}\n\n// Usage example:\nfmt.Println(models.MessageRels.Purchase)\n```\n\n**NOTE:** You can also assign the ModelWhere or ColumnNames to a variable; although you may\npay a small performance penalty, the readability increase is sometimes worth it:\n\n```go\ncols := \u0026models.UserColumns\nwhere := \u0026models.UserWhere\n\nu, err := models.Users(where.Name.EQ(\"hello\"), qm.Or(cols.Age + \"=?\", 5)).All(ctx, db)\n```\n\n## FAQ\n\n#### Won't compiling models for a huge database be very slow?\n\nNo, because Go's toolchain - unlike traditional 
toolchains - makes the compiler do most of the work\ninstead of the linker. This means that when the first `go install` is done it can take\na little bit of time because there is a lot of code that is generated. However, because of this\nwork balance between the compiler and linker in Go, linking to that code in subsequent\ncompiles is extremely fast.\n\n#### Missing imports for generated package\n\nThe generated models might import a couple of packages that are not on your system already, so\n`cd` into your generated models directory and type `go get -u -t` to fetch them. You will only need\nto run this command once, not once per generation.\n\n#### How should I handle multiple schemas?\n\nIf your database uses multiple schemas you should generate a new package for each of your schemas.\nNote that this only applies to databases that use real, SQL standard schemas (like PostgreSQL), not\nfake schemas (like MySQL).\n\n#### How do I use types.BytesArray for Postgres bytea arrays?\n\nOnly \"escaped format\" is supported for types.BytesArray. This means that your byte slice needs to have\na format of \"\\\\x00\" (4 bytes per byte) as opposed to \"\\x00\" (1 byte per byte). This is to maintain compatibility\nwith all Postgres drivers. Example:\n\n`x := types.BytesArray{0: []byte(\"\\\\x68\\\\x69\")}`\n\nPlease note that multi-dimensional Postgres ARRAY types are not supported at this time.\n\n#### Why aren't my time.Time or null.Time fields working in MySQL?\n\nYou _must_ use a DSN flag in MySQL connections, see: [Requirements](#requirements)\n\n#### Where is the homepage?\n\nThe homepage for the [SQLBoiler](https://github.com/aarondl/sqlboiler) [Golang ORM](https://github.com/aarondl/sqlboiler)\ngenerator is located at: https://github.com/aarondl/sqlboiler\n\n#### Why are the auto-generated tests failing?\n\nThe tests generated for your models package with sqlboiler are fairly\nerror-prone. 
They are usually broken by constraints in the database\nthat sqlboiler can't hope to understand.\n\nDuring regular run-time this isn't an issue, because your code will surface the errors and you\nwill fix them. The auto-generated tests, however, can only report those errors, so it seems like\nsomething is wrong when in reality the only issue is that the auto-generated tests can't understand\nthat your `text` column is validated by a regex that says it must be composed solely of the 'b'\ncharacter repeated 342 times.\n\nThese tests are broken especially by foreign key constraints because of the\nparallelism we use. There's also no understanding in the tests of dependencies\nbased on these foreign keys. As such there is a process that removes the foreign\nkeys from your schema when the tests are run; if this process messes up, you will get\nerrors relating to foreign key constraints.\n\n## Benchmarks\n\nIf you'd like to run the benchmarks yourself check out our [boilbench](https://github.com/aarondl/boilbench) repo.\n\n```bash\ngo test -bench . -benchmem\n```\n\n### Results (lower is better)\n\nTest machine:\n\n```text\nOS:  Ubuntu 16.04\nCPU: Intel(R) Core(TM) i7-4771 CPU @ 3.50GHz\nMem: 16GB\nGo:  go version go1.8.1 linux/amd64\n```\n\nThe graphs below have many runs like this as input to calculate errors. 
Here\nis a sample run:\n\n```text\nBenchmarkGORMSelectAll/gorm-8         20000   66500 ns/op   28998 B/op    455 allocs/op\nBenchmarkGORPSelectAll/gorp-8         50000   31305 ns/op    9141 B/op    318 allocs/op\nBenchmarkXORMSelectAll/xorm-8         20000   66074 ns/op   16317 B/op    417 allocs/op\nBenchmarkKallaxSelectAll/kallax-8    100000   18278 ns/op    7428 B/op    145 allocs/op\nBenchmarkBoilSelectAll/boil-8        100000   12759 ns/op    3145 B/op     67 allocs/op\n\nBenchmarkGORMSelectSubset/gorm-8      20000    69469 ns/op   30008 B/op   462 allocs/op\nBenchmarkGORPSelectSubset/gorp-8      50000    31102 ns/op    9141 B/op   318 allocs/op\nBenchmarkXORMSelectSubset/xorm-8      20000    64151 ns/op   15933 B/op   414 allocs/op\nBenchmarkKallaxSelectSubset/kallax-8 100000    16996 ns/op    6499 B/op   132 allocs/op\nBenchmarkBoilSelectSubset/boil-8     100000    13579 ns/op    3281 B/op    71 allocs/op\n\nBenchmarkGORMSelectComplex/gorm-8     20000    76284 ns/op   34566 B/op   521 allocs/op\nBenchmarkGORPSelectComplex/gorp-8     50000    31886 ns/op    9501 B/op   328 allocs/op\nBenchmarkXORMSelectComplex/xorm-8     20000    68430 ns/op   17694 B/op   464 allocs/op\nBenchmarkKallaxSelectComplex/kallax-8 50000    26095 ns/op   10293 B/op   212 allocs/op\nBenchmarkBoilSelectComplex/boil-8    100000    16403 ns/op    4205 B/op   102 allocs/op\n\nBenchmarkGORMDelete/gorm-8           200000    10356 ns/op    5059 B/op    98 allocs/op\nBenchmarkGORPDelete/gorp-8          1000000     1335 ns/op     352 B/op    13 allocs/op\nBenchmarkXORMDelete/xorm-8           200000    10796 ns/op    4146 B/op   122 allocs/op\nBenchmarkKallaxDelete/kallax-8       300000     5141 ns/op    2241 B/op    48 allocs/op\nBenchmarkBoilDelete/boil-8          2000000      796 ns/op     168 B/op     8 allocs/op\n\nBenchmarkGORMInsert/gorm-8           100000    15238 ns/op    8278 B/op   150 allocs/op\nBenchmarkGORPInsert/gorp-8           300000     4648 ns/op    1616 B/op    38 
allocs/op\nBenchmarkXORMInsert/xorm-8           100000    12600 ns/op    6092 B/op   138 allocs/op\nBenchmarkKallaxInsert/kallax-8       100000    15115 ns/op    6003 B/op   126 allocs/op\nBenchmarkBoilInsert/boil-8          1000000     2249 ns/op     984 B/op    23 allocs/op\n\nBenchmarkGORMUpdate/gorm-8           100000    18609 ns/op    9389 B/op   174 allocs/op\nBenchmarkGORPUpdate/gorp-8           500000     3180 ns/op    1536 B/op    35 allocs/op\nBenchmarkXORMUpdate/xorm-8           100000    13149 ns/op    5098 B/op   149 allocs/op\nBenchmarkKallaxUpdate/kallax-8       100000    22880 ns/op   11366 B/op   219 allocs/op\nBenchmarkBoilUpdate/boil-8          1000000     1810 ns/op     936 B/op    18 allocs/op\n\nBenchmarkGORMRawBind/gorm-8           20000    65821 ns/op   30502 B/op   444 allocs/op\nBenchmarkGORPRawBind/gorp-8           50000    31300 ns/op    9141 B/op   318 allocs/op\nBenchmarkXORMRawBind/xorm-8           20000    62024 ns/op   15588 B/op   403 allocs/op\nBenchmarkKallaxRawBind/kallax-8      200000     7843 ns/op    4380 B/op    46 allocs/op\nBenchmarkSQLXRawBind/sqlx-8          100000    13056 ns/op    4572 B/op    55 allocs/op\nBenchmarkBoilRawBind/boil-8          200000    11519 ns/op    4638 B/op    55 allocs/op\n```\n\n\u003cimg src=\"http://i.imgur.com/SltE8UQ.png\"/\u003e\u003cimg src=\"http://i.imgur.com/lzvM5jJ.png\"/\u003e\u003cimg src=\"http://i.imgur.com/SS0zNd2.png\"/\u003e\n\n\u003cimg src=\"http://i.imgur.com/Kk0IM0J.png\"/\u003e\u003cimg src=\"http://i.imgur.com/1IFtpdP.png\"/\u003e\u003cimg src=\"http://i.imgur.com/t6Usecx.png\"/\u003e\n\n\u003cimg src=\"http://i.imgur.com/98DOzcr.png\"/\u003e\u003cimg src=\"http://i.imgur.com/NSp5r4Q.png\"/\u003e\u003cimg src=\"http://i.imgur.com/dEGlOgI.png\"/\u003e\n\n\u003cimg src=\"http://i.imgur.com/W0zhuGb.png\"/\u003e\u003cimg src=\"http://i.imgur.com/YIvDuFv.png\"/\u003e\u003cimg src=\"http://i.imgur.com/sKwuMaU.png\"/\u003e\n\n\u003cimg 
src=\"http://i.imgur.com/ZUMYVmw.png\"/\u003e\u003cimg src=\"http://i.imgur.com/T61rH3K.png\"/\u003e\u003cimg src=\"http://i.imgur.com/lDr0xhY.png\"/\u003e\n\n\u003cimg src=\"http://i.imgur.com/LWo10M9.png\"/\u003e\u003cimg src=\"http://i.imgur.com/Td15owT.png\"/\u003e\u003cimg src=\"http://i.imgur.com/45XXw4K.png\"/\u003e\n\n\u003cimg src=\"http://i.imgur.com/lpP8qds.png\"/\u003e\u003cimg src=\"http://i.imgur.com/hLyH3jQ.png\"/\u003e\u003cimg src=\"http://i.imgur.com/C2v10t3.png\"/\u003e\n\n## Third-Party Extensions\n\nBelow are extensions for SQL Boiler developed by the community; use them at your own risk.\n\n- [sqlboiler-extensions](https://github.com/tiendc/sqlboiler-extensions): Generates additional methods for models, particularly for bulk operations.\n- [boilingseed](https://github.com/stephenafamo/boilingseed): Generates helpers to seed the database with data.\n- [boilingfactory](https://github.com/stephenafamo/boilingfactory): Generates helpers to create and insert test models on the fly.\n","funding_links":[],"categories":["ORM","Go","\u003ca name=\"Go\"\u003e\u003c/a\u003eGo"],"sub_categories":["HTTP Clients"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Faarondl%2Fsqlboiler","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Faarondl%2Fsqlboiler","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Faarondl%2Fsqlboiler/lists"}