{"id":17919392,"url":"https://github.com/eliben/gemini-cli","last_synced_at":"2025-07-14T17:11:00.162Z","repository":{"id":223493494,"uuid":"753614153","full_name":"eliben/gemini-cli","owner":"eliben","description":"Access Gemini LLMs from the command-line","archived":true,"fork":false,"pushed_at":"2025-06-26T13:25:19.000Z","size":347,"stargazers_count":146,"open_issues_count":1,"forks_count":12,"subscribers_count":6,"default_branch":"main","last_synced_at":"2025-07-08T00:22:10.392Z","etag":null,"topics":["cli","embeddings","go","golang","llm","machine-learning","sqlite"],"latest_commit_sha":null,"homepage":"","language":"Go","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"unlicense","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/eliben.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-02-06T13:24:18.000Z","updated_at":"2025-07-07T09:15:35.000Z","dependencies_parsed_at":null,"dependency_job_id":"ab112ad6-83ed-4bea-b536-d76b8534ac5e","html_url":"https://github.com/eliben/gemini-cli","commit_stats":null,"previous_names":["eliben/gemini-cli"],"tags_count":10,"template":false,"template_full_name":null,"purl":"pkg:github/eliben/gemini-cli","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/eliben%2Fgemini-cli","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/eliben%2Fgemini-cli/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/eliben%2Fgemini-cli/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/eliben%2Fgemini-cli/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/
hosts/GitHub/owners/eliben","download_url":"https://codeload.github.com/eliben/gemini-cli/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/eliben%2Fgemini-cli/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":265322508,"owners_count":23746630,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cli","embeddings","go","golang","llm","machine-learning","sqlite"],"created_at":"2024-10-28T20:16:27.376Z","updated_at":"2025-07-14T17:11:00.142Z","avatar_url":"https://github.com/eliben.png","language":"Go","readme":"\u003e [!WARNING]\n\u003e This project is archived; the Gemini SDKs it uses were deprecated\n\u003e in 2025 and replaced by a new unified Go Gemini SDK. Moreover, Google\n\u003e released [Gemini CLI](https://github.com/google-gemini/gemini-cli), which\n\u003e uses the same name.\n\u003e\n\u003e Please use alternatives like the [llm tool](https://github.com/simonw/llm).\n\n# gemini-cli\n\n`gemini-cli` is a simple yet versatile command-line interface for Google's\nGemini LLMs, written in Go. 
It includes tools for chatting with these models and\ngenerating / comparing embeddings, with powerful SQLite storage and analysis\ncapabilities.\n\n## Installing\n\nInstall `gemini-cli` on your machine with:\n\n```\n$ go install github.com/eliben/gemini-cli@latest\n```\n\nYou can then invoke `gemini-cli help` to verify it's properly installed and\nfound.\n\n## Usage\n\nAll `gemini-cli` invocations require an API key for https://ai.google.dev/ to be\nprovided, either via the `--key` flag or an environment variable called\n`GEMINI_API_KEY`. You can visit that page to obtain a key - there's a generous\nfree tier!\n\nFrom here on, all examples assume the environment variable was set earlier to a\nvalid key.\n\n`gemini-cli` has a nested tree of subcommands to perform various tasks. You can\nalways run `gemini-cli help \u003ccommand\u003e [subcommand]...` to get usage; e.g.\n`gemini-cli help chat` or `gemini-cli help embed similar`. The printed help\ninformation will describe every subcommand and its flags.\n\nThis guide will discuss some of the more common use cases.\n\n### Models\n\nThe list of Gemini models supported by the backend is available on [this\npage](https://ai.google.dev/models/gemini). You can run `gemini-cli models` to\nask the tool to print a list of model names it's familiar with. These are the\nnames you can pass in with the `--model` flag (see the default model name by\nrunning `gemini-cli help`), and you can always omit the `models/` prefix.\n\n### `prompt` - single prompts\n\nThe `prompt` command allows one to send queries consisting of text or images to\nthe LLM. This is a single-shot interaction; the LLM will have no memory of\nprevious prompts (see the `chat` command for with-memory interactions).\n\n```\n$ gemini-cli prompt \u003cprompt or '-'\u003e... 
[flags]\n```\n\nThe prompt can be provided as a sequence of parts, each one a command-line\nargument.\n\nThe arguments are sent as a sequence to the model in the order provided.\nIf `--system` is provided, it's prepended to the other arguments. An argument\ncan be some quoted text, a name of an image file on the local filesystem or\na URL pointing directly to an image file online. A special argument with\nthe value `-` instructs the tool to read this prompt part from standard input.\nIt can only appear once in a single invocation.\n\nSome examples:\n\n```\n# Simple single prompt\n$ gemini-cli prompt \"why is the sky blue?\"\n\n# Multi-modal prompt with image file. Note that we have to ask for a\n# vision-capable model explicitly\n$ gemini-cli prompt --model gemini-pro-vision \"describe this image:\" test/datafiles/puppies.png\n```\n\n### `chat` - in-terminal chat with a model\n\nRunning `gemini-cli chat` starts an interactive terminal chat with a model. You\nwrite prompts following the `\u003e` character and the model prints its replies. In\nthis mode, the model has a memory of your previous prompts and its own replies\n(within the model's context length limit). Example:\n\n```\n$ gemini-cli chat\nChatting with gemini-1.5-flash\nType 'exit' or 'quit' to exit\n\u003e name 3 dog breeds\n1. Golden Retriever\n2. Labrador Retriever\n3. 
German Shepherd\n\u003e which of these is the heaviest?\nGerman Shepherd\n\nGerman Shepherds are typically the heaviest of the three breeds, with males\n[...]\n\u003e and which are considered best for kids?\n**Golden Retrievers** and **Labrador Retrievers** are both considered excellent\n[...]\n\u003e \n```\n\nDuring the chat, it's possible to ask `gemini-cli` to send a file's contents\nto the model instead of sending a textual message; do this with the\n`$load \u003cpath\u003e` command, pointing to an existing file.\n\n### `counttok` - counting tokens\n\nWe can ask the Gemini API to count the number of tokens in a given prompt or\nlist of prompts. `gemini-cli` supports this with the `counttok` command.\nExamples:\n\n```\n$ gemini-cli counttok \"why is the sky blue?\"\n\n$ cat textfile.txt | gemini-cli counttok -\n```\n\n### Embeddings\n\nSome of `gemini-cli`'s most advanced capabilities are in interacting with\nGemini's embedding models. `gemini-cli` uses SQLite to store embeddings for a\npotentially large number of inputs and query these embeddings for similarity.\nThis is all done through subcommands of the `embed` command.\n\n#### `embed content` - embedding a single piece of content\n\nUseful for kicking the tires of embeddings, this subcommand embeds a single\nprompt taken from the command-line or a file, and prints out its embedding in\nvarious formats (controlled with the `--format` flag). Examples:\n\n```\n$ gemini-cli embed content \"why is the sky blue?\"\n\n$ cat textfile.txt | gemini-cli embed content -\n```\n\n#### `embed db` - embedding multiple contents, storing results in a DB\n\n`embed db` is a swiss-army knife subcommand for embedding multiple pieces of\ntext and storing the results in a SQLite DB. It supports different kinds of\ninputs: a textual table, the file system or the DB itself.\n\nAll variations of `embed db` take the path of a DB file to use as output. 
If the\nfile exists, it's expected to be a valid SQLite DB; otherwise, a new DB is\ncreated at that path. `gemini-cli` will store the results of embedding\ncalculations in this DB in the `embeddings` table (this name can be configured\nwith the `--table` flag), with this SQL schema:\n\n```\nid TEXT PRIMARY KEY\nembedding BLOB\n```\n\nThe `id` is taken from the input, based on its type. We'll go through the\ndifferent variants of input next.\n\n**Filesystem input**: when passed the `--files` or `--files-list` flag,\n`gemini-cli` takes inputs as files from the filesystem. Each file is one input:\nits path is the ID, and its contents are passed to the embedding model.\n\nWith `--files`, the flag value is a comma-separated pair of\n`\u003croot directory\u003e,\u003cglob pattern\u003e`; the root directory is walked recursively\nand every file matching the glob pattern is included in the input. For example:\n\n```\n$ gemini-cli embed db out.db --files somedir,*.txt\n```\n\nEmbeds every `.txt` file found in `somedir` or any of its sub-directories. The\nID for each file will be its path starting with `somedir/`.\n\nWith `--files-list`, the flag value is a comma-separated list of filenames. Each\nname becomes an ID and the file's contents are passed to the embedding model.\nThis can be useful for more sophisticated patterns that are difficult to express\nusing a simple glob; for example, using [pss](https://github.com/eliben/pss/)\nand the `paste` command, this embeds any file that looks like a C++ file (i.e.\nending with `.h`, `.hpp`, `.cpp`, `.cxx` and so on) in the current directory:\n\n```\n$ gemini-cli embed db out.db --files-list $(pss -f --cpp | paste -sd,)\n```\n\n**SQLite DB input**: when passed the `--sql` flag, `gemini-cli` takes inputs\nfrom the SQLite DB itself, or any other SQLite DB file. 
The flag value is a SQL\n`select` statement that should select at least two columns; the first one will\nbe taken as the ID, and the others are concatenated to become the value passed\nto the embedding model.\n\nFor example, if `out.db` already has a table named `docs` with the column names\n`id` and `content`, this call will embed the contents of each row and place the\noutput in the `embeddings` table:\n\n```\n$ gemini-cli embed db out.db --sql \"select id, content from docs\"\n```\n\nWith the `--attach` flag, we can also ask `gemini-cli` to read inputs from other\nSQLite DB files. For example:\n\n```\n$ gemini-cli embed db out.db --attach inp,input.db --sql \"select id, content from inp.docs\"\n```\n\nWill read the inputs from `input.db` and write embedding outputs to `out.db`.\n\n**Tabular input**: without additional flags, `gemini-cli` will expect a filename\nor `-` following the output DB name. This file (or data piped from standard\ninput in case of `-`) is expected to be in either CSV, TSV (tab-separated\nvalues), JSON or [JSONLines](https://jsonlines.org/) format and include a list\nof records that has an ID field and some arbitrary number of other fields that\nare all concatenated to create the content for the record. The content is\nembedded and the result is associated with the ID in the output SQLite DB.\n\nFor example:\n\n```\n$ cat input.csv\nid,name,age\n3,luci,23\n4,merene,29\n5,pat,52\n$ cat input.csv | gemini-cli embed db out.db -\n```\n\nWill embed each record from the input file and create 3 rows in the `embeddings`\ntable associated with the IDs 3, 4 and 5. In this mode, `gemini-cli`\nauto-detects the format of the file passed into it without relying on its\nextension (note that it's unaware of the extension when the input is piped\nthrough standard input).\n\n**Other flags**: `embed db` has some additional flags that affect its behavior\nfor all input modes. 
Run `gemini-cli help embed db` for details.\n\n#### `embed similar` - finding similar items from an embeddings table\n\nOnce an `embeddings` table has been computed with `embed db`, we can use the `embed\nsimilar` command to find values that are most similar (in terms of distance in\nembedding vector space) to some content. For example:\n\n```\n$ gemini-cli embed similar out.db somefile.txt\n```\n\nWill embed the contents of `somefile.txt`, then compare its embedding vector\nwith the embeddings stored in the `embeddings` table of `out.db`, and print out\nthe 5 closest entries (this number can be controlled with the `--topk` flag).\n\nBy default, `embed similar` will emit the ID of the similar entry and the\nsimilarity score for each record. The `--show` flag can be used to control which\ncolumns from the DB are printed out.\n\n## Acknowledgements\n\n`gemini-cli` is inspired by Simon Willison's [llm tool](https://llm.datasette.io/en/stable/), but\naimed at the Go ecosystem. [Simon's website](https://simonwillison.net/) is a treasure trove of\ninformation about LLMs, embeddings and building tools that use them - check it out!\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Feliben%2Fgemini-cli","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Feliben%2Fgemini-cli","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Feliben%2Fgemini-cli/lists"}