{"id":13453003,"url":"https://github.com/pythops/tenere","last_synced_at":"2025-10-23T02:43:19.796Z","repository":{"id":153234341,"uuid":"624156003","full_name":"pythops/tenere","owner":"pythops","description":"🤖 TUI interface for LLMs written in Rust","archived":false,"fork":false,"pushed_at":"2025-01-08T08:11:57.000Z","size":657,"stargazers_count":501,"open_issues_count":14,"forks_count":22,"subscribers_count":9,"default_branch":"master","last_synced_at":"2025-04-06T14:01:44.066Z","etag":null,"topics":["chatgpt","cli","llamacpp","llm","ollama","ratatui","rust","tui"],"latest_commit_sha":null,"homepage":"https://crates.io/crates/tenere","language":"Rust","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"gpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/pythops.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null},"funding":{"github":"pythops"}},"created_at":"2023-04-05T21:35:03.000Z","updated_at":"2025-04-03T17:46:55.000Z","dependencies_parsed_at":"2024-02-07T16:29:02.224Z","dependency_job_id":"3e56949c-0222-4743-8164-376e9add4be0","html_url":"https://github.com/pythops/tenere","commit_stats":{"total_commits":188,"total_committers":10,"mean_commits":18.8,"dds":0.07978723404255317,"last_synced_commit":"61f90dba05bc51a12f1e02ef6d9b2812137d5ad5"},"previous_names":[],"tags_count":13,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pythops%2Ftenere","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pythops%2Ftenere/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repo
sitories/pythops%2Ftenere/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pythops%2Ftenere/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/pythops","download_url":"https://codeload.github.com/pythops/tenere/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248748593,"owners_count":21155663,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["chatgpt","cli","llamacpp","llm","ollama","ratatui","rust","tui"],"created_at":"2024-07-31T08:00:30.564Z","updated_at":"2025-10-23T02:43:19.722Z","avatar_url":"https://github.com/pythops.png","language":"Rust","readme":"\u003cdiv align=\"center\"\u003e\n  \u003ch1\u003e Tenere \u003c/h1\u003e\n  \u003cimg src=\"assets/logo.png\" alt=\"A crab in the Moroccan desert\"\u003e\u003c/img\u003e\n  \u003ch2\u003e TUI interface for LLMs written in Rust \u003c/h2\u003e\n\u003c/div\u003e\n\n## 📸 Demo\n\n![Demo](https://github.com/pythops/tenere/assets/57548585/b33ed59b-1d94-4bc5-8e61-73e63f41137e)\n\n\u003cbr\u003e\n\n## 🪄 Features\n\n- Syntax highlighting\n- Chat history\n- Save chats to files\n- Vim keybindings (most common operations)\n- Copy text from/to clipboard (works only on the prompt)\n- Multiple backends\n- Automatically load the last saved chat into history\n\n\u003cbr\u003e\n\n## 💎 Supported Backends\n\n- [x] ChatGPT\n- [x] llama.cpp\n- [x] ollama\n\n\u003cbr\u003e\n\n## 🚀 Installation\n\n\u003ca href=\"https://repology.org/project/tenere/versions\"\u003e\n    \u003cimg 
src=\"https://repology.org/badge/vertical-allrepos/tenere.svg\" alt=\"Packaging status\" align=\"left\"\u003e\n\u003c/a\u003e\n\n\u003cbr\u003e\n\u003cbr\u003e\n\u003cbr\u003e\n\u003cbr\u003e\n\u003cbr\u003e\n\n### 📥 Binary releases\n\nYou can download the pre-built binaries from the [release page](https://github.com/pythops/tenere/releases).\n\n### 📦 crates.io\n\n`tenere` can be installed from [crates.io](https://crates.io/crates/tenere):\n\n```shell\ncargo install tenere\n```\n\n### ❄️ NixOS / Nix\n\nTenere is available in nixpkgs and can be installed via configuration.nix:\n\n```nix\nenvironment.systemPackages = with pkgs; [\n  tenere\n];\n```\n\nFor non-NixOS systems, install directly with:\n\n```shell\nnix-env -iA nixpkgs.tenere\n```\n\n### 📱 Mobile (nix-on-droid)\n\nTenere works on Android via nix-on-droid ([demo](https://github.com/user-attachments/assets/c06e5650-0b5d-4f0a-816d-a2c1bd88774a)).\n\nTo set up ([tutorial](https://www.youtube.com/watch?v=XiVz2UR9epE)):\n\n1. Install nix-on-droid from F-Droid\n2. Add tenere to your packages in \".config/nixpkgs/nix-on-droid.nix\"\n3. Run `nix-on-droid switch`\n4. Create your config at \".config/tenere/config.toml\"\n\n### 🍺 Homebrew\n\n```shell\nbrew install tenere\n```\n\n### ⚒️ Build from source\n\nTo build from source, you need the [Rust](https://www.rust-lang.org/) compiler and the\n[Cargo package manager](https://doc.rust-lang.org/cargo/).\n\nOnce Rust and Cargo are installed, run the following command to build:\n\n```shell\ncargo build --release\n```\n\nThis will produce an executable file at `target/release/tenere` that you can copy to a directory in your `$PATH`.\n\n\u003cbr\u003e\n\n## ⚙️ Configuration\n\nTenere can be configured using a TOML configuration file. 
By default, the configuration file is located at:\n\n- **Linux**: `$HOME/.config/tenere/config.toml` or `$XDG_CONFIG_HOME/tenere/config.toml`\n- **Mac**: `$HOME/Library/Application Support/tenere/config.toml`\n- **Windows**: `~/AppData/Roaming/tenere/config.toml`\n\n### 🛠 Custom Configuration Path\n\nYou can optionally specify a custom path for the configuration file using the `-c` flag. This allows you to override the default configuration file location.\n\n### Example Usage\n\n```sh\n# Use the default configuration path\ntenere\n\n# Specify a custom configuration path\ntenere -c ~/path/to/custom/config.toml\n```\n\n### General settings\n\nHere are the available general settings:\n\n- `llm`: the LLM backend name. Possible values are:\n  - `chatgpt`\n  - `llamacpp`\n  - `ollama`\n\n```toml\nllm = \"chatgpt\"\n```\n\n### Key bindings\n\nTenere supports customizable key bindings.\nYou can modify some of the default key bindings by updating the `[key_bindings]` section in the configuration file.\nHere is an example with the default key bindings:\n\n```toml\n[key_bindings]\nshow_help = '?'\nshow_history = 'h'\nnew_chat = 'n'\n```\n\nℹ️ Note\n\n\u003e To avoid overlapping with the vim key bindings, these shortcuts use `ctrl` + `key`, except for the help key `?`.\n\n## ChatGPT\n\nTo use `chatgpt` as the backend, you'll need to provide an API key for OpenAI. There are two ways to do this:\n\nSet an environment variable with your API key:\n\n```shell\nexport OPENAI_API_KEY=\"YOUR KEY HERE\"\n```\n\nOr include your API key in the configuration file:\n\n```toml\n[chatgpt]\nopenai_api_key = \"Your API key here\"\nmodel = \"gpt-3.5-turbo\"\nurl = \"https://api.openai.com/v1/chat/completions\"\n```\n\nThe default model is set to `gpt-3.5-turbo`. 
Check out the [OpenAI documentation](https://platform.openai.com/docs/models/gpt-3-5) for more info.\n\n## llama.cpp\n\nTo use `llama.cpp` as the backend, you'll need to provide the URL that points to the server:\n\n```toml\n[llamacpp]\nurl = \"http://localhost:8080/v1/chat/completions\"\n```\n\nIf you configure the server with an API key, then you need to provide it as well. Set an environment variable:\n\n```shell\nexport LLAMACPP_API_KEY=\"YOUR KEY HERE\"\n```\n\nOr include your API key in the configuration file:\n\n```toml\n[llamacpp]\nurl = \"http://localhost:8080/v1/chat/completions\"\napi_key = \"Your API Key here\"\n```\n\nMore info about the llama.cpp API [here](https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md).\n\n## Ollama\n\nTo use `ollama` as the backend, you'll need to provide the URL that points to the server, along with the model name:\n\n```toml\n[ollama]\nurl = \"http://localhost:11434/api/chat\"\nmodel = \"Your model name here\"\n```\n\nMore info about the ollama API [here](https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion).\n\n\u003cbr\u003e\n\n## ⌨️ Key bindings\n\n### Global\n\nThese are the default key bindings regardless of the focused block.\n\n`ctrl + n`: Start a new chat; the previous one is saved to history and archived as a `tenere.archive-i` file in the data directory.\n\n`Tab`: Switch the focus.\n\n`j` or `Down arrow key`: Scroll down.\n\n`k` or `Up arrow key`: Scroll up.\n\n`ctrl + h`: Show chat history. Press `Esc` to dismiss it.\n\n`ctrl + t`: Stop the streamed response.\n\n`q` or `ctrl + c`: Quit the app.\n\n`?`: Show the help pop-up. 
Press `Esc` to dismiss it.\n\n### Prompt\n\nLike vim, there are 3 modes: `Normal`, `Visual`, and `Insert`.\n\n#### Insert mode\n\n`Esc`: Switch back to Normal mode.\n\n`Enter`: Create a new line.\n\n`Backspace`: Remove the previous character.\n\n#### Normal mode\n\n`Enter`: Submit the prompt.\n\n\u003cbr\u003e\n\n`h or Left`: Move the cursor backward by one char.\n\n`j or Down`: Move the cursor down.\n\n`k or Up`: Move the cursor up.\n\n`l or Right`: Move the cursor forward by one char.\n\n`w`: Move the cursor forward by one word.\n\n`b`: Move the cursor backward by one word.\n\n`0`: Move the cursor to the start of the line.\n\n`$`: Move the cursor to the end of the line.\n\n`G`: Go to the end.\n\n`gg`: Go to the top.\n\n\u003cbr\u003e\n\n`a`: Insert after the cursor.\n\n`A`: Insert at the end of the line.\n\n`i`: Insert before the cursor.\n\n`I`: Insert at the beginning of the line.\n\n`o`: Append a new line below the current line.\n\n`O`: Append a new line above the current line.\n\n\u003cbr\u003e\n\n`x`: Delete the character under the cursor.\n\n`dd`: Cut the current line.\n\n`D`: Delete from the cursor to the end of the line.\n\n`dw`: Delete the word after the cursor.\n\n`db`: Delete the word to the left of the cursor.\n\n`d0`: Delete from the cursor to the beginning of the line.\n\n`d$`: Delete from the cursor to the end of the line.\n\n\u003cbr\u003e\n\n`C`: Change to the end of the line.\n\n`cc`: Change the current line.\n\n`c0`: Change from the cursor to the beginning of the line.\n\n`c$`: Change from the cursor to the end of the line.\n\n`cw`: Change the next word.\n\n`cb`: Change the word to the left of the cursor.\n\n\u003cbr\u003e\n\n`u`: Undo.\n\n`p`: Paste.\n\n#### Visual mode\n\n`v`: Switch to Visual mode.\n\n`y`: Yank the selected text.\n\n\u003cbr\u003e\n\n## ⚖️ License\n\nGNU General Public License v3.0 or later\n","funding_links":["https://github.com/sponsors/pythops"],"categories":["CLI tools","Projects","Chat UIs","Rust","HarmonyOS","CLIs","cli","💻 
Apps","Open-Source Local LLM Projects","\u003ca name=\"ai\"\u003e\u003c/a\u003eAI / ChatGPT","Table of Contents"],"sub_categories":["Examples","Windows Manager","⌨️ Development Tools"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpythops%2Ftenere","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fpythops%2Ftenere","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpythops%2Ftenere/lists"}