https://github.com/baalimago/clai
Command line artificial intelligence - Your local LLM context-feeder
- Host: GitHub
- URL: https://github.com/baalimago/clai
- Owner: baalimago
- License: MIT
- Created: 2024-03-03T12:36:02.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-07-07T17:09:55.000Z (3 months ago)
- Last Synced: 2025-07-07T18:22:23.400Z (3 months ago)
- Topics: ai, cli, context-feeder, go, golang-tools, llm
- Language: Go
- Homepage:
- Size: 4.94 MB
- Stars: 92
- Watchers: 1
- Forks: 4
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# clai: command line artificial intelligence
[Go Report Card](https://goreportcard.com/report/github.com/baalimago/clai)
Test coverage: 45.445%
`clai` (/klaɪ/, like "cli" in "**cli**mate") is a command line context-feeder for any AI task.
## Features
- **[MCP client support](./EXAMPLES.md#Tooling)** - Add any MCP server you'd like by simply pasting their configuration.
- **Vendor agnosticism** - Use any functionality in Clai with [most LLM vendors](#supported-vendors) interchangeably.
- **[Conversations](./EXAMPLES.md#Conversations)** - Create, manage and continue conversations.
- **Rate limit circumvention** - Automatically summarize + recall complex tasks.
- **[Profiles](./EXAMPLES.md#Profiles)** - Pre-prompted profiles enabling customized workflows and agents.
- **Unix-like** - Clai follows the [unix philosophy](https://en.wikipedia.org/wiki/Unix_philosophy) and works seamlessly with data piped in and out.

All of these features are easily combined and tweaked, empowering users to accomplish very diverse use cases. See [examples](./EXAMPLES.md) for additional info.
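Because Clai reads stdin and writes plain text to stdout, it composes naturally with other tools. A minimal sketch, assuming a vendor API key is already exported and that the `query` subcommand described in the [examples](./EXAMPLES.md) is available (exact subcommand names may vary between versions):

```bash
# Sketch only: `query` is assumed from EXAMPLES.md; run `clai help` to see
# which subcommands your installed version actually supports.
# Feed a file in as context and ask a question about it:
cat main.go | clai query "What does this program do?"

# Compose with other unix tools: draft a commit message from the staged diff.
git diff --staged | clai query "Write a concise commit message for this diff"
```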
## Supported vendors

| Vendor | Environment Variable | Models |
| --------- | -------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------- |
| OpenAI | `OPENAI_API_KEY` | [Text models](https://platform.openai.com/docs/models/gpt-4-and-gpt-4-turbo), [photo models](https://platform.openai.com/docs/models/dall-e) |
| Anthropic | `ANTHROPIC_API_KEY` | [Text models](https://docs.anthropic.com/claude/docs/models-overview#model-recommendations) |
| Mistral | `MISTRAL_API_KEY` | [Text models](https://docs.mistral.ai/getting-started/models/) |
| Deepseek | `DEEPSEEK_API_KEY` | [Text models](https://api-docs.deepseek.com/quick_start/pricing) |
| Novita AI | `NOVITA_API_KEY` | [Text models](https://novita.ai/model-api/product/llm-api?utm_source=github_clai&utm_medium=github_readme&utm_campaign=link), use prefix `novita:` |
| Ollama | N/A | Use format `ollama:<model>` (defaults to llama3); server defaults to localhost:11434 |
| Inception | `INCEPTION_API_KEY` | [Text models](https://platform.inceptionlabs.ai/docs#models) |

Note that you can only use models from vendors for which you have an API key.
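For illustration, switching vendors is mostly a matter of exporting the matching key, while local Ollama models need no key at all. A hedged sketch: the model-selection flag and subcommand shown here are assumptions, so verify the exact names with `clai help`:

```bash
# Sketch only: the subcommand and the model-selection flag are assumptions,
# not confirmed against the current CLI; check `clai help` for exact names.
export ANTHROPIC_API_KEY="sk-ant-..."    # hosted vendor: export its API key
clai query "Hello from a hosted model"

# Local models via Ollama need no key; the `ollama:` prefix picks the model
# (llama3 by default) and the server is expected at localhost:11434.
clai -chat-model ollama:llama3 query "Hello from a local model"
```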
## Get started
```bash
go install github.com/baalimago/clai@latest
```

You may also use the setup script:
```bash
curl -fsSL https://raw.githubusercontent.com/baalimago/clai/main/setup.sh | sh
```

Either take a look at `clai help` or the [examples](./EXAMPLES.md) to see how to use `clai`.
If you have time, you can also check out [this blogpost](https://lorentz.app/blog-item.html?id=clai) for a slightly more structured introduction to using Clai efficiently.

Install [Glow](https://github.com/charmbracelet/glow) to get formatted markdown output when querying text responses.
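Putting the pieces together, a first run after installation might look like the sketch below; the subcommand is the same assumption as in the examples above, and the key is a placeholder. With Glow installed, markdown in the response is rendered directly in the terminal.

```bash
# Hypothetical first run; `query` is an assumption and the key is a placeholder.
export OPENAI_API_KEY="sk-..."
clai query "Give me a one-line summary of the unix philosophy"
```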