https://github.com/rdsq/clai
AI CLIent
- Host: GitHub
- URL: https://github.com/rdsq/clai
- Owner: rdsq
- License: MIT
- Created: 2025-03-26T17:02:35.000Z (7 months ago)
- Default Branch: master
- Last Pushed: 2025-05-26T09:47:01.000Z (4 months ago)
- Last Synced: 2025-05-29T20:26:37.471Z (4 months ago)
- Topics: ai, ai-chat, ai-client-project, cli, google-gemini, google-generative-ai, ollama, ollama-client, rust, semantic-search
- Language: Rust
- Homepage:
- Size: 148 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# clai
A custom **CLIent** *(get it? CLI and client, how funny)* for different AI things. Primarily **chatting**, but also some **embedding** functionality.
## Interfaces
This tool names models using **interfaces**, formatted as `provider:model`. Available providers:
- `ollama` *(optional `OLLAMA_HOST` env var; defaults to `http://localhost:11434`)*
- `google` *(set the `GEMINI_API_KEY` env var with the [API key](https://ai.google.dev/gemini-api/docs/api-key))*
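As a minimal sketch of the interface format (the key value and model names here are illustrative; `gen` is covered below):

```sh
# Optional: point clai at a non-default Ollama server
export OLLAMA_HOST=http://localhost:11434

# Required for the google provider
export GEMINI_API_KEY=your-key-here

# Interface strings are always provider:model
clai gen ollama:gemma3:1b "Hello"
clai gen google:gemini-2.0-flash "Hello"
```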
## Installation
Since it is a **Rust** project, installation is pretty straightforward. Clone this repository wherever you like, and run:
```sh
cargo install --path .
```
## Chatting
This is the main functionality of this project, *but not the only one*
### Generate
The simplest of them: just **generate** a response from a prompt. Something like:
```sh
clai gen ollama:gemma3:1b "Hello World"
```
You can also attach an image with `--image path/to/image.jpg` if the model supports it
### Chat
This is exactly what you would expect. Open a chat with a chatbot
```sh
clai chat ollama:gemma3:1b
```
Chat mode also has **commands**! Type `/help` in chat to see more
### Read (and saves in general)
There is also an autosave feature: in `gen` and `chat`, the `--file` option writes the conversation to a JSON file
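For example (the filename here is arbitrary):

```sh
# Autosave the conversation as JSON while chatting
clai chat ollama:gemma3:1b --file ./chat.json
```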
To read these JSON files, you can use this command
```sh
clai read ./chat.json
```
### Model With Model
This one is mostly for fun: make two AIs talk to one another
You can define them to be the same model, different ones, whatever sounds fun
```sh
clai model-with-model "Hello World" ollama:gemma3:1b google:gemini-2.0-flash
```
## Embeddings
These commands are not about chatbots; they are about **embeddings**
*Basically a thing that evaluates the semantic meaning of strings*
### Semantic Search
For example, you can rank strings by semantic similarity to a query:
```sh
clai semsearch ollama:nomic-embed-text 'fruit' 'apple' 'strawberry' 'banana'
```
Or with **files**:
```sh
clai semsearch ollama:nomic-embed-text 'nice weather' --input-format file my-posts/*
```
Or with **JSON** *(which is for the next feature)*
### Embed
This one is useful for caching **embeddings** for the `semsearch` command. So:
```sh
clai embed ollama:nomic-embed-text 'The weather is nice today' 'Cats are awesome' 'Rust is cool' 'hello world' --output-format json > my-posts.json
```
And then:
```sh
cat my-posts.json | clai semsearch ollama:nomic-embed-text -f json 'cats'
```
## Planned features
- More interfaces *(not a priority for me personally since I don't use other APIs right now)*
- Option to disable streaming (for better formatting)