https://github.com/pranavmangal/termq
A simple tool to query LLMs from the terminal
- Host: GitHub
- URL: https://github.com/pranavmangal/termq
- Owner: pranavmangal
- License: MIT
- Created: 2024-10-30T01:32:20.000Z (12 months ago)
- Default Branch: master
- Last Pushed: 2025-07-16T03:24:25.000Z (3 months ago)
- Last Synced: 2025-07-17T06:10:46.029Z (3 months ago)
- Topics: cerebras, chat, cli, gemini, groq, llama, llm, terminal
- Language: Go
- Homepage:
- Size: 604 KB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# termq (tq)
[Latest release](https://github.com/pranavmangal/termq/releases/latest)
A simple tool to query LLMs from the terminal.

## Installation
Using **Homebrew**:
```bash
brew install pranavmangal/tap/termq
```

## Usage
```bash
tq '<your query>'
```

## Supported Providers
- [Cerebras](https://cerebras.ai/inference) (very fast inference)
- [Groq](https://groq.com/) (fast inference)
- [Google Gemini](https://ai.google.dev/gemini-api)

## Configuration
The config file is located at `~/.config/termq/config.toml` (macOS and Linux) or `~\AppData\Roaming\termq\config.toml` (Windows).
On first run, `termq` should automatically create a skeleton config for you and prompt you to fill in your API keys and preferred models.
Example:
```toml
system_prompt = "You are a helpful assistant."

[groq]
model = "llama-3.1-70b-versatile"
api_key = ""
```
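A fuller config with several providers might look like the sketch below. The `[cerebras]` and `[gemini]` section names and the model identifiers are illustrative assumptions inferred from the provider list above, not taken from termq's documentation; check the skeleton config that `termq` generates for the exact keys it expects.

```toml
# Shared system prompt applied to every query.
system_prompt = "You are a helpful assistant."

# One table per provider; fill in an API key for whichever you use.
[groq]
model = "llama-3.1-70b-versatile"
api_key = ""

# The section and model names below are assumptions, not confirmed keys.
[cerebras]
model = "llama3.1-70b"
api_key = ""

[gemini]
model = "gemini-1.5-flash"
api_key = ""
```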