https://github.com/spignelon/nollama
NoLlama is a lightweight terminal-based alternative to Ollama, enabling interaction with large language models like GPT-4o and LLaMA 70B directly from your terminal. It offers a sleek UI, multiple model choices, colorful markdown rendering, and low memory usage, all without needing a browser.
chatbot chatgpt claude g4f gpt4 gpt4all gpt4free llama3 llm llms mistral mixtral ollama python terminal-ai
- Host: GitHub
- URL: https://github.com/spignelon/nollama
- Owner: spignelon
- License: gpl-3.0
- Created: 2024-08-23T06:02:13.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-11-05T16:24:17.000Z (11 months ago)
- Last Synced: 2025-01-16T04:51:18.088Z (9 months ago)
- Topics: chatbot, chatgpt, claude, g4f, gpt4, gpt4all, gpt4free, llama3, llm, llms, mistral, mixtral, ollama, python, terminal-ai
- Language: Python
- Homepage: https://pypi.org/project/nollama/
- Size: 38.1 KB
- Stars: 4
- Watchers: 2
- Forks: 1
- Open Issues: 0
- Metadata Files:
- Readme: README.md
- License: LICENSE
# NoLlama
NoLlama is a terminal-based interface for interacting with Google's Gemini API directly from your terminal. Inspired by [Ollama](https://ollama.com/), NoLlama provides a streamlined experience for chatting with Gemini models such as Gemini 2.0 Flash, Gemini 2.5 Flash Preview, and Gemini 2.5 Pro Preview. **Ollama, Groq, and OpenRouter support will be added soon.**
NoLlama offers a neat terminal interface for powerful language models that aren't easily available for local execution, complete with colorful markdown rendering, multiple model choices, and efficient memory usage.

## Features
- **Google Gemini Models:** Access to powerful models like Gemini 2.0 Flash, Gemini 2.5 Flash Preview, and Gemini 2.5 Pro Preview.
- **Multi-turn Conversations:** Maintain context between prompts for more coherent conversations.
- **Neat Terminal UI:** Enjoy a clean and intuitive interface for your interactions.
- **Live Streaming Responses:** Watch responses appear in real-time as they're generated.
- **Colorful Markdown Rendering:** Rich text formatting and syntax highlighting in your terminal.
- **Low Memory Usage:** Efficient memory management makes it lightweight compared to using a browser.
- **Easy Model Switching:** Simply type `model` in the chat to switch between models.
- **Clear Chat History:** Type `clear` to clear the chat history.
- **Exit Commands:** Type `q`, `quit`, or `exit` to leave the chat, or use Ctrl+C or Ctrl+D.

## Setup
1. **API Key Configuration:**
Create a `.nollama` file in your home directory with your Gemini API key:
```bash
echo "GEMINI=your_api_key_here" > ~/.nollama
```
You can get a free API key from [Google AI Studio](https://aistudio.google.com/).

2. **Installation:**
a. Install from PyPI (recommended):
```bash
pip install nollama
```

b. Or clone and install from source:
```bash
git clone https://github.com/spignelon/nollama.git
cd nollama
pip install -e .
```

3. **Run NoLlama:**
```bash
nollama
```

## Usage
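The chat commands listed below can be thought of as a small dispatch step run on each line of input before anything is sent to the model. This is a hypothetical sketch of that logic, not NoLlama's actual implementation:

```python
def handle_command(user_input, history):
    """Map a NoLlama-style chat command to an action name."""
    cmd = user_input.strip().lower()
    if cmd in ("q", "quit", "exit"):
        return "exit"          # leave the chat
    if cmd == "model":
        return "switch_model"  # re-open the model picker
    if cmd == "clear":
        history.clear()        # drop the multi-turn context
        return "cleared"
    return "prompt"            # anything else goes to the model
```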
- **Select a Model:** At startup, choose from available Gemini models.
- **Chat Normally:** Type your questions and see the responses with rich formatting.
- **Switch Models:** Type `model` in the chat to choose a different model.
- **Clear Chat:** Type `clear` to clear the chat history.
- **Exit:** Type `q`, `quit`, or `exit` to leave the chat, or press Ctrl+C or Ctrl+D.

## Todos
- [x] Add context window
- [ ] Web interface
- [ ] Add support for Groq
- [ ] Add support for OpenRouter
- [ ] Add support for Ollama API
- [ ] Support for custom APIs

## Contribution
Contributions are welcome! If you have suggestions for new features or improvements, feel free to open an issue or submit a pull request.
## Disclaimer
NoLlama is not affiliated with Ollama. It is an independent project inspired by the concept of providing a neat terminal interface for interacting with language models.
## License
This project is licensed under the [GPL-3.0 License](LICENSE).